Compare commits

..

18 Commits

Author SHA1 Message Date
Manuel Raynaud
9427b17a15 save work 2026-04-27 16:21:15 +02:00
Manuel Raynaud
5e31eb0caa ♻️(backend) use additional http extra methods for content action
We used one DRF extra action with both PATCH and GET HTTP methods, then
split the logic into two private methods called based on the HTTP method
of the request. DRF lets us avoid this by declaring a method mapping on
the action, giving us two viewset actions directly:
django-rest-framework.org/api-guide/viewsets/#marking-extra-actions-for-routing
2026-04-27 15:07:35 +02:00
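The mapping described above can be emulated in plain Python (a toy stand-in, not the real rest_framework implementation — DRF's actual syntax is `@drf.decorators.action(detail=True, methods=["patch"])` plus `@<action>.mapping.get`):

```python
class Route:
    """Toy stand-in for a DRF extra action carrying a method mapping."""

    def __init__(self):
        self.mapping = {}  # HTTP method -> handler, like @action's mapping

    def register(self, method, handler):
        self.mapping[method.upper()] = handler

    def dispatch(self, method, *args, **kwargs):
        # DRF routes the request to the handler mapped to its HTTP method
        return self.mapping[method.upper()](*args, **kwargs)


content = Route()
content.register("PATCH", lambda payload: ("updated", payload))
content.register("GET", lambda: ("streamed", None))

print(content.dispatch("get"))           # ('streamed', None)
print(content.dispatch("patch", "abc"))  # ('updated', 'abc')
```

One route, two handlers: the router stays a single URL while each HTTP method gets its own action.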
Manuel Raynaud
a00c51247d 🔧(helm) set logger to debug level for feature environment
The feature environments are here for demo and debugging purposes. For
this we want more logs, so we set the logger to the debug level.
2026-04-27 15:07:35 +02:00
Anthony LC
100817b0e6 🥅(sw) improve requests fallback
We improve the overall SW request fallback.
If the plugin fails, we retry the request
without the plugin modifications, so the
status code reflects the actual server
response rather than the plugin error.

We also improved the cache fallback: if
the cache failed because a store was missing,
we delete the DB to make sure it matches
the current app version.
2026-04-27 15:07:34 +02:00
Anthony LC
ff2c61a3dc ✈️(SW) add offline support for content
We have added offline support for content.
When the content update fails, we save the new
content in the cache, and we will sync it later
with the SyncManager.
2026-04-27 15:07:34 +02:00
Anthony LC
4d250a7342 ️(SW) cache content and metadata for API requests
We cache the content of API responses in the service
worker, so that we can serve them when the user
is offline.
We also cache the ETag and Last-Modified headers,
so that we can make conditional requests to the
server and avoid downloading the content again if
it hasn't changed.
2026-04-27 15:07:34 +02:00
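The caching strategy this commit describes can be sketched in Python (the real implementation is JavaScript in the service worker; names and the fake server here are illustrative):

```python
# Illustrative sketch of the SW strategy: cache the body together with
# its validators, then revalidate with conditional request headers.
cache = {}  # url -> {"etag": ..., "last_modified": ..., "body": ...}


def fetch_with_validators(url, server):
    """`server` stands in for the network: (url, headers) -> (status, headers, body)."""
    headers = {}
    if (entry := cache.get(url)):
        headers["If-None-Match"] = entry["etag"]
        headers["If-Modified-Since"] = entry["last_modified"]
    status, resp_headers, body = server(url, headers)
    if status == 304:  # unchanged: skip the download, reuse the cache
        return cache[url]["body"]
    cache[url] = {
        "etag": resp_headers.get("ETag"),
        "last_modified": resp_headers.get("Last-Modified"),
        "body": body,
    }
    return body


def fake_server(url, headers):
    if headers.get("If-None-Match") == '"v1"':
        return 304, {}, b""
    return 200, {"ETag": '"v1"',
                 "Last-Modified": "Mon, 01 Jan 2024 00:00:00 GMT"}, b"content"


print(fetch_with_validators("/doc", fake_server))  # b'content' (cache miss, 200)
print(fetch_with_validators("/doc", fake_server))  # b'content' (304, from cache)
```

The second fetch sends the stored ETag, gets a 304, and serves the cached body without re-downloading.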
Manuel Raynaud
6f2cd8a829 ️(backend) implement etag and last_modified headers to fetch content
We want to give the JS client the ability to use headers to avoid
fetching content it already has. For this, the content endpoint returns
ETag and Last-Modified headers corresponding to the file content's ETag
and its last modification time. On subsequent fetches, the client can use
the If-None-Match or If-Modified-Since request headers; if one of these
headers is satisfied, the endpoint returns a 304 response, otherwise it
still returns a 200.
2026-04-27 15:07:33 +02:00
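The 304 decision described above can be sketched as follows (a simplified sketch: the real view also caches the S3 metadata and streams the body):

```python
import datetime as dt


def is_not_modified(request_headers, etag, last_modified):
    """Return True when the client's validators show the content is unchanged."""
    if_none_match = request_headers.get("If-None-Match")
    # Proxies (e.g. nginx with gzip) may turn strong ETags into weak ones.
    if if_none_match and if_none_match.startswith("W/"):
        if_none_match = if_none_match[2:]
    if if_none_match and if_none_match == etag:
        return True
    if_modified_since = request_headers.get("If-Modified-Since")
    if if_modified_since:
        try:
            since = dt.datetime.strptime(
                if_modified_since, "%a, %d %b %Y %H:%M:%S %Z"
            )
        except ValueError:
            return False
        if since.tzinfo is None:
            since = since.replace(tzinfo=dt.timezone.utc)
        return last_modified <= since
    return False


modified = dt.datetime(2024, 1, 1, 10, 0, tzinfo=dt.timezone.utc)
print(is_not_modified({"If-None-Match": 'W/"abc"'}, '"abc"', modified))  # True
```

A matching ETag short-circuits; otherwise the Last-Modified comparison decides, and an unparsable date falls back to a full 200 response.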
Anthony LC
b6c6fc8217 👔(frontend) integrate dedicated content endpoint
To improve the performance of loading document content,
we have implemented a dedicated endpoint for
fetching document content. This allows us to load
the document metadata and content separately.
We updated the different components to utilize
this new endpoint, ensuring that the document content is
fetched and updated correctly.
2026-04-27 15:07:33 +02:00
Anthony LC
68f1600c2b 🔥(clients) remove without_content query string
We now have a dedicated API to fetch only the doc
content, so we can remove the without_content
query string from the doc fetching API.
2026-04-27 15:07:33 +02:00
Manuel Raynaud
1c2bafb0f7 📝(backend) add breaking changes document in UPGRADE.md file
We need to list the breaking changes made for the future version 5.0.0
2026-04-27 15:07:31 +02:00
Manuel Raynaud
6b3d19715b ️(backend) stream s3 file content with a dedicated endpoint
We created a dedicated endpoint to retrieve a document's content. The
content of the S3 file is streamed when this endpoint is fetched.
2026-04-27 15:06:59 +02:00
Manuel Raynaud
51d4746435 🔥(backend) remove content in document responses
The content was always loaded in the document response. We remove this
behavior to avoid making an HTTP call to the S3 storage. To get the
document content, the new dedicated endpoint must now be used.
2026-04-27 15:06:57 +02:00
Manuel Raynaud
d7a186a98b (backend) create a dedicated endpoint to update document content
We want a dedicated endpoint to update a document's content. Previously,
updating the content was done through the update action shared with all
other document properties. When the title is updated, the response contains
the content, so a call to the S3 storage is made, which we don't want.
Isolating the content update will allow us, in the next commit, to remove
the content from the Document serializer.
2026-04-27 15:06:34 +02:00
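The payload for the content-update endpoint can be built like this (a hypothetical client helper; the server-side serializer rejects anything that is not valid base64):

```python
import base64
import json


def build_content_payload(raw_yjs: bytes, websocket: bool = False) -> str:
    """Encode the raw Yjs update as base64 before sending it to the API."""
    payload = {"content": base64.b64encode(raw_yjs).decode("ascii")}
    if websocket:
        # Optional flag the endpoint accepts alongside the content
        payload["websocket"] = True
    return json.dumps(payload)


body = build_content_payload(b"\x01\x02yjs-update")
print(body)  # {"content": "AQJ5anMtdXBkYXRl"}
```

The server decodes and validates the base64 before persisting, so sending raw bytes directly would be rejected with a 400.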
Manuel Raynaud
207f21447d ♻️(backend) rename documents content endpoint in formatted-content
The endpoint /api/v1.0/documents/{document_id}/content/ has been renamed
to /api/v1.0/documents/{document_id}/formatted-content/. formatted-content
seems more accurate, and the content endpoint will be reused for a more
appropriate purpose.
2026-04-27 15:06:33 +02:00
Manuel Raynaud
3433d6de9a 📄(upgrade) specify docspec upgrade version
The version of docspec must be upgraded to version >= 3.0.0
2026-04-27 14:52:27 +02:00
Manuel Raynaud
5e22bc4736 🔥(backend) remove deprecated descendants endpoint
We can remove the deprecated and unused descendants endpoint. We will
release a new major version now.
2026-04-27 14:52:27 +02:00
Stephan Meijer
2d2e326cb6 ⬆️(backend) upgrade docspec to v3.0.0 and adapt converter API
Summary

- Bump docspec Docker image from `2.6.3` to `3.0.0` and adapt
`DocSpecConverter` to the new API (raw body upload with explicit
`Content-Type`/`Accept` headers instead of multipart form)

Important

**The Docker image (`ghcr.io/docspecio/api:3.0.0`) must be updated
alongside the code changes.** The new request format is incompatible
with v2.x — deploying only the code without updating the image (or vice
versa) will break document conversion.
2026-04-27 11:41:43 +00:00
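The v2 → v3 request change can be sketched as follows (a hypothetical helper; the real code is `DocSpecConverter`, and the `BLOCKNOTE` constant value here is an assumption, while the DOCX type is the standard one):

```python
# Sketch of the converter's request-format change between docspec v2 and v3.
DOCX = "application/vnd.openxmlformats-officedocument.wordprocessingml.document"
BLOCKNOTE = "application/vnd.blocknote+json"  # assumed value


def build_request_kwargs(data: bytes, content_type: str, docspec_major: int) -> dict:
    if docspec_major < 3:
        # v2.x: multipart form upload, file field named "file"
        return {
            "files": {"file": ("document.docx", data, content_type)},
            "headers": {"Accept": BLOCKNOTE},
        }
    # v3.x: raw body upload with an explicit Content-Type header
    return {
        "data": data,
        "headers": {"Content-Type": content_type, "Accept": BLOCKNOTE},
    }


kwargs = build_request_kwargs(b"...", DOCX, 3)
print(sorted(kwargs))  # ['data', 'headers']
```

Because the two formats are wire-incompatible, the code and the `ghcr.io/docspecio/api` image must be upgraded together, as the commit message warns.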
Manuel Raynaud
ef9376368f 🔧(docker) run django app with uvicorn in dev environment
The Django application runs under ASGI in production. To have the same
behavior, we run the development container with uvicorn too, with options
more appropriate for a development environment.
2026-04-27 08:49:55 +02:00
76 changed files with 3343 additions and 2826 deletions

5
.gitignore vendored
View File

@@ -82,8 +82,3 @@ db.sqlite3
# Cursor rules
.cursorrules
# Claude
CLAUDE.md
.claude/
openspec/

View File

@@ -6,11 +6,18 @@ and this project adheres to
## [Unreleased]
### Added
- ✨(backend) create a dedicated endpoint to update document content
- ⚡️(backend) stream s3 file content with a dedicated endpoint
### Changed
- ♻️(backend) rename documents content endpoint in `formatted-content` (BC)
- 🚸(frontend) show Crisp from the help menu #2222
- ♿️(frontend) structure correctly 5xx error alerts #2128
- ♿️(frontend) make doc search result labels uniquely identifiable #2212
- ⬆️(backend) upgrade docspec to v3.0.x and adapt converter API #2220
### Fixed
@@ -22,6 +29,11 @@ and this project adheres to
- 🛂(frontend) fix cannot manage member on small screen #2226
- 🐛(backend) load jwks url when OIDC_RS_PRIVATE_KEY_STR is set
### Removed
- 🔥(backend) remove deprecated descendants endpoint #2243
- 🔥(backend) remove content in document responses
## [v4.8.6] - 2026-04-08
### Added

View File

@@ -134,7 +134,15 @@ ENV DB_HOST=postgresql \
DB_PORT=5432
# Run django development server
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
CMD [\
"uvicorn",\
"--app-dir=/app",\
"--host=0.0.0.0",\
"--lifespan=off",\
"--reload",\
"--reload-dir=/app",\
"impress.asgi:application"\
]
# ---- Production image ----
FROM core AS backend-production

View File

@@ -16,6 +16,24 @@ the following command inside your docker container:
## [Unreleased]
We made several changes around document content management leading to several breaking changes in the API.
- The endpoint `/api/v1.0/documents/{document_id}/content/` has been renamed to `/api/v1.0/documents/{document_id}/formatted-content/`
- There is no more `content` attribute in the response of `/api/v1.0/documents/{document_id}/`, two new endpoints have been added to retrieve or update the document content.
- A new `GET /api/v1.0/documents/{document_id}/content/` endpoint has been implemented to fetch the document content; this endpoint streams the whole content with a `text/plain` content-type response.
- A new `PATCH /api/v1.0/documents/{document_id}/content/` endpoint has been added to update the document content; the expected payload is:
```json
{
"content": "document content in base64"
}
```
Other changes:
- The deprecated endpoint `/api/v1.0/documents/<document_id>/descendants` is removed. The search endpoint should be used instead.
- Upgrade the docspec dependency to version >= 3.0.0
The docspec service API has changed since version 3.0.0; we are now compatible with this version and no longer with version 2.x.x
## [4.6.0] - 2026-02-27
- ⚠️ Some settings have changed to offer greater flexibility and consistency; overriding the favicon and logo is now done from the theme configuration.

View File

@@ -29,8 +29,8 @@ services:
- MINIO_ROOT_USER=impress
- MINIO_ROOT_PASSWORD=password
ports:
- '9000:9000'
- '9001:9001'
- "9000:9000"
- "9001:9001"
healthcheck:
test: ["CMD", "mc", "ready", "local"]
interval: 1s
@@ -81,16 +81,16 @@ services:
- ./src/backend:/app
- ./data/static:/data/static
depends_on:
postgresql:
condition: service_healthy
restart: true
mailcatcher:
condition: service_started
redis:
condition: service_started
createbuckets:
condition: service_started
postgresql:
condition: service_healthy
restart: true
mailcatcher:
condition: service_started
redis:
condition: service_started
createbuckets:
condition: service_started
celery-dev:
user: ${DOCKER_USER:-1000}
image: impress:backend-development
@@ -143,7 +143,7 @@ services:
frontend-development:
user: "${DOCKER_USER:-1000}"
build:
build:
context: .
dockerfile: ./src/frontend/Dockerfile
target: impress-dev
@@ -173,13 +173,13 @@ services:
image: node:22
user: "${DOCKER_USER:-1000}"
environment:
HOME: /tmp
HOME: /tmp
volumes:
- ".:/app"
y-provider-development:
user: ${DOCKER_USER:-1000}
build:
build:
context: .
dockerfile: ./src/frontend/servers/y-provider/Dockerfile
target: y-provider-development
@@ -221,7 +221,11 @@ services:
- --health-enabled=true
- --metrics-enabled=true
healthcheck:
test: ['CMD-SHELL', 'exec 3<>/dev/tcp/localhost/9000; echo -e "GET /health/live HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n" >&3; grep "HTTP/1.1 200 OK" <&3']
test:
[
"CMD-SHELL",
'exec 3<>/dev/tcp/localhost/9000; echo -e "GET /health/live HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n" >&3; grep "HTTP/1.1 200 OK" <&3',
]
start_period: 5s
interval: 1s
timeout: 2s
@@ -235,7 +239,7 @@ services:
KC_DB_PASSWORD: pass
KC_DB_USERNAME: impress
KC_DB_SCHEMA: public
PROXY_ADDRESS_FORWARDING: 'true'
PROXY_ADDRESS_FORWARDING: "true"
ports:
- "8080:8080"
depends_on:
@@ -244,7 +248,7 @@ services:
restart: true
docspec:
image: ghcr.io/docspecio/api:2.6.3
image: ghcr.io/docspecio/api:3.0.1
ports:
- "4000:4000"

View File

@@ -12,6 +12,7 @@ from core.models import DocumentAccess, RoleChoices, get_trashbin_cutoff
ACTION_FOR_METHOD_TO_PERMISSION = {
"versions_detail": {"DELETE": "versions_destroy", "GET": "versions_retrieve"},
"children": {"GET": "children_list", "POST": "children_create"},
"content": {"PATCH": "content_patch", "GET": "content_retrieve"},
}

View File

@@ -16,7 +16,7 @@ from django.utils.translation import gettext_lazy as _
import magic
from rest_framework import serializers
from core import choices, enums, models, utils, validators
from core import choices, enums, models, validators
from core.services import mime_types
from core.services.ai_services import AI_ACTIONS
from core.services.converter_services import (
@@ -178,7 +178,6 @@ class DocumentLightSerializer(serializers.ModelSerializer):
class DocumentSerializer(ListDocumentSerializer):
"""Serialize documents with all fields for display in detail views."""
content = serializers.CharField(required=False)
websocket = serializers.BooleanField(required=False, write_only=True)
file = serializers.FileField(
required=False, write_only=True, allow_null=True, max_length=255
@@ -193,7 +192,6 @@ class DocumentSerializer(ListDocumentSerializer):
"ancestors_link_role",
"computed_link_reach",
"computed_link_role",
"content",
"created_at",
"creator",
"deleted_at",
@@ -242,13 +240,6 @@ class DocumentSerializer(ListDocumentSerializer):
if request:
if request.method == "POST":
fields["id"].read_only = False
if (
serializers.BooleanField().to_internal_value(
request.query_params.get("without_content", False)
)
is True
):
del fields["content"]
return fields
@@ -265,18 +256,6 @@ class DocumentSerializer(ListDocumentSerializer):
return value
def validate_content(self, value):
"""Validate the content field."""
if not value:
return None
try:
b64decode(value, validate=True)
except binascii.Error as err:
raise serializers.ValidationError("Invalid base64 content.") from err
return value
def validate_file(self, file):
"""Add file size and type constraints as defined in settings."""
if not file:
@@ -310,52 +289,33 @@ class DocumentSerializer(ListDocumentSerializer):
return instance # No data provided, skip the update
return super().update(instance, validated_data)
def save(self, **kwargs):
class DocumentContentSerializer(serializers.Serializer):
"""Serializer for updating only the raw content of a document stored in S3."""
content = serializers.CharField(required=True)
websocket = serializers.BooleanField(required=False)
def validate_content(self, value):
"""Validate the content field."""
try:
b64decode(value, validate=True)
except binascii.Error as err:
raise serializers.ValidationError("Invalid base64 content.") from err
return value
def update(self, instance, validated_data):
"""
Process the content field to extract attachment keys and update the document's
"attachments" field for access control.
This serializer does not support updates.
"""
content = self.validated_data.get("content", "")
extracted_attachments = set(utils.extract_attachments(content))
raise NotImplementedError("Update is not supported for this serializer.")
existing_attachments = (
set(self.instance.attachments or []) if self.instance else set()
)
new_attachments = extracted_attachments - existing_attachments
if new_attachments:
attachments_documents = (
models.Document.objects.filter(
attachments__overlap=list(new_attachments)
)
.only("path", "attachments")
.order_by("path")
)
user = self.context["request"].user
readable_per_se_paths = (
models.Document.objects.readable_per_se(user)
.order_by("path")
.values_list("path", flat=True)
)
readable_attachments_paths = utils.filter_descendants(
[doc.path for doc in attachments_documents],
readable_per_se_paths,
skip_sorting=True,
)
readable_attachments = set()
for document in attachments_documents:
if document.path not in readable_attachments_paths:
continue
readable_attachments.update(set(document.attachments) & new_attachments)
# Update attachments with readable keys
self.validated_data["attachments"] = list(
existing_attachments | readable_attachments
)
return super().save(**kwargs)
def create(self, validated_data):
"""
This serializer does not support create.
"""
raise NotImplementedError("Create is not supported for this serializer.")
class DocumentAccessSerializer(serializers.ModelSerializer):

View File

@@ -194,3 +194,8 @@ class AIUserRateThrottle(AIBaseRateThrottle):
if x_forwarded_for
else request.META.get("REMOTE_ADDR")
)
def get_content_metadata_cache_key(document_id):
"""Return the cache key used to store content metadata."""
return f"docs:content-metadata:{document_id!s}"

View File

@@ -3,6 +3,7 @@
# pylint: disable=too-many-lines
import base64
import datetime as dt
import ipaddress
import json
import logging
@@ -776,17 +777,15 @@ class DocumentViewSet(
def perform_update(self, serializer):
"""Check rules about collaboration."""
if (
serializer.validated_data.get("websocket", False)
or not settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY
not serializer.validated_data.get("websocket", False)
and settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY
and not self._can_user_edit_document(serializer.instance.id, set_cache=True)
):
return super().perform_update(serializer)
raise drf.exceptions.PermissionDenied(
"You are not allowed to edit this document."
)
if self._can_user_edit_document(serializer.instance.id, set_cache=True):
return super().perform_update(serializer)
raise drf.exceptions.PermissionDenied(
"You are not allowed to edit this document."
)
return super().perform_update(serializer)
@drf.decorators.action(
detail=True,
@@ -1112,30 +1111,6 @@ class DocumentViewSet(
return self.get_response_for_queryset(queryset)
@drf.decorators.action(
detail=True,
methods=["get"],
ordering=["path"],
)
def descendants(self, request, *args, **kwargs):
"""Deprecated endpoint to list descendants of a document."""
logger.warning(
"The 'descendants' endpoint is deprecated and will be removed in a future release. "
"The search endpoint should be used for all document retrieval use cases."
)
document = self.get_object()
queryset = document.get_descendants().filter(ancestors_deleted_at__isnull=True)
queryset = self.filter_queryset(queryset)
filterset = DocumentFilter(request.GET, queryset=queryset)
if not filterset.is_valid():
raise drf.exceptions.ValidationError(filterset.errors)
queryset = filterset.qs
return self.get_response_for_queryset(queryset)
@drf.decorators.action(
detail=True,
methods=["get"],
@@ -1875,6 +1850,170 @@ class DocumentViewSet(
return drf.response.Response("authorized", headers=request.headers, status=200)
@drf.decorators.action(detail=True, methods=["patch"])
def content(self, request, *args, **kwargs):
"""Update the raw Yjs content of a document stored in S3."""
document = self.get_object()
serializer = serializers.DocumentContentSerializer(data=request.data)
serializer.is_valid(raise_exception=True)
if (
not serializer.validated_data.get("websocket", False)
and settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY
and not self._can_user_edit_document(document.id, set_cache=True)
):
raise drf.exceptions.PermissionDenied(
"You are not allowed to edit this document."
)
content = serializer.validated_data["content"]
try:
extracted_attachments = set(extract_attachments(content))
except ValueError:
return drf_response.Response(
"invalid yjs document", status=status.HTTP_400_BAD_REQUEST
)
existing_attachments = set(document.attachments or [])
new_attachments = extracted_attachments - existing_attachments
# Ensure we update attachments the request user is allowed to read
if new_attachments:
attachments_documents = (
models.Document.objects.filter(
attachments__overlap=list(new_attachments)
)
.only("path", "attachments")
.order_by("path")
)
user = self.request.user
readable_per_se_paths = (
models.Document.objects.readable_per_se(user)
.order_by("path")
.values_list("path", flat=True)
)
readable_attachments_paths = filter_descendants(
[doc.path for doc in attachments_documents],
readable_per_se_paths,
skip_sorting=True,
)
readable_attachments = set()
for attachments_document in attachments_documents:
if attachments_document.path not in readable_attachments_paths:
continue
readable_attachments.update(
set(attachments_document.attachments) & new_attachments
)
# Update attachments with readable keys
document.attachments = list(existing_attachments | readable_attachments)
document.content = content
document.save()
cache.delete(utils.get_content_metadata_cache_key(document.id))
return drf_response.Response(status=status.HTTP_204_NO_CONTENT)
@content.mapping.get
def content_retrieve(self, request, *args, **kwargs):
"""
Retrieve the raw content file from s3 and stream it.
We implement a HTTP cache based on the ETag and LastModified headers.
We retrieve the ETag and LastModified from the S3 head operation, save them in cache to
reuse them in future requests.
We check in the request if the ETag is present in the If-None-Match header and if it's the
same as the one from the S3 head operation, we return a 304 response.
If the ETag is not present or not the same, we do the same check based on the LastModifed
value if present in the If-Modified-Since header.
"""
document = self.get_object()
# The S3 call to fetch the document can take time and the database
# connection is useless in this process. Hence we are closing it now
# to prevent having a massive number of database connections during
# the web-socket re-connection burst.
connection.close()
if not (
content_metadata := cache.get(
utils.get_content_metadata_cache_key(document.id)
)
):
try:
file_metadata = default_storage.connection.meta.client.head_object(
Bucket=default_storage.bucket_name, Key=document.file_key
)
except ClientError:
return StreamingHttpResponse(
b"", content_type="text/plain", status=status.HTTP_200_OK
)
last_modified = file_metadata["LastModified"]
etag = file_metadata["ETag"]
size = file_metadata["ContentLength"]
cache.set(
utils.get_content_metadata_cache_key(document.id),
{
"last_modified": last_modified.isoformat(),
"etag": etag,
"size": size,
},
settings.CONTENT_METADATA_CACHE_TIMEOUT,
)
else:
last_modified = dt.datetime.fromisoformat(
content_metadata.get("last_modified")
)
etag = content_metadata.get("etag")
size = content_metadata.get("size")
# --- Check conditional headers from any client ---
if_none_match = request.META.get("HTTP_IF_NONE_MATCH") # contains ETag
if_modified_since = request.META.get("HTTP_IF_MODIFIED_SINCE")
# Strip the W/ weak prefix. Proxies (e.g. nginx with gzip) convert strong
# ETags to weak ones, so a strict equality check would fail on production
# even when unchanged.
if if_none_match and if_none_match.startswith("W/"):
if_none_match = if_none_match.removeprefix("W/")
if if_none_match and if_none_match == etag:
return drf_response.Response(status=status.HTTP_304_NOT_MODIFIED)
if if_modified_since:
try:
since = dt.datetime.strptime(
if_modified_since, "%a, %d %b %Y %H:%M:%S %Z"
)
except ValueError:
pass
else:
if not since.tzinfo:
since = since.replace(tzinfo=dt.timezone.utc)
if last_modified <= since:
return drf_response.Response(status=status.HTTP_304_NOT_MODIFIED)
def _stream(file_key):
with default_storage.open(file_key, "rb") as f:
while chunk := f.read(8192):
yield chunk
response = StreamingHttpResponse(
streaming_content=_stream(document.file_key),
content_type="text/plain",
status=status.HTTP_200_OK,
)
response["Content-Length"] = size
response["ETag"] = etag
response["Last-Modified"] = last_modified.strftime("%a, %d %b %Y %H:%M:%S %Z")
response["Cache-Control"] = "private, no-cache"
return response
@drf.decorators.action(detail=True, methods=["get"], url_path="media-check")
def media_check(self, request, *args, **kwargs):
"""
@@ -2193,10 +2332,10 @@ class DocumentViewSet(
@drf.decorators.action(
detail=True,
methods=["get"],
url_path="content",
name="Get document content in different formats",
url_path="formatted-content",
name="Convert document content to different formats",
)
def content(self, request, pk=None):
def formatted_content(self, request, pk=None):
"""
Retrieve document content in different formats (JSON, Markdown, HTML).

View File

@@ -1308,7 +1308,9 @@ class Document(MP_Node, BaseModel):
"children_create": can_create_children,
"collaboration_auth": can_get,
"comment": can_comment,
"content": can_get,
"formatted_content": can_get,
"content_patch": can_update,
"content_retrieve": retrieve,
"cors_proxy": can_get,
"descendants": can_get,
"destroy": can_destroy,

View File

@@ -49,7 +49,7 @@ class Converter:
if content_type == mime_types.DOCX and accept == mime_types.YJS:
blocknote_data = self.docspec.convert(
data, mime_types.DOCX, mime_types.BLOCKNOTE
data, content_type, mime_types.BLOCKNOTE
)
return self.ydoc.convert(
blocknote_data, mime_types.BLOCKNOTE, mime_types.YJS
@@ -66,8 +66,11 @@ class DocSpecConverter:
response = requests.post(
url,
headers={"Accept": mime_types.BLOCKNOTE},
files={"file": ("document.docx", data, content_type)},
headers={
"Content-Type": content_type,
"Accept": mime_types.BLOCKNOTE,
},
data=data,
timeout=settings.CONVERSION_API_TIMEOUT,
verify=settings.CONVERSION_API_SECURE,
)

View File

@@ -0,0 +1,440 @@
"""
Tests for the GET /api/v1.0/documents/{id}/content/ endpoint.
"""
from datetime import timedelta
from uuid import uuid4
from django.core.cache import cache
from django.core.files.storage import default_storage
from django.utils import timezone
import pytest
from rest_framework import status
from rest_framework.test import APIClient
from core import factories
from core.api.utils import get_content_metadata_cache_key
from core.tests.conftest import TEAM, USER, VIA
pytestmark = pytest.mark.django_db
@pytest.mark.parametrize("reach", ["authenticated", "restricted"])
def test_api_documents_content_retrieve_anonymous_non_public(reach):
"""Anonymous users cannot retrieve content of non-public documents."""
document = factories.DocumentFactory(link_reach=reach)
response = APIClient().get(f"/api/v1.0/documents/{document.id!s}/content/")
assert response.status_code == status.HTTP_401_UNAUTHORIZED
def test_api_documents_content_retrieve_anonymous_public():
"""Anonymous users can retrieve content of a public document."""
document = factories.DocumentFactory(link_reach="public")
assert not cache.get(get_content_metadata_cache_key(document.id))
response = APIClient().get(f"/api/v1.0/documents/{document.id!s}/content/")
assert response.status_code == status.HTTP_200_OK
assert response["Content-Type"] == "text/plain"
assert b"".join(
response.streaming_content
) == factories.YDOC_HELLO_WORLD_BASE64.encode("utf-8")
assert response["Content-Length"] is not None
assert response["ETag"] is not None
assert response["Last-Modified"] is not None
assert response["Cache-Control"] == "private, no-cache"
assert cache.get(get_content_metadata_cache_key(document.id))
def test_api_documents_content_retrieve_authenticated_no_access():
"""Authenticated users without access cannot retrieve content of a restricted document."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
client = APIClient()
client.force_login(user)
response = client.get(f"/api/v1.0/documents/{document.id!s}/content/")
assert response.status_code == status.HTTP_403_FORBIDDEN
@pytest.mark.parametrize("link_reach", ["authenticated", "public"])
def test_api_documents_content_retrieve_authenticated_not_restricted(link_reach):
"""
Authenticated users can retrieve content of a non-restricted document
without any explicit access grant.
"""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach=link_reach)
client = APIClient()
client.force_login(user)
assert not cache.get(get_content_metadata_cache_key(document.id))
response = client.get(f"/api/v1.0/documents/{document.id!s}/content/")
assert response.status_code == status.HTTP_200_OK
assert b"".join(
response.streaming_content
) == factories.YDOC_HELLO_WORLD_BASE64.encode("utf-8")
assert response["Content-Length"] is not None
assert response["ETag"] is not None
assert response["Last-Modified"] is not None
assert response["Cache-Control"] == "private, no-cache"
assert cache.get(get_content_metadata_cache_key(document.id))
@pytest.mark.parametrize("via", VIA)
@pytest.mark.parametrize(
"role", ["reader", "commenter", "editor", "administrator", "owner"]
)
def test_api_documents_content_retrieve_success(role, via, mock_user_teams):
"""Users with any role can retrieve document content, directly or via a team."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
if via == USER:
factories.UserDocumentAccessFactory(document=document, user=user, role=role)
elif via == TEAM:
mock_user_teams.return_value = ["lasuite"]
factories.TeamDocumentAccessFactory(
document=document, team="lasuite", role=role
)
client = APIClient()
client.force_login(user)
assert not cache.get(get_content_metadata_cache_key(document.id))
response = client.get(f"/api/v1.0/documents/{document.id!s}/content/")
assert response.status_code == status.HTTP_200_OK
assert b"".join(
response.streaming_content
) == factories.YDOC_HELLO_WORLD_BASE64.encode("utf-8")
assert response["Content-Length"] is not None
assert response["ETag"] is not None
assert response["Last-Modified"] is not None
assert response["Cache-Control"] == "private, no-cache"
assert cache.get(get_content_metadata_cache_key(document.id))
def test_api_documents_content_retrieve_nonexistent_document():
"""Retrieving content of a non-existent document returns 404."""
user = factories.UserFactory()
client = APIClient()
client.force_login(user)
response = client.get(f"/api/v1.0/documents/{uuid4()!s}/content/")
assert response.status_code == status.HTTP_404_NOT_FOUND
def test_api_documents_content_retrieve_file_not_in_storage():
"""Returns an empty string when the file does not exist in storage."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="reader")
client = APIClient()
client.force_login(user)
default_storage.delete(document.file_key)
assert not default_storage.exists(document.file_key)
response = client.get(f"/api/v1.0/documents/{document.id!s}/content/")
assert response.status_code == status.HTTP_200_OK
assert b"".join(response.streaming_content) == b""
assert not response.get("Content-Length")
assert not response.get("ETag")
assert not response.get("Last-Modified")
assert not response.get("Cache-Control")
assert not cache.get(get_content_metadata_cache_key(document.id))
def test_api_documents_content_retrieve_content_length_header():
"""The response includes the Content-Length header when available from storage."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="reader")
client = APIClient()
client.force_login(user)
response = client.get(f"/api/v1.0/documents/{document.id!s}/content/")
assert response.status_code == status.HTTP_200_OK
expected_size = default_storage.size(document.file_key)
assert int(response["Content-Length"]) == expected_size
@pytest.mark.parametrize("role", ["reader", "commenter", "editor", "administrator"])
def test_api_documents_content_retrieve_deleted_document_for_non_owners_all_roles(role):
"""
Retrieving content of a soft-deleted document returns 404 for any non-owner role.
"""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role=role)
document.soft_delete()
document.refresh_from_db()
client = APIClient()
client.force_login(user)
response = client.get(f"/api/v1.0/documents/{document.id!s}/content/")
assert response.status_code == status.HTTP_404_NOT_FOUND
def test_api_documents_content_retrieve_deleted_document_for_owner():
"""
Owners can still retrieve content of a soft-deleted document.
The 'retrieve' ability is True for owners regardless of deletion state.
"""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="owner")
document.soft_delete()
document.refresh_from_db()
client = APIClient()
client.force_login(user)
assert not cache.get(get_content_metadata_cache_key(document.id))
response = client.get(f"/api/v1.0/documents/{document.id!s}/content/")
assert response.status_code == status.HTTP_200_OK
assert b"".join(
response.streaming_content
) == factories.YDOC_HELLO_WORLD_BASE64.encode("utf-8")
assert response["Content-Length"] is not None
assert response["ETag"] is not None
assert response["Last-Modified"] is not None
assert response["Cache-Control"] == "private, no-cache"
assert cache.get(get_content_metadata_cache_key(document.id))
def test_api_documents_content_retrieve_reusing_etag():
"""Fetching content reusing a valid ETag header should return a 304."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="owner")
client = APIClient()
client.force_login(user)
file_metadata = default_storage.connection.meta.client.head_object(
Bucket=default_storage.bucket_name, Key=document.file_key
)
last_modified = file_metadata["LastModified"]
etag = file_metadata["ETag"]
size = file_metadata["ContentLength"]
cache.set(
get_content_metadata_cache_key(document.id),
{
"last_modified": last_modified.isoformat(),
"etag": etag,
"size": size,
},
)
response = client.get(
f"/api/v1.0/documents/{document.id!s}/content/",
headers={"If-None-Match": etag},
)
assert response.status_code == status.HTTP_304_NOT_MODIFIED
def test_api_documents_content_retrieve_reusing_invalid_etag():
"""Fetching content using an invalid ETag header should return a 200."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="owner")
client = APIClient()
client.force_login(user)
file_metadata = default_storage.connection.meta.client.head_object(
Bucket=default_storage.bucket_name, Key=document.file_key
)
last_modified = file_metadata["LastModified"]
etag = file_metadata["ETag"]
size = file_metadata["ContentLength"]
cache.set(
get_content_metadata_cache_key(document.id),
{
"last_modified": last_modified.isoformat(),
"etag": etag,
"size": size,
},
)
response = client.get(
f"/api/v1.0/documents/{document.id!s}/content/",
headers={"If-None-Match": "invalid"},
)
assert response.status_code == status.HTTP_200_OK
assert b"".join(
response.streaming_content
) == factories.YDOC_HELLO_WORLD_BASE64.encode("utf-8")
assert response["Content-Length"] is not None
assert response["ETag"] is not None
assert response["Last-Modified"] is not None
assert response["Cache-Control"] == "private, no-cache"
def test_api_documents_content_retrieve_using_etag_without_cache():
"""
Fetching content using a valid ETag header but without existing cache should return a 304.
"""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="owner")
client = APIClient()
client.force_login(user)
file_metadata = default_storage.connection.meta.client.head_object(
Bucket=default_storage.bucket_name, Key=document.file_key
)
etag = file_metadata["ETag"]
assert not cache.get(get_content_metadata_cache_key(document.id))
response = client.get(
f"/api/v1.0/documents/{document.id!s}/content/",
headers={"If-None-Match": etag},
)
assert response.status_code == status.HTTP_304_NOT_MODIFIED
def test_api_documents_content_retrieve_reusing_last_modified_since():
"""Fetching a content using a If-Modified-Since valid should return a 304."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="owner")
client = APIClient()
client.force_login(user)
file_metadata = default_storage.connection.meta.client.head_object(
Bucket=default_storage.bucket_name, Key=document.file_key
)
last_modified = file_metadata["LastModified"]
etag = file_metadata["ETag"]
size = file_metadata["ContentLength"]
cache.set(
get_content_metadata_cache_key(document.id),
{
"last_modified": last_modified.isoformat(),
"etag": etag,
"size": size,
},
)
response = client.get(
f"/api/v1.0/documents/{document.id!s}/content/",
headers={
"If-Modified-Since": timezone.now().strftime("%a, %d %b %Y %H:%M:%S %Z")
},
)
assert response.status_code == status.HTTP_304_NOT_MODIFIED
def test_api_documents_content_retrieve_using_last_modified_since_without_cache():
"""
Fetching content using a valid If-Modified-Since header should return a 304
even if the content metadata is not present in the cache.
"""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="owner")
client = APIClient()
client.force_login(user)
assert not cache.get(get_content_metadata_cache_key(document.id))
response = client.get(
f"/api/v1.0/documents/{document.id!s}/content/",
headers={
"If-Modified-Since": timezone.now().strftime("%a, %d %b %Y %H:%M:%S %Z")
},
)
assert response.status_code == status.HTTP_304_NOT_MODIFIED
def test_api_documents_content_retrieve_reusing_last_modified_since_invalid():
"""Fetching a content using a If-Modified-Since invalid should return a 200."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="owner")
client = APIClient()
client.force_login(user)
file_metadata = default_storage.connection.meta.client.head_object(
Bucket=default_storage.bucket_name, Key=document.file_key
)
last_modified = file_metadata["LastModified"]
etag = file_metadata["ETag"]
size = file_metadata["ContentLength"]
cache.set(
get_content_metadata_cache_key(document.id),
{
"last_modified": last_modified.isoformat(),
"etag": etag,
"size": size,
},
)
response = client.get(
f"/api/v1.0/documents/{document.id!s}/content/",
headers={
"If-Modified-Since": (timezone.now() - timedelta(minutes=60)).strftime(
"%a, %d %b %Y %H:%M:%S %Z"
)
},
)
assert response.status_code == status.HTTP_200_OK
assert b"".join(
response.streaming_content
) == factories.YDOC_HELLO_WORLD_BASE64.encode("utf-8")
assert response["Content-Length"] is not None
assert response["ETag"] is not None
assert response["Last-Modified"] is not None
assert response["Cache-Control"] == "private, no-cache"


@@ -0,0 +1,587 @@
"""
Tests for the PATCH /api/v1.0/documents/{id}/content/ endpoint.
"""
import base64
from functools import cache
from uuid import uuid4
from django.core.cache import cache as django_cache
from django.core.files.storage import default_storage
import pycrdt
import pytest
import responses
from rest_framework import status
from rest_framework.test import APIClient
from core import factories, models
from core.tests.conftest import TEAM, USER, VIA
pytestmark = pytest.mark.django_db
@cache
def get_sample_ydoc():
"""Return a ydoc from text for testing purposes."""
ydoc = pycrdt.Doc()
ydoc["document-store"] = pycrdt.Text("Hello")
update = ydoc.get_update()
return base64.b64encode(update).decode("utf-8")
def get_s3_content(document):
"""Read the raw content currently stored in S3 for the given document."""
with default_storage.open(document.file_key, mode="rb") as file:
return file.read().decode()
def test_api_documents_content_update_anonymous():
"""Anonymous users without access cannot update document content."""
document = factories.DocumentFactory(link_reach="restricted")
response = APIClient().patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc()},
)
assert response.status_code == status.HTTP_401_UNAUTHORIZED
def test_api_documents_content_update_authenticated_no_access():
"""Authenticated users without access cannot update document content."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
client = APIClient()
client.force_login(user)
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc()},
)
assert response.status_code == status.HTTP_403_FORBIDDEN
@pytest.mark.parametrize("role", ["reader", "commenter"])
def test_api_documents_content_update_read_only_role(role):
"""Users with reader or commenter role cannot update document content."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role=role)
client = APIClient()
client.force_login(user)
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc()},
)
assert response.status_code == status.HTTP_403_FORBIDDEN
@pytest.mark.parametrize("via", VIA)
@pytest.mark.parametrize("role", ["editor", "administrator", "owner"])
def test_api_documents_content_update_success(role, via, mock_user_teams):
"""Users with editor, administrator, or owner role can update document content."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
if via == USER:
factories.UserDocumentAccessFactory(document=document, user=user, role=role)
elif via == TEAM:
mock_user_teams.return_value = ["lasuite"]
factories.TeamDocumentAccessFactory(
document=document, team="lasuite", role=role
)
client = APIClient()
client.force_login(user)
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc(), "websocket": True},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
assert get_s3_content(document) == get_sample_ydoc()
def test_api_documents_content_update_missing_content_field():
"""A request body without the content field returns 400."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="editor")
client = APIClient()
client.force_login(user)
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{},
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert response.json() == {
"content": [
"This field is required.",
]
}
def test_api_documents_content_update_invalid_base64():
"""A non-base64 content value returns 400."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="editor")
client = APIClient()
client.force_login(user)
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": "not-valid-base64!!!"},
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert response.json() == {
"content": [
"Invalid base64 content.",
]
}
def test_api_documents_content_update_nonexistent_document():
"""Updating the content of a non-existent document returns 404."""
user = factories.UserFactory()
client = APIClient()
client.force_login(user)
response = client.patch(
f"/api/v1.0/documents/{uuid4()!s}/content/",
{"content": get_sample_ydoc()},
)
assert response.status_code == status.HTTP_404_NOT_FOUND
def test_api_documents_content_update_replaces_existing():
"""Patching content replaces whatever was previously in S3."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="editor")
client = APIClient()
client.force_login(user)
assert get_s3_content(document) == factories.YDOC_HELLO_WORLD_BASE64
new_content = get_sample_ydoc()
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": new_content, "websocket": True},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
assert get_s3_content(document) == new_content
@pytest.mark.parametrize("role", ["editor", "administrator"])
def test_api_documents_content_update_deleted_document_for_non_owners(role):
"""Updating content on a soft-deleted document returns 404 for non-owners.
Soft-deleted documents are excluded from the queryset for non-owners,
so the endpoint returns 404 rather than 403.
"""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role=role)
document.soft_delete()
document.refresh_from_db()
client = APIClient()
client.force_login(user)
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc()},
)
assert response.status_code == status.HTTP_404_NOT_FOUND
def test_api_documents_content_update_deleted_document_for_owners():
"""Updating content on a soft-deleted document returns 403 for owners."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="owner")
document.soft_delete()
document.refresh_from_db()
client = APIClient()
client.force_login(user)
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc()},
)
assert response.status_code == status.HTTP_403_FORBIDDEN
def test_api_documents_content_update_link_editor():
"""
A public document with link_role=editor allows any authenticated user to
update content via the link role.
"""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="public", link_role="editor")
client = APIClient()
client.force_login(user)
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc(), "websocket": True},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
assert get_s3_content(document) == get_sample_ydoc()
assert models.Document.objects.filter(id=document.id).exists()
@responses.activate
def test_api_documents_content_update_authenticated_no_websocket(settings):
"""
When a user who is not connected to the websocket updates the document content
and is the first session to do so, the content should be updated.
"""
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
endpoint_url = (
f"{settings.COLLABORATION_API_URL}get-connections/"
f"?room={document.id}&sessionKey={session_key}"
)
ws_resp = responses.get(endpoint_url, json={"count": 0, "exists": False})
assert django_cache.get(f"docs:no-websocket:{document.id}") is None
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc(), "websocket": False},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
assert get_s3_content(document) == get_sample_ydoc()
assert django_cache.get(f"docs:no-websocket:{document.id}") == session_key
assert ws_resp.call_count == 1
@responses.activate
def test_api_documents_content_update_authenticated_no_websocket_user_already_editing(
settings,
):
"""
When a user who is not connected to the websocket updates the document content
while another session is already editing, the update should be denied.
"""
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
endpoint_url = (
f"{settings.COLLABORATION_API_URL}get-connections/"
f"?room={document.id}&sessionKey={session_key}"
)
ws_resp = responses.get(endpoint_url, json={"count": 0, "exists": False})
django_cache.set(f"docs:no-websocket:{document.id}", "other_session_key")
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc(), "websocket": False},
)
assert response.status_code == status.HTTP_403_FORBIDDEN
assert response.json() == {"detail": "You are not allowed to edit this document."}
assert ws_resp.call_count == 1
@responses.activate
def test_api_documents_content_update_no_websocket_other_user_connected_to_websocket(
settings,
):
"""
When a user updates document content without a websocket connection while another
user is connected to the websocket, the update should be denied.
"""
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
endpoint_url = (
f"{settings.COLLABORATION_API_URL}get-connections/"
f"?room={document.id}&sessionKey={session_key}"
)
ws_resp = responses.get(endpoint_url, json={"count": 3, "exists": False})
assert django_cache.get(f"docs:no-websocket:{document.id}") is None
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc(), "websocket": False},
)
assert response.status_code == status.HTTP_403_FORBIDDEN
assert response.json() == {"detail": "You are not allowed to edit this document."}
assert django_cache.get(f"docs:no-websocket:{document.id}") is None
assert ws_resp.call_count == 1
@responses.activate
def test_api_documents_content_update_user_connected_to_websocket(settings):
"""
When a user updates document content and is connected to the websocket,
the content should be updated.
"""
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
endpoint_url = (
f"{settings.COLLABORATION_API_URL}get-connections/"
f"?room={document.id}&sessionKey={session_key}"
)
ws_resp = responses.get(endpoint_url, json={"count": 3, "exists": True})
assert django_cache.get(f"docs:no-websocket:{document.id}") is None
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc(), "websocket": False},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
assert get_s3_content(document) == get_sample_ydoc()
assert django_cache.get(f"docs:no-websocket:{document.id}") is None
assert ws_resp.call_count == 1
@responses.activate
def test_api_documents_content_update_websocket_server_unreachable_fallback_to_no_websocket(
settings,
):
"""
When the websocket server is unreachable, the content should be updated as if the
user were not connected to the websocket.
"""
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
endpoint_url = (
f"{settings.COLLABORATION_API_URL}get-connections/"
f"?room={document.id}&sessionKey={session_key}"
)
ws_resp = responses.get(endpoint_url, status=500)
assert django_cache.get(f"docs:no-websocket:{document.id}") is None
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc(), "websocket": False},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
assert get_s3_content(document) == get_sample_ydoc()
assert django_cache.get(f"docs:no-websocket:{document.id}") == session_key
assert ws_resp.call_count == 1
@responses.activate
def test_api_content_update_websocket_server_unreachable_fallback_to_no_websocket_other_users(
settings,
):
"""
When the websocket server is unreachable, the behavior falls back to the no-websocket path.
If another user is already editing, the content update should be denied.
"""
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
endpoint_url = (
f"{settings.COLLABORATION_API_URL}get-connections/"
f"?room={document.id}&sessionKey={session_key}"
)
ws_resp = responses.get(endpoint_url, status=500)
django_cache.set(f"docs:no-websocket:{document.id}", "other_session_key")
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc(), "websocket": False},
)
assert response.status_code == status.HTTP_403_FORBIDDEN
assert django_cache.get(f"docs:no-websocket:{document.id}") == "other_session_key"
assert ws_resp.call_count == 1
@responses.activate
def test_api_content_update_websocket_server_room_not_found_fallback_to_no_websocket_other_users(
settings,
):
"""
When the WebSocket server does not have the room created, the logic should fall back to
the no-WebSocket path. If another user is already editing, the update must be denied.
"""
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
endpoint_url = (
f"{settings.COLLABORATION_API_URL}get-connections/"
f"?room={document.id}&sessionKey={session_key}"
)
ws_resp = responses.get(endpoint_url, status=404)
django_cache.set(f"docs:no-websocket:{document.id}", "other_session_key")
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc(), "websocket": False},
)
assert response.status_code == status.HTTP_403_FORBIDDEN
assert django_cache.get(f"docs:no-websocket:{document.id}") == "other_session_key"
assert ws_resp.call_count == 1
@responses.activate
def test_api_documents_content_update_force_websocket_param_to_true(settings):
"""
When the websocket parameter is set to true, the content should be updated without any check.
"""
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
endpoint_url = (
f"{settings.COLLABORATION_API_URL}get-connections/"
f"?room={document.id}&sessionKey={session_key}"
)
ws_resp = responses.get(endpoint_url, status=500)
assert django_cache.get(f"docs:no-websocket:{document.id}") is None
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc(), "websocket": True},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
assert get_s3_content(document) == get_sample_ydoc()
assert django_cache.get(f"docs:no-websocket:{document.id}") is None
assert ws_resp.call_count == 0
@responses.activate
def test_api_documents_content_update_feature_flag_disabled(settings):
"""
When the feature flag is disabled, the content should be updated without any check.
"""
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = False
endpoint_url = (
f"{settings.COLLABORATION_API_URL}get-connections/"
f"?room={document.id}&sessionKey={session_key}"
)
ws_resp = responses.get(endpoint_url, status=500)
assert django_cache.get(f"docs:no-websocket:{document.id}") is None
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_sample_ydoc(), "websocket": False},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
assert get_s3_content(document) == get_sample_ydoc()
assert django_cache.get(f"docs:no-websocket:{document.id}") is None
assert ws_resp.call_count == 0
def test_api_documents_content_update_invalid_yjs_doc():
"""Sending an invalid yjs doc as content should return a 400."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach="restricted")
factories.UserDocumentAccessFactory(document=document, user=user, role="editor")
client = APIClient()
client.force_login(user)
assert get_s3_content(document) == factories.YDOC_HELLO_WORLD_BASE64
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{
"content": base64.b64encode(b"invalid yjs").decode("utf-8"),
"websocket": True,
},
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
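The no-websocket edit lock these tests exercise can be sketched as a cache-based mutex. This is a hedged approximation of the behavior under test, not the actual view code: `can_update_without_websocket` is a hypothetical name, and a plain dict stands in for the Django cache keyed by `docs:no-websocket:<document id>`.

```python
def can_update_without_websocket(cache, document_id, session_key, connection_count, user_connected):
    """Decide whether a PATCH with websocket=False may proceed.

    Mirrors the tested behavior: a user connected to the websocket room may
    always write (the collaboration server handles concurrency); otherwise the
    first session to write takes a cache lock and other sessions are denied.
    """
    if user_connected:
        return True  # the user is in the room; the websocket server arbitrates
    if connection_count > 0:
        return False  # other users are editing over the websocket
    key = f"docs:no-websocket:{document_id}"
    holder = cache.get(key)
    if holder is not None and holder != session_key:
        return False  # another session already holds the no-websocket lock
    cache[key] = session_key  # take (or refresh) the lock for this session
    return True
```

When the collaboration server is unreachable (the 500/404 tests above), the view behaves as if `connection_count` were 0 and `user_connected` were False, so the cache lock alone decides; and when `websocket=True` is sent or the feature flag is off, this check is skipped entirely.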


@@ -1,807 +0,0 @@
"""
Tests for Documents API endpoint in impress's core app: descendants
"""
import random
from django.contrib.auth.models import AnonymousUser
import pytest
from rest_framework.test import APIClient
from core import factories
pytestmark = pytest.mark.django_db
def test_api_documents_descendants_list_anonymous_public_standalone():
"""Anonymous users should be allowed to retrieve the descendants of a public document."""
document = factories.DocumentFactory(link_reach="public")
child1, child2 = factories.DocumentFactory.create_batch(2, parent=document)
grand_child = factories.DocumentFactory(parent=child1)
factories.UserDocumentAccessFactory(document=child1)
response = APIClient().get(f"/api/v1.0/documents/{document.id!s}/descendants/")
assert response.status_code == 200
assert response.json() == {
"count": 3,
"next": None,
"previous": None,
"results": [
{
"abilities": child1.get_abilities(AnonymousUser()),
"ancestors_link_reach": "public",
"ancestors_link_role": document.link_role,
"computed_link_reach": child1.computed_link_reach,
"computed_link_role": child1.computed_link_role,
"created_at": child1.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child1.creator.id),
"deleted_at": None,
"depth": 2,
"excerpt": child1.excerpt,
"id": str(child1.id),
"is_favorite": False,
"link_reach": child1.link_reach,
"link_role": child1.link_role,
"numchild": 1,
"nb_accesses_ancestors": 1,
"nb_accesses_direct": 1,
"path": child1.path,
"title": child1.title,
"updated_at": child1.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": None,
},
{
"abilities": grand_child.get_abilities(AnonymousUser()),
"ancestors_link_reach": "public",
"ancestors_link_role": "editor"
if (child1.link_reach == "public" and child1.link_role == "editor")
else document.link_role,
"computed_link_reach": "public",
"computed_link_role": grand_child.computed_link_role,
"created_at": grand_child.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(grand_child.creator.id),
"deleted_at": None,
"depth": 3,
"excerpt": grand_child.excerpt,
"id": str(grand_child.id),
"is_favorite": False,
"link_reach": grand_child.link_reach,
"link_role": grand_child.link_role,
"numchild": 0,
"nb_accesses_ancestors": 1,
"nb_accesses_direct": 0,
"path": grand_child.path,
"title": grand_child.title,
"updated_at": grand_child.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": None,
},
{
"abilities": child2.get_abilities(AnonymousUser()),
"ancestors_link_reach": "public",
"ancestors_link_role": document.link_role,
"computed_link_reach": "public",
"computed_link_role": child2.computed_link_role,
"created_at": child2.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child2.creator.id),
"deleted_at": None,
"depth": 2,
"excerpt": child2.excerpt,
"id": str(child2.id),
"is_favorite": False,
"link_reach": child2.link_reach,
"link_role": child2.link_role,
"numchild": 0,
"nb_accesses_ancestors": 0,
"nb_accesses_direct": 0,
"path": child2.path,
"title": child2.title,
"updated_at": child2.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": None,
},
],
}
def test_api_documents_descendants_list_anonymous_public_parent():
"""
Anonymous users should be allowed to retrieve the descendants of a document that
has a public ancestor.
"""
grand_parent = factories.DocumentFactory(link_reach="public")
parent = factories.DocumentFactory(
parent=grand_parent, link_reach=random.choice(["authenticated", "restricted"])
)
document = factories.DocumentFactory(
link_reach=random.choice(["authenticated", "restricted"]), parent=parent
)
child1, child2 = factories.DocumentFactory.create_batch(2, parent=document)
grand_child = factories.DocumentFactory(parent=child1)
factories.UserDocumentAccessFactory(document=child1)
response = APIClient().get(f"/api/v1.0/documents/{document.id!s}/descendants/")
assert response.status_code == 200
assert response.json() == {
"count": 3,
"next": None,
"previous": None,
"results": [
{
"abilities": child1.get_abilities(AnonymousUser()),
"ancestors_link_reach": "public",
"ancestors_link_role": grand_parent.link_role,
"computed_link_reach": child1.computed_link_reach,
"computed_link_role": child1.computed_link_role,
"created_at": child1.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child1.creator.id),
"deleted_at": None,
"depth": 4,
"excerpt": child1.excerpt,
"id": str(child1.id),
"is_favorite": False,
"link_reach": child1.link_reach,
"link_role": child1.link_role,
"numchild": 1,
"nb_accesses_ancestors": 1,
"nb_accesses_direct": 1,
"path": child1.path,
"title": child1.title,
"updated_at": child1.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": None,
},
{
"abilities": grand_child.get_abilities(AnonymousUser()),
"ancestors_link_reach": "public",
"ancestors_link_role": grand_child.ancestors_link_role,
"computed_link_reach": "public",
"computed_link_role": grand_child.computed_link_role,
"created_at": grand_child.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(grand_child.creator.id),
"deleted_at": None,
"depth": 5,
"excerpt": grand_child.excerpt,
"id": str(grand_child.id),
"is_favorite": False,
"link_reach": grand_child.link_reach,
"link_role": grand_child.link_role,
"numchild": 0,
"nb_accesses_ancestors": 1,
"nb_accesses_direct": 0,
"path": grand_child.path,
"title": grand_child.title,
"updated_at": grand_child.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": None,
},
{
"abilities": child2.get_abilities(AnonymousUser()),
"ancestors_link_reach": "public",
"ancestors_link_role": grand_parent.link_role,
"computed_link_reach": "public",
"computed_link_role": child2.computed_link_role,
"created_at": child2.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child2.creator.id),
"deleted_at": None,
"depth": 4,
"excerpt": child2.excerpt,
"id": str(child2.id),
"is_favorite": False,
"link_reach": child2.link_reach,
"link_role": child2.link_role,
"numchild": 0,
"nb_accesses_ancestors": 0,
"nb_accesses_direct": 0,
"path": child2.path,
"title": child2.title,
"updated_at": child2.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": None,
},
],
}
@pytest.mark.parametrize("reach", ["restricted", "authenticated"])
def test_api_documents_descendants_list_anonymous_restricted_or_authenticated(reach):
"""
Anonymous users should not be able to retrieve descendants of a document that is not public.
"""
document = factories.DocumentFactory(link_reach=reach)
child = factories.DocumentFactory(parent=document)
_grand_child = factories.DocumentFactory(parent=child)
response = APIClient().get(f"/api/v1.0/documents/{document.id!s}/descendants/")
assert response.status_code == 401
assert response.json() == {
"detail": "Authentication credentials were not provided."
}
@pytest.mark.parametrize("reach", ["public", "authenticated"])
def test_api_documents_descendants_list_authenticated_unrelated_public_or_authenticated(
reach,
):
"""
Authenticated users should be able to retrieve the descendants of a public/authenticated
document to which they are not related.
"""
user = factories.UserFactory()
client = APIClient()
client.force_login(user)
document = factories.DocumentFactory(link_reach=reach)
child1, child2 = factories.DocumentFactory.create_batch(
2, parent=document, link_reach="restricted"
)
grand_child = factories.DocumentFactory(parent=child1)
factories.UserDocumentAccessFactory(document=child1)
response = client.get(
f"/api/v1.0/documents/{document.id!s}/descendants/",
)
assert response.status_code == 200
assert response.json() == {
"count": 3,
"next": None,
"previous": None,
"results": [
{
"abilities": child1.get_abilities(user),
"ancestors_link_reach": reach,
"ancestors_link_role": document.link_role,
"computed_link_reach": child1.computed_link_reach,
"computed_link_role": child1.computed_link_role,
"created_at": child1.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child1.creator.id),
"deleted_at": None,
"depth": 2,
"excerpt": child1.excerpt,
"id": str(child1.id),
"is_favorite": False,
"link_reach": child1.link_reach,
"link_role": child1.link_role,
"numchild": 1,
"nb_accesses_ancestors": 1,
"nb_accesses_direct": 1,
"path": child1.path,
"title": child1.title,
"updated_at": child1.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": None,
},
{
"abilities": grand_child.get_abilities(user),
"ancestors_link_reach": reach,
"ancestors_link_role": document.link_role,
"computed_link_reach": grand_child.computed_link_reach,
"computed_link_role": grand_child.computed_link_role,
"created_at": grand_child.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(grand_child.creator.id),
"deleted_at": None,
"depth": 3,
"excerpt": grand_child.excerpt,
"id": str(grand_child.id),
"is_favorite": False,
"link_reach": grand_child.link_reach,
"link_role": grand_child.link_role,
"numchild": 0,
"nb_accesses_ancestors": 1,
"nb_accesses_direct": 0,
"path": grand_child.path,
"title": grand_child.title,
"updated_at": grand_child.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": None,
},
{
"abilities": child2.get_abilities(user),
"ancestors_link_reach": reach,
"ancestors_link_role": document.link_role,
"computed_link_reach": child2.computed_link_reach,
"computed_link_role": child2.computed_link_role,
"created_at": child2.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child2.creator.id),
"deleted_at": None,
"depth": 2,
"excerpt": child2.excerpt,
"id": str(child2.id),
"is_favorite": False,
"link_reach": child2.link_reach,
"link_role": child2.link_role,
"numchild": 0,
"nb_accesses_ancestors": 0,
"nb_accesses_direct": 0,
"path": child2.path,
"title": child2.title,
"updated_at": child2.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": None,
},
],
}
@pytest.mark.parametrize("reach", ["public", "authenticated"])
def test_api_documents_descendants_list_authenticated_public_or_authenticated_parent(
reach,
):
"""
    Authenticated users should be allowed to retrieve the descendants of a document that
    has a public or authenticated ancestor.
"""
user = factories.UserFactory()
client = APIClient()
client.force_login(user)
grand_parent = factories.DocumentFactory(link_reach=reach)
parent = factories.DocumentFactory(parent=grand_parent, link_reach="restricted")
document = factories.DocumentFactory(link_reach="restricted", parent=parent)
child1, child2 = factories.DocumentFactory.create_batch(
2, parent=document, link_reach="restricted"
)
grand_child = factories.DocumentFactory(parent=child1)
factories.UserDocumentAccessFactory(document=child1)
response = client.get(f"/api/v1.0/documents/{document.id!s}/descendants/")
assert response.status_code == 200
assert response.json() == {
"count": 3,
"next": None,
"previous": None,
"results": [
{
"abilities": child1.get_abilities(user),
"ancestors_link_reach": reach,
"ancestors_link_role": grand_parent.link_role,
"computed_link_reach": child1.computed_link_reach,
"computed_link_role": child1.computed_link_role,
"created_at": child1.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child1.creator.id),
"deleted_at": None,
"depth": 4,
"excerpt": child1.excerpt,
"id": str(child1.id),
"is_favorite": False,
"link_reach": child1.link_reach,
"link_role": child1.link_role,
"numchild": 1,
"nb_accesses_ancestors": 1,
"nb_accesses_direct": 1,
"path": child1.path,
"title": child1.title,
"updated_at": child1.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": None,
},
{
"abilities": grand_child.get_abilities(user),
"ancestors_link_reach": reach,
"ancestors_link_role": grand_parent.link_role,
"computed_link_reach": grand_child.computed_link_reach,
"computed_link_role": grand_child.computed_link_role,
"created_at": grand_child.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(grand_child.creator.id),
"deleted_at": None,
"depth": 5,
"excerpt": grand_child.excerpt,
"id": str(grand_child.id),
"is_favorite": False,
"link_reach": grand_child.link_reach,
"link_role": grand_child.link_role,
"numchild": 0,
"nb_accesses_ancestors": 1,
"nb_accesses_direct": 0,
"path": grand_child.path,
"title": grand_child.title,
"updated_at": grand_child.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": None,
},
{
"abilities": child2.get_abilities(user),
"ancestors_link_reach": reach,
"ancestors_link_role": grand_parent.link_role,
"computed_link_reach": child2.computed_link_reach,
"computed_link_role": child2.computed_link_role,
"created_at": child2.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child2.creator.id),
"deleted_at": None,
"depth": 4,
"excerpt": child2.excerpt,
"id": str(child2.id),
"is_favorite": False,
"link_reach": child2.link_reach,
"link_role": child2.link_role,
"numchild": 0,
"nb_accesses_ancestors": 0,
"nb_accesses_direct": 0,
"path": child2.path,
"title": child2.title,
"updated_at": child2.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": None,
},
],
}
def test_api_documents_descendants_list_authenticated_unrelated_restricted():
"""
Authenticated users should not be allowed to retrieve the descendants of a document that is
restricted and to which they are not related.
"""
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
document = factories.DocumentFactory(link_reach="restricted")
child1, _child2 = factories.DocumentFactory.create_batch(2, parent=document)
_grand_child = factories.DocumentFactory(parent=child1)
factories.UserDocumentAccessFactory(document=child1)
response = client.get(
f"/api/v1.0/documents/{document.id!s}/descendants/",
)
assert response.status_code == 403
assert response.json() == {
"detail": "You do not have permission to perform this action."
}
def test_api_documents_descendants_list_authenticated_related_direct():
"""
Authenticated users should be allowed to retrieve the descendants of a document
to which they are directly related whatever the role.
"""
user = factories.UserFactory()
client = APIClient()
client.force_login(user)
document = factories.DocumentFactory()
access = factories.UserDocumentAccessFactory(document=document, user=user)
factories.UserDocumentAccessFactory(document=document)
child1, child2 = factories.DocumentFactory.create_batch(2, parent=document)
factories.UserDocumentAccessFactory(document=child1)
grand_child = factories.DocumentFactory(parent=child1)
response = client.get(
f"/api/v1.0/documents/{document.id!s}/descendants/",
)
assert response.status_code == 200
assert response.json() == {
"count": 3,
"next": None,
"previous": None,
"results": [
{
"abilities": child1.get_abilities(user),
"ancestors_link_reach": child1.ancestors_link_reach,
"ancestors_link_role": child1.ancestors_link_role,
"computed_link_reach": child1.computed_link_reach,
"computed_link_role": child1.computed_link_role,
"created_at": child1.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child1.creator.id),
"deleted_at": None,
"depth": 2,
"excerpt": child1.excerpt,
"id": str(child1.id),
"is_favorite": False,
"link_reach": child1.link_reach,
"link_role": child1.link_role,
"numchild": 1,
"nb_accesses_ancestors": 3,
"nb_accesses_direct": 1,
"path": child1.path,
"title": child1.title,
"updated_at": child1.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": access.role,
},
{
"abilities": grand_child.get_abilities(user),
"ancestors_link_reach": grand_child.ancestors_link_reach,
"ancestors_link_role": grand_child.ancestors_link_role,
"computed_link_reach": grand_child.computed_link_reach,
"computed_link_role": grand_child.computed_link_role,
"created_at": grand_child.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(grand_child.creator.id),
"deleted_at": None,
"depth": 3,
"excerpt": grand_child.excerpt,
"id": str(grand_child.id),
"is_favorite": False,
"link_reach": grand_child.link_reach,
"link_role": grand_child.link_role,
"numchild": 0,
"nb_accesses_ancestors": 3,
"nb_accesses_direct": 0,
"path": grand_child.path,
"title": grand_child.title,
"updated_at": grand_child.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": access.role,
},
{
"abilities": child2.get_abilities(user),
"ancestors_link_reach": child2.ancestors_link_reach,
"ancestors_link_role": child2.ancestors_link_role,
"computed_link_reach": child2.computed_link_reach,
"computed_link_role": child2.computed_link_role,
"created_at": child2.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child2.creator.id),
"deleted_at": None,
"depth": 2,
"excerpt": child2.excerpt,
"id": str(child2.id),
"is_favorite": False,
"link_reach": child2.link_reach,
"link_role": child2.link_role,
"numchild": 0,
"nb_accesses_ancestors": 2,
"nb_accesses_direct": 0,
"path": child2.path,
"title": child2.title,
"updated_at": child2.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": access.role,
},
],
}
def test_api_documents_descendants_list_authenticated_related_parent():
"""
Authenticated users should be allowed to retrieve the descendants of a document if they
are related to one of its ancestors whatever the role.
"""
user = factories.UserFactory()
client = APIClient()
client.force_login(user)
grand_parent = factories.DocumentFactory(link_reach="restricted")
grand_parent_access = factories.UserDocumentAccessFactory(
document=grand_parent, user=user
)
parent = factories.DocumentFactory(parent=grand_parent, link_reach="restricted")
document = factories.DocumentFactory(parent=parent, link_reach="restricted")
child1, child2 = factories.DocumentFactory.create_batch(2, parent=document)
factories.UserDocumentAccessFactory(document=child1)
grand_child = factories.DocumentFactory(parent=child1)
response = client.get(
f"/api/v1.0/documents/{document.id!s}/descendants/",
)
assert response.status_code == 200
assert response.json() == {
"count": 3,
"next": None,
"previous": None,
"results": [
{
"abilities": child1.get_abilities(user),
"ancestors_link_reach": child1.ancestors_link_reach,
"ancestors_link_role": child1.ancestors_link_role,
"computed_link_reach": child1.computed_link_reach,
"computed_link_role": child1.computed_link_role,
"created_at": child1.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child1.creator.id),
"deleted_at": None,
"depth": 4,
"excerpt": child1.excerpt,
"id": str(child1.id),
"is_favorite": False,
"link_reach": child1.link_reach,
"link_role": child1.link_role,
"numchild": 1,
"nb_accesses_ancestors": 2,
"nb_accesses_direct": 1,
"path": child1.path,
"title": child1.title,
"updated_at": child1.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": grand_parent_access.role,
},
{
"abilities": grand_child.get_abilities(user),
"ancestors_link_reach": grand_child.ancestors_link_reach,
"ancestors_link_role": grand_child.ancestors_link_role,
"computed_link_reach": grand_child.computed_link_reach,
"computed_link_role": grand_child.computed_link_role,
"created_at": grand_child.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(grand_child.creator.id),
"deleted_at": None,
"depth": 5,
"excerpt": grand_child.excerpt,
"id": str(grand_child.id),
"is_favorite": False,
"link_reach": grand_child.link_reach,
"link_role": grand_child.link_role,
"numchild": 0,
"nb_accesses_ancestors": 2,
"nb_accesses_direct": 0,
"path": grand_child.path,
"title": grand_child.title,
"updated_at": grand_child.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": grand_parent_access.role,
},
{
"abilities": child2.get_abilities(user),
"ancestors_link_reach": child2.ancestors_link_reach,
"ancestors_link_role": child2.ancestors_link_role,
"computed_link_reach": child2.computed_link_reach,
"computed_link_role": child2.computed_link_role,
"created_at": child2.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child2.creator.id),
"deleted_at": None,
"depth": 4,
"excerpt": child2.excerpt,
"id": str(child2.id),
"is_favorite": False,
"link_reach": child2.link_reach,
"link_role": child2.link_role,
"numchild": 0,
"nb_accesses_ancestors": 1,
"nb_accesses_direct": 0,
"path": child2.path,
"title": child2.title,
"updated_at": child2.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": grand_parent_access.role,
},
],
}
def test_api_documents_descendants_list_authenticated_related_child():
"""
Authenticated users should not be allowed to retrieve all the descendants of a document
as a result of being related to one of its children.
"""
user = factories.UserFactory()
client = APIClient()
client.force_login(user)
document = factories.DocumentFactory(link_reach="restricted")
child1, _child2 = factories.DocumentFactory.create_batch(2, parent=document)
_grand_child = factories.DocumentFactory(parent=child1)
factories.UserDocumentAccessFactory(document=child1, user=user)
factories.UserDocumentAccessFactory(document=document)
response = client.get(
f"/api/v1.0/documents/{document.id!s}/descendants/",
)
assert response.status_code == 403
assert response.json() == {
"detail": "You do not have permission to perform this action."
}
def test_api_documents_descendants_list_authenticated_related_team_none(
mock_user_teams,
):
"""
    Authenticated users should not be able to retrieve the descendants of a restricted document
    related to teams to which they do not belong.
"""
mock_user_teams.return_value = []
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
document = factories.DocumentFactory(link_reach="restricted")
factories.DocumentFactory.create_batch(2, parent=document)
factories.TeamDocumentAccessFactory(document=document, team="myteam")
response = client.get(f"/api/v1.0/documents/{document.id!s}/descendants/")
assert response.status_code == 403
assert response.json() == {
"detail": "You do not have permission to perform this action."
}
def test_api_documents_descendants_list_authenticated_related_team_members(
mock_user_teams,
):
"""
Authenticated users should be allowed to retrieve the descendants of a document to which they
are related via a team whatever the role.
"""
mock_user_teams.return_value = ["myteam"]
user = factories.UserFactory()
client = APIClient()
client.force_login(user)
document = factories.DocumentFactory(link_reach="restricted")
child1, child2 = factories.DocumentFactory.create_batch(2, parent=document)
grand_child = factories.DocumentFactory(parent=child1)
access = factories.TeamDocumentAccessFactory(document=document, team="myteam")
response = client.get(f"/api/v1.0/documents/{document.id!s}/descendants/")
# pylint: disable=R0801
assert response.status_code == 200
assert response.json() == {
"count": 3,
"next": None,
"previous": None,
"results": [
{
"abilities": child1.get_abilities(user),
"ancestors_link_reach": child1.ancestors_link_reach,
"ancestors_link_role": child1.ancestors_link_role,
"computed_link_reach": child1.computed_link_reach,
"computed_link_role": child1.computed_link_role,
"created_at": child1.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child1.creator.id),
"deleted_at": None,
"depth": 2,
"excerpt": child1.excerpt,
"id": str(child1.id),
"is_favorite": False,
"link_reach": child1.link_reach,
"link_role": child1.link_role,
"numchild": 1,
"nb_accesses_ancestors": 1,
"nb_accesses_direct": 0,
"path": child1.path,
"title": child1.title,
"updated_at": child1.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": access.role,
},
{
"abilities": grand_child.get_abilities(user),
"ancestors_link_reach": grand_child.ancestors_link_reach,
"ancestors_link_role": grand_child.ancestors_link_role,
"computed_link_reach": grand_child.computed_link_reach,
"computed_link_role": grand_child.computed_link_role,
"created_at": grand_child.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(grand_child.creator.id),
"deleted_at": None,
"depth": 3,
"excerpt": grand_child.excerpt,
"id": str(grand_child.id),
"is_favorite": False,
"link_reach": grand_child.link_reach,
"link_role": grand_child.link_role,
"numchild": 0,
"nb_accesses_ancestors": 1,
"nb_accesses_direct": 0,
"path": grand_child.path,
"title": grand_child.title,
"updated_at": grand_child.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": access.role,
},
{
"abilities": child2.get_abilities(user),
"ancestors_link_reach": child2.ancestors_link_reach,
"ancestors_link_role": child2.ancestors_link_role,
"computed_link_reach": child2.computed_link_reach,
"computed_link_role": child2.computed_link_role,
"created_at": child2.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(child2.creator.id),
"deleted_at": None,
"depth": 2,
"excerpt": child2.excerpt,
"id": str(child2.id),
"is_favorite": False,
"link_reach": child2.link_reach,
"link_role": child2.link_role,
"numchild": 0,
"nb_accesses_ancestors": 1,
"nb_accesses_direct": 0,
"path": child2.path,
"title": child2.title,
"updated_at": child2.updated_at.isoformat().replace("+00:00", "Z"),
"user_role": access.role,
},
],
}
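
Throughout these expected payloads, timestamps are normalized with `.replace("+00:00", "Z")`. Python's `datetime.isoformat()` renders a UTC datetime with a numeric `+00:00` offset, while DRF's default `DateTimeField` rendering emits a trailing `Z` for UTC, so the tests convert one form to the other:

```python
from datetime import datetime, timezone

# What isoformat() produces for an aware UTC datetime...
dt = datetime(2026, 4, 27, 15, 7, 35, tzinfo=timezone.utc)
assert dt.isoformat() == "2026-04-27T15:07:35+00:00"

# ...normalized to match DRF's default UTC rendering.
assert dt.isoformat().replace("+00:00", "Z") == "2026-04-27T15:07:35Z"
```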

View File

@@ -1,95 +0,0 @@
"""
Tests for Documents API endpoint in impress's core app: list
"""
import pytest
from faker import Faker
from rest_framework.test import APIClient
from core import factories
from core.api.filters import remove_accents
fake = Faker()
pytestmark = pytest.mark.django_db
# Filters: unknown field
def test_api_documents_descendants_filter_unknown_field():
"""
    Attempts to filter by an unknown field should be ignored.
"""
user = factories.UserFactory()
client = APIClient()
client.force_login(user)
factories.DocumentFactory()
document = factories.DocumentFactory(users=[user])
expected_ids = {
str(document.id)
for document in factories.DocumentFactory.create_batch(2, parent=document)
}
response = client.get(
f"/api/v1.0/documents/{document.id!s}/descendants/?unknown=true"
)
assert response.status_code == 200
results = response.json()["results"]
assert len(results) == 2
assert {result["id"] for result in results} == expected_ids
# Filters: title
@pytest.mark.parametrize(
"query,nb_results",
[
("Project Alpha", 1), # Exact match
("project", 2), # Partial match (case-insensitive)
("Guide", 2), # Word match within a title
("Special", 0), # No match (nonexistent keyword)
("2024", 2), # Match by numeric keyword
("", 6), # Empty string
("velo", 1), # Accent-insensitive match (velo vs vélo)
("bêta", 1), # Accent-insensitive match (bêta vs beta)
],
)
def test_api_documents_descendants_filter_title(query, nb_results):
"""Authenticated users should be able to search documents by their unaccented title."""
user = factories.UserFactory()
client = APIClient()
client.force_login(user)
document = factories.DocumentFactory(users=[user])
# Create documents with predefined titles
titles = [
"Project Alpha Documentation",
"Project Beta Overview",
"User Guide",
"Financial Report 2024",
"Annual Review 2024",
"Guide du vélo urbain", # <-- Title with accent for accent-insensitive test
]
for title in titles:
factories.DocumentFactory(title=title, parent=document)
# Perform the search query
response = client.get(
f"/api/v1.0/documents/{document.id!s}/descendants/?title={query:s}"
)
assert response.status_code == 200
results = response.json()["results"]
assert len(results) == nb_results
# Ensure all results contain the query in their title
for result in results:
assert (
remove_accents(query).lower().strip()
in remove_accents(result["title"]).lower()
)
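
The accent-insensitive assertions above go through `core.api.filters.remove_accents`. A minimal sketch of such a helper, assuming it relies on Unicode NFKD decomposition to strip diacritics (the project's actual implementation may differ):

```python
import unicodedata


def remove_accents(value: str) -> str:
    """Strip combining diacritics so that 'vélo' matches 'velo'.

    Sketch of what core.api.filters.remove_accents might do: decompose
    each character (NFKD), then drop the combining marks.
    """
    decomposed = unicodedata.normalize("NFKD", value)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))


# Matches the accent-insensitive cases exercised by the test above.
assert remove_accents("vélo") == "velo"
assert remove_accents("bêta") == "beta"
```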

View File

@@ -70,7 +70,6 @@ def test_api_document_favorite_list_authenticated_with_favorite():
"created_at": document.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(document.creator.id),
"deleted_at": None,
"content": document.content,
"depth": document.depth,
"excerpt": document.excerpt,
"id": str(document.id),

View File

@@ -1,5 +1,5 @@
"""
Tests for Documents API endpoint in impress's core app: content
Tests for Documents API endpoint in impress's core app: convert
"""
import base64
@@ -23,12 +23,14 @@ pytestmark = pytest.mark.django_db
],
)
@patch("core.services.converter_services.YdocConverter.convert")
def test_api_documents_content_public(mock_content, reach, role):
def test_api_documents_formatted_content_public(mock_content, reach, role):
"""Anonymous users should be allowed to access content of public documents."""
document = factories.DocumentFactory(link_reach=reach, link_role=role)
mock_content.return_value = {"some": "data"}
response = APIClient().get(f"/api/v1.0/documents/{document.id!s}/content/")
response = APIClient().get(
f"/api/v1.0/documents/{document.id!s}/formatted-content/"
)
assert response.status_code == status.HTTP_200_OK
data = response.json()
@@ -58,7 +60,9 @@ def test_api_documents_content_public(mock_content, reach, role):
],
)
@patch("core.services.converter_services.YdocConverter.convert")
def test_api_documents_content_not_public(mock_content, reach, doc_role, user_role):
def test_api_documents_formatted_content_not_public(
mock_content, reach, doc_role, user_role
):
"""Authenticated users need access to get non-public document content."""
user = factories.UserFactory()
document = factories.DocumentFactory(link_reach=reach, link_role=doc_role)
@@ -66,14 +70,14 @@ def test_api_documents_content_not_public(mock_content, reach, doc_role, user_ro
# First anonymous request should fail
client = APIClient()
response = client.get(f"/api/v1.0/documents/{document.id!s}/content/")
response = client.get(f"/api/v1.0/documents/{document.id!s}/formatted-content/")
assert response.status_code == status.HTTP_401_UNAUTHORIZED
mock_content.assert_not_called()
# Login and try again
client.force_login(user)
response = client.get(f"/api/v1.0/documents/{document.id!s}/content/")
response = client.get(f"/api/v1.0/documents/{document.id!s}/formatted-content/")
# If restricted, we still should not have access
if user_role is not None:
@@ -85,7 +89,7 @@ def test_api_documents_content_not_public(mock_content, reach, doc_role, user_ro
document=document, user=user, role=user_role
)
response = client.get(f"/api/v1.0/documents/{document.id!s}/content/")
response = client.get(f"/api/v1.0/documents/{document.id!s}/formatted-content/")
assert response.status_code == status.HTTP_200_OK
data = response.json()
@@ -108,13 +112,13 @@ def test_api_documents_content_not_public(mock_content, reach, doc_role, user_ro
],
)
@patch("core.services.converter_services.YdocConverter.convert")
def test_api_documents_content_format(mock_content, content_format, accept):
"""Test that the content endpoint returns a specific format."""
def test_api_documents_formatted_content_format(mock_content, content_format, accept):
"""Test that the convert endpoint returns a specific format."""
document = factories.DocumentFactory(link_reach="public")
mock_content.return_value = {"some": "data"}
response = APIClient().get(
f"/api/v1.0/documents/{document.id!s}/content/?content_format={content_format}"
f"/api/v1.0/documents/{document.id!s}/formatted-content/?content_format={content_format}"
)
assert response.status_code == status.HTTP_200_OK
@@ -128,45 +132,49 @@ def test_api_documents_content_format(mock_content, content_format, accept):
@patch("core.services.converter_services.YdocConverter._request")
def test_api_documents_content_invalid_format(mock_request):
"""Test that the content endpoint rejects invalid formats."""
def test_api_documents_formatted_content_invalid_format(mock_request):
"""Test that the convert endpoint rejects invalid formats."""
document = factories.DocumentFactory(link_reach="public")
response = APIClient().get(
f"/api/v1.0/documents/{document.id!s}/content/?content_format=invalid"
f"/api/v1.0/documents/{document.id!s}/formatted-content/?content_format=invalid"
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
mock_request.assert_not_called()
@patch("core.services.converter_services.YdocConverter._request")
def test_api_documents_content_yservice_error(mock_request):
def test_api_documents_formatted_content_yservice_error(mock_request):
"""Test that service errors are handled properly."""
document = factories.DocumentFactory(link_reach="public")
mock_request.side_effect = requests.RequestException()
response = APIClient().get(f"/api/v1.0/documents/{document.id!s}/content/")
response = APIClient().get(
f"/api/v1.0/documents/{document.id!s}/formatted-content/"
)
mock_request.assert_called_once()
assert response.status_code == status.HTTP_500_INTERNAL_SERVER_ERROR
@patch("core.services.converter_services.YdocConverter._request")
def test_api_documents_content_nonexistent_document(mock_request):
def test_api_documents_formatted_content_nonexistent_document(mock_request):
"""Test that accessing a nonexistent document returns 404."""
client = APIClient()
response = client.get(
"/api/v1.0/documents/00000000-0000-0000-0000-000000000000/content/"
"/api/v1.0/documents/00000000-0000-0000-0000-000000000000/formatted-content/"
)
assert response.status_code == status.HTTP_404_NOT_FOUND
mock_request.assert_not_called()
@patch("core.services.converter_services.YdocConverter._request")
def test_api_documents_content_empty_document(mock_request):
def test_api_documents_formatted_content_empty_document(mock_request):
"""Test that accessing an empty document returns empty content."""
document = factories.DocumentFactory(link_reach="public", content="")
response = APIClient().get(f"/api/v1.0/documents/{document.id!s}/content/")
response = APIClient().get(
f"/api/v1.0/documents/{document.id!s}/formatted-content/"
)
assert response.status_code == status.HTTP_200_OK
data = response.json()

View File

@@ -39,7 +39,7 @@ def test_api_documents_retrieve_anonymous_public_standalone():
"collaboration_auth": True,
"comment": document.link_role in ["commenter", "editor"],
"cors_proxy": True,
"content": True,
"formatted_content": True,
"descendants": True,
"destroy": False,
"duplicate": False,
@@ -53,6 +53,8 @@ def test_api_documents_retrieve_anonymous_public_standalone():
"restricted": None,
},
"mask": False,
"content_patch": document.link_role == "editor",
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": False,
@@ -70,7 +72,6 @@ def test_api_documents_retrieve_anonymous_public_standalone():
"ancestors_link_role": None,
"computed_link_reach": document.computed_link_reach,
"computed_link_role": document.computed_link_role,
"content": document.content,
"created_at": document.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(document.creator.id),
"deleted_at": None,
@@ -120,7 +121,7 @@ def test_api_documents_retrieve_anonymous_public_parent():
"comment": grand_parent.link_role in ["commenter", "editor"],
"descendants": True,
"cors_proxy": True,
"content": True,
"formatted_content": True,
"destroy": False,
"duplicate": False,
# Anonymous user can't favorite a document even with read access
@@ -131,6 +132,8 @@ def test_api_documents_retrieve_anonymous_public_parent():
**links_definition
),
"mask": False,
"content_patch": grand_parent.link_role == "editor",
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": False,
@@ -148,7 +151,6 @@ def test_api_documents_retrieve_anonymous_public_parent():
"ancestors_link_role": grand_parent.link_role,
"computed_link_reach": "public",
"computed_link_role": grand_parent.link_role,
"content": document.content,
"created_at": document.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(document.creator.id),
"deleted_at": None,
@@ -230,7 +232,7 @@ def test_api_documents_retrieve_authenticated_unrelated_public_or_authenticated(
"comment": document.link_role in ["commenter", "editor"],
"descendants": True,
"cors_proxy": True,
"content": True,
"formatted_content": True,
"destroy": False,
"duplicate": True,
"favorite": True,
@@ -242,6 +244,8 @@ def test_api_documents_retrieve_authenticated_unrelated_public_or_authenticated(
"restricted": None,
},
"mask": True,
"content_patch": document.link_role == "editor",
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": False,
@@ -259,7 +263,6 @@ def test_api_documents_retrieve_authenticated_unrelated_public_or_authenticated(
"ancestors_link_role": None,
"computed_link_reach": document.computed_link_reach,
"computed_link_role": document.computed_link_role,
"content": document.content,
"created_at": document.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(document.creator.id),
"depth": 1,
@@ -317,7 +320,7 @@ def test_api_documents_retrieve_authenticated_public_or_authenticated_parent(rea
"comment": grand_parent.link_role in ["commenter", "editor"],
"descendants": True,
"cors_proxy": True,
"content": True,
"formatted_content": True,
"destroy": False,
"duplicate": True,
"favorite": True,
@@ -328,6 +331,8 @@ def test_api_documents_retrieve_authenticated_public_or_authenticated_parent(rea
),
"mask": True,
"move": False,
"content_patch": grand_parent.link_role == "editor",
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"partial_update": grand_parent.link_role == "editor",
@@ -344,7 +349,6 @@ def test_api_documents_retrieve_authenticated_public_or_authenticated_parent(rea
"ancestors_link_role": grand_parent.link_role,
"computed_link_reach": document.computed_link_reach,
"computed_link_role": document.computed_link_role,
"content": document.content,
"created_at": document.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(document.creator.id),
"depth": 3,
@@ -459,7 +463,6 @@ def test_api_documents_retrieve_authenticated_related_direct():
"ancestors_link_role": None,
"computed_link_reach": document.computed_link_reach,
"computed_link_role": document.computed_link_role,
"content": document.content,
"creator": str(document.creator.id),
"created_at": document.created_at.isoformat().replace("+00:00", "Z"),
"deleted_at": None,
@@ -517,7 +520,7 @@ def test_api_documents_retrieve_authenticated_related_parent():
"comment": access.role != "reader",
"descendants": True,
"cors_proxy": True,
"content": True,
"formatted_content": True,
"destroy": access.role in ["administrator", "owner"],
"duplicate": True,
"favorite": True,
@@ -527,6 +530,8 @@ def test_api_documents_retrieve_authenticated_related_parent():
**link_definition
),
"mask": True,
"content_patch": access.role not in ["reader", "commenter"],
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": access.role in ["administrator", "owner"],
@@ -544,7 +549,6 @@ def test_api_documents_retrieve_authenticated_related_parent():
"ancestors_link_role": None,
"computed_link_reach": "restricted",
"computed_link_role": None,
"content": document.content,
"creator": str(document.creator.id),
"created_at": document.created_at.isoformat().replace("+00:00", "Z"),
"depth": 3,
@@ -701,7 +705,6 @@ def test_api_documents_retrieve_authenticated_related_team_members(
"ancestors_link_role": None,
"computed_link_reach": document.computed_link_reach,
"computed_link_role": document.computed_link_role,
"content": document.content,
"created_at": document.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(document.creator.id),
"deleted_at": None,
@@ -768,7 +771,6 @@ def test_api_documents_retrieve_authenticated_related_team_administrators(
"ancestors_link_role": None,
"computed_link_reach": document.computed_link_reach,
"computed_link_role": document.computed_link_role,
"content": document.content,
"created_at": document.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(document.creator.id),
"deleted_at": None,
@@ -835,7 +837,6 @@ def test_api_documents_retrieve_authenticated_related_team_owners(
"ancestors_link_role": None,
"computed_link_reach": document.computed_link_reach,
"computed_link_role": document.computed_link_role,
"content": document.content,
"created_at": document.created_at.isoformat().replace("+00:00", "Z"),
"creator": str(document.creator.id),
"deleted_at": None,
@@ -1067,48 +1068,3 @@ def test_api_documents_retrieve_permanently_deleted_related(role, depth):
assert response.status_code == 404
assert response.json() == {"detail": "Not found."}
def test_api_documents_retrieve_without_content():
"""
    Retrieving a document with the without_content query string should omit the content
    from the response
"""
user = factories.UserFactory()
document = factories.DocumentFactory(creator=user, users=[(user, "owner")])
client = APIClient()
client.force_login(user)
with mock.patch("core.models.Document.content") as mock_document_content:
response = client.get(
f"/api/v1.0/documents/{document.id!s}/?without_content=true"
)
assert response.status_code == 200
payload = response.json()
assert "content" not in payload
mock_document_content.assert_not_called()
def test_api_documents_retrieve_without_content_invalid_value():
"""
Test that retrieving with an invalid value for the without_content query string
returns a 400.
"""
user = factories.UserFactory()
document = factories.DocumentFactory(creator=user, users=[(user, "owner")])
client = APIClient()
client.force_login(user)
response = client.get(
f"/api/v1.0/documents/{document.id!s}/?without_content=invalid-value"
)
assert response.status_code == 400
assert response.json() == ["Must be a valid boolean."]
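The 400 body asserted above matches DRF's default boolean-field error. A minimal standalone sketch of the parsing involved (an assumption about the view's implementation — the accepted truthy/falsy strings mirror DRF's `BooleanField`, not code from this changeset):

```python
# Values DRF's BooleanField treats as true/false (lowercased).
TRUE_VALUES = {"t", "true", "on", "yes", "y", "1"}
FALSE_VALUES = {"f", "false", "off", "no", "n", "0"}


def parse_without_content(raw_value: str):
    """Return (ok, value) for ok input, or (False, errors) mirroring the test."""
    value = raw_value.strip().lower()
    if value in TRUE_VALUES:
        return True, True
    if value in FALSE_VALUES:
        return True, False
    # Mirrors the response body asserted in the test above.
    return False, ["Must be a valid boolean."]
```

A view using this would return the error list with status 400, as the test expects.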


@@ -83,7 +83,7 @@ def test_api_documents_trashbin_format():
"descendants": False,
"cors_proxy": False,
"comment": False,
"content": False,
"formatted_content": False,
"destroy": False,
"duplicate": False,
"favorite": False,
@@ -95,6 +95,8 @@ def test_api_documents_trashbin_format():
"restricted": None,
},
"mask": False,
"content_patch": False,
"content_retrieve": True,
"media_auth": False,
"media_check": False,
"move": False, # Can't move a deleted document


@@ -19,25 +19,6 @@ from core.tests.conftest import TEAM, USER, VIA
pytestmark = pytest.mark.django_db
# A valid Yjs document derived from YDOC_HELLO_WORLD_BASE64 with "Hello" replaced by "World",
# used in PATCH tests to guarantee a real content change distinct from what DocumentFactory
# produces.
YDOC_UPDATED_CONTENT_BASE64 = (
"AR717vLVDgAHAQ5kb2N1bWVudC1zdG9yZQMKYmxvY2tHcm91cAcA9e7y1Q4AAw5ibG9ja0NvbnRh"
"aW5lcgcA9e7y1Q4BAwdoZWFkaW5nBwD17vLVDgIGBgD17vLVDgMGaXRhbGljAnt9hPXu8tUOBAVX"
"b3JsZIb17vLVDgkGaXRhbGljBG51bGwoAPXu8tUOAg10ZXh0QWxpZ25tZW50AXcEbGVmdCgA9e7y"
"1Q4CBWxldmVsAX0BKAD17vLVDgECaWQBdyQwNGQ2MjM0MS04MzI2LTQyMzYtYTA4My00ODdlMjZm"
"YWQyMzAoAPXu8tUOAQl0ZXh0Q29sb3IBdwdkZWZhdWx0KAD17vLVDgEPYmFja2dyb3VuZENvbG9y"
"AXcHZGVmYXVsdIf17vLVDgEDDmJsb2NrQ29udGFpbmVyBwD17vLVDhADDmJ1bGxldExpc3RJdGVt"
"BwD17vLVDhEGBAD17vLVDhIBd4b17vLVDhMEYm9sZAJ7fYT17vLVDhQCb3KG9e7y1Q4WBGJvbGQE"
"bnVsbIT17vLVDhcCbGQoAPXu8tUOEQ10ZXh0QWxpZ25tZW50AXcEbGVmdCgA9e7y1Q4QAmlkAXck"
"ZDM1MWUwNjgtM2U1NS00MjI2LThlYTUtYWJiMjYzMTk4ZTJhKAD17vLVDhAJdGV4dENvbG9yAXcH"
"ZGVmYXVsdCgA9e7y1Q4QD2JhY2tncm91bmRDb2xvcgF3B2RlZmF1bHSH9e7y1Q4QAw5ibG9ja0Nv"
"bnRhaW5lcgcA9e7y1Q4eAwlwYXJhZ3JhcGgoAPXu8tUOHw10ZXh0QWxpZ25tZW50AXcEbGVmdCgA"
"9e7y1Q4eAmlkAXckODk3MDBjMDctZTBlMS00ZmUwLWFjYTItODQ5MzIwOWE3ZTQyKAD17vLVDh4J"
"dGV4dENvbG9yAXcHZGVmYXVsdCgA9e7y1Q4eD2JhY2tncm91bmRDb2xvcgF3B2RlZmF1bHQA"
)
@pytest.mark.parametrize("via_parent", [True, False])
@pytest.mark.parametrize(
@@ -736,25 +717,6 @@ def test_api_documents_update_administrator_or_owner_of_another(via, mock_user_t
assert other_document_values == old_document_values
def test_api_documents_update_invalid_content():
"""
Updating a document with non-base64-encoded content should raise a validation error.
"""
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
document = factories.DocumentFactory(users=[[user, "owner"]])
response = client.put(
f"/api/v1.0/documents/{document.id!s}/",
{"content": "invalid content"},
format="json",
)
assert response.status_code == 400
assert response.json() == {"content": ["Invalid base64 content."]}
# =============================================================================
# PATCH tests
# =============================================================================
@@ -784,11 +746,10 @@ def test_api_documents_patch_anonymous_forbidden(reach, role, via_parent):
document = factories.DocumentFactory(link_reach=reach, link_role=role)
old_document_values = serializers.DocumentSerializer(instance=document).data
new_content = YDOC_UPDATED_CONTENT_BASE64
response = APIClient().patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content},
{"title": "new title"},
format="json",
)
assert response.status_code == 401
@@ -828,11 +789,10 @@ def test_api_documents_patch_authenticated_unrelated_forbidden(reach, role, via_
document = factories.DocumentFactory(link_reach=reach, link_role=role)
old_document_values = serializers.DocumentSerializer(instance=document).data
new_content = YDOC_UPDATED_CONTENT_BASE64
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content},
{"title": "new title"},
format="json",
)
@@ -876,11 +836,10 @@ def test_api_documents_patch_anonymous_or_authenticated_unrelated(
old_document_values = serializers.DocumentSerializer(instance=document).data
old_path = document.path
new_content = YDOC_UPDATED_CONTENT_BASE64
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content, "websocket": True},
{"title": "new title", "websocket": True},
format="json",
)
assert response.status_code == 200
@@ -889,11 +848,10 @@ def test_api_documents_patch_anonymous_or_authenticated_unrelated(
# Force reloading it by fetching the document in the database.
document = models.Document.objects.get(id=document.id)
assert document.path == old_path
assert document.content == new_content
assert document.title == "new title"
document_values = serializers.DocumentSerializer(instance=document).data
for key in [
"id",
"title",
"link_reach",
"link_role",
"creator",
@@ -933,11 +891,10 @@ def test_api_documents_patch_authenticated_reader(via, via_parent, mock_user_tea
)
old_document_values = serializers.DocumentSerializer(instance=document).data
new_content = YDOC_UPDATED_CONTENT_BASE64
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content},
{"title": "new title"},
format="json",
)
@@ -983,11 +940,10 @@ def test_api_documents_patch_authenticated_editor_administrator_or_owner(
old_document_values = serializers.DocumentSerializer(instance=document).data
old_path = document.path
new_content = YDOC_UPDATED_CONTENT_BASE64
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content, "websocket": True},
{"title": "new title", "websocket": True},
format="json",
)
assert response.status_code == 200
@@ -996,11 +952,10 @@ def test_api_documents_patch_authenticated_editor_administrator_or_owner(
# Force reloading it by fetching the document in the database.
document = models.Document.objects.get(id=document.id)
assert document.path == old_path
assert document.content == new_content
assert document.title == "new title"
document_values = serializers.DocumentSerializer(instance=document).data
for key in [
"id",
"title",
"link_reach",
"link_role",
"creator",
@@ -1025,7 +980,6 @@ def test_api_documents_patch_authenticated_no_websocket(settings):
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
new_content = YDOC_UPDATED_CONTENT_BASE64
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
@@ -1041,7 +995,7 @@ def test_api_documents_patch_authenticated_no_websocket(settings):
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content},
{"title": "new title"},
format="json",
)
assert response.status_code == 200
@@ -1050,7 +1004,7 @@ def test_api_documents_patch_authenticated_no_websocket(settings):
# Force reloading it by fetching the document from the database.
document = models.Document.objects.get(id=document.id)
assert document.path == old_path
assert document.content == new_content
assert document.title == "new title"
assert cache.get(f"docs:no-websocket:{document.id}") == session_key
assert ws_resp.call_count == 1
@@ -1067,7 +1021,6 @@ def test_api_documents_patch_authenticated_no_websocket_user_already_editing(set
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
new_content = YDOC_UPDATED_CONTENT_BASE64
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
@@ -1082,7 +1035,7 @@ def test_api_documents_patch_authenticated_no_websocket_user_already_editing(set
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content},
{"title": "new title"},
format="json",
)
assert response.status_code == 403
@@ -1103,7 +1056,6 @@ def test_api_documents_patch_no_websocket_other_user_connected_to_websocket(sett
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
new_content = YDOC_UPDATED_CONTENT_BASE64
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
@@ -1118,7 +1070,7 @@ def test_api_documents_patch_no_websocket_other_user_connected_to_websocket(sett
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content},
{"title": "new title"},
format="json",
)
assert response.status_code == 403
@@ -1139,7 +1091,6 @@ def test_api_documents_patch_user_connected_to_websocket(settings):
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
new_content = YDOC_UPDATED_CONTENT_BASE64
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
@@ -1155,7 +1106,7 @@ def test_api_documents_patch_user_connected_to_websocket(settings):
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content},
{"title": "new title"},
format="json",
)
assert response.status_code == 200
@@ -1164,7 +1115,7 @@ def test_api_documents_patch_user_connected_to_websocket(settings):
# Force reloading it by fetching the document in the database.
document = models.Document.objects.get(id=document.id)
assert document.path == old_path
assert document.content == new_content
assert document.title == "new title"
assert cache.get(f"docs:no-websocket:{document.id}") is None
assert ws_resp.call_count == 1
@@ -1183,7 +1134,6 @@ def test_api_documents_patch_websocket_server_unreachable_fallback_to_no_websock
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
new_content = YDOC_UPDATED_CONTENT_BASE64
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
@@ -1199,7 +1149,7 @@ def test_api_documents_patch_websocket_server_unreachable_fallback_to_no_websock
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content},
{"title": "new title"},
format="json",
)
assert response.status_code == 200
@@ -1208,7 +1158,7 @@ def test_api_documents_patch_websocket_server_unreachable_fallback_to_no_websock
# Force reloading it by fetching the document from the database.
document = models.Document.objects.get(id=document.id)
assert document.path == old_path
assert document.content == new_content
assert document.title == "new title"
assert cache.get(f"docs:no-websocket:{document.id}") == session_key
assert ws_resp.call_count == 1
@@ -1227,7 +1177,6 @@ def test_api_documents_patch_websocket_server_unreachable_fallback_to_no_websock
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
new_content = YDOC_UPDATED_CONTENT_BASE64
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
@@ -1242,7 +1191,7 @@ def test_api_documents_patch_websocket_server_unreachable_fallback_to_no_websock
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content},
{"title": "new title"},
format="json",
)
assert response.status_code == 403
@@ -1265,7 +1214,6 @@ def test_api_documents_patch_websocket_server_room_not_found_fallback_to_no_webs
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
new_content = YDOC_UPDATED_CONTENT_BASE64
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
@@ -1280,7 +1228,7 @@ def test_api_documents_patch_websocket_server_room_not_found_fallback_to_no_webs
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content},
{"title": "new title"},
format="json",
)
assert response.status_code == 403
@@ -1300,7 +1248,6 @@ def test_api_documents_patch_force_websocket_param_to_true(settings):
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
new_content = YDOC_UPDATED_CONTENT_BASE64
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
@@ -1315,7 +1262,7 @@ def test_api_documents_patch_force_websocket_param_to_true(settings):
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content, "websocket": True},
{"title": "new title", "websocket": True},
format="json",
)
assert response.status_code == 200
@@ -1324,7 +1271,7 @@ def test_api_documents_patch_force_websocket_param_to_true(settings):
# Force reloading it by fetching the document from the database.
document = models.Document.objects.get(id=document.id)
assert document.path == old_path
assert document.content == new_content
assert document.title == "new title"
assert cache.get(f"docs:no-websocket:{document.id}") is None
assert ws_resp.call_count == 0
@@ -1340,7 +1287,6 @@ def test_api_documents_patch_feature_flag_disabled(settings):
session_key = client.session.session_key
document = factories.DocumentFactory(users=[(user, "editor")])
new_content = YDOC_UPDATED_CONTENT_BASE64
settings.COLLABORATION_API_URL = "http://example.com/"
settings.COLLABORATION_SERVER_SECRET = "secret-token"
@@ -1356,7 +1302,7 @@ def test_api_documents_patch_feature_flag_disabled(settings):
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": new_content},
{"title": "new title"},
format="json",
)
assert response.status_code == 200
@@ -1365,7 +1311,7 @@ def test_api_documents_patch_feature_flag_disabled(settings):
# Force reloading it by fetching the document from the database.
document = models.Document.objects.get(id=document.id)
assert document.path == old_path
assert document.content == new_content
assert document.title == "new title"
assert cache.get(f"docs:no-websocket:{document.id}") is None
assert ws_resp.call_count == 0
@@ -1396,11 +1342,10 @@ def test_api_documents_patch_administrator_or_owner_of_another(via, mock_user_te
other_document = factories.DocumentFactory(title="Old title", link_role="reader")
old_document_values = serializers.DocumentSerializer(instance=other_document).data
new_content = YDOC_UPDATED_CONTENT_BASE64
response = client.patch(
f"/api/v1.0/documents/{other_document.id!s}/",
{"content": new_content},
{"title": "new title"},
format="json",
)
@@ -1413,25 +1358,6 @@ def test_api_documents_patch_administrator_or_owner_of_another(via, mock_user_te
)
def test_api_documents_patch_invalid_content():
"""
Patching a document with non-base64-encoded content should raise a validation error.
"""
user = factories.UserFactory(with_owned_document=True)
client = APIClient()
client.force_login(user)
document = factories.DocumentFactory(users=[[user, "owner"]])
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/",
{"content": "invalid content"},
format="json",
)
assert response.status_code == 400
assert response.json() == {"content": ["Invalid base64 content."]}
@responses.activate
def test_api_documents_patch_empty_body(settings):
"""


@@ -14,7 +14,7 @@ from core import factories
pytestmark = pytest.mark.django_db
def get_ydoc_with_mages(image_keys):
def get_ydoc_with_images(image_keys):
"""Return a ydoc from text for testing purposes."""
ydoc = pycrdt.Doc()
fragment = pycrdt.XmlFragment(
@@ -36,7 +36,7 @@ def test_api_documents_update_new_attachment_keys_anonymous(django_assert_num_qu
"""
image_keys = [f"{uuid4()!s}/attachments/{uuid4()!s}.png" for _ in range(4)]
document = factories.DocumentFactory(
content=get_ydoc_with_mages(image_keys[:1]),
content=get_ydoc_with_images(image_keys[:1]),
attachments=[image_keys[0]],
link_reach="public",
link_role="editor",
@@ -47,13 +47,13 @@ def test_api_documents_update_new_attachment_keys_anonymous(django_assert_num_qu
factories.DocumentFactory(attachments=[image_keys[3]], link_reach="restricted")
expected_keys = {image_keys[i] for i in [0, 1]}
with django_assert_num_queries(11):
response = APIClient().put(
f"/api/v1.0/documents/{document.id!s}/",
{"content": get_ydoc_with_mages(image_keys), "websocket": True},
with django_assert_num_queries(9):
response = APIClient().patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_ydoc_with_images(image_keys)},
format="json",
)
assert response.status_code == 200
assert response.status_code == 204
document.refresh_from_db()
assert set(document.attachments) == expected_keys
@@ -61,12 +61,12 @@ def test_api_documents_update_new_attachment_keys_anonymous(django_assert_num_qu
# Check that the db query to check attachments readability for extracted
# keys is not done if the content changes but no new keys are found
with django_assert_num_queries(7):
response = APIClient().put(
f"/api/v1.0/documents/{document.id!s}/",
{"content": get_ydoc_with_mages(image_keys[:2]), "websocket": True},
response = APIClient().patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_ydoc_with_images(image_keys[:2]), "websocket": True},
format="json",
)
assert response.status_code == 200
assert response.status_code == 204
document.refresh_from_db()
assert len(document.attachments) == 2
@@ -87,7 +87,7 @@ def test_api_documents_update_new_attachment_keys_authenticated(
image_keys = [f"{uuid4()!s}/attachments/{uuid4()!s}.png" for _ in range(5)]
document = factories.DocumentFactory(
content=get_ydoc_with_mages(image_keys[:1]),
content=get_ydoc_with_images(image_keys[:1]),
attachments=[image_keys[0]],
users=[(user, "editor")],
)
@@ -98,13 +98,13 @@ def test_api_documents_update_new_attachment_keys_authenticated(
factories.DocumentFactory(attachments=[image_keys[4]], users=[user])
expected_keys = {image_keys[i] for i in [0, 1, 2, 4]}
with django_assert_num_queries(12):
response = client.put(
f"/api/v1.0/documents/{document.id!s}/",
{"content": get_ydoc_with_mages(image_keys)},
with django_assert_num_queries(10):
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_ydoc_with_images(image_keys)},
format="json",
)
assert response.status_code == 200
assert response.status_code == 204
document.refresh_from_db()
assert set(document.attachments) == expected_keys
@@ -112,12 +112,12 @@ def test_api_documents_update_new_attachment_keys_authenticated(
# Check that the db query to check attachments readability for extracted
# keys is not done if the content changes but no new keys are found
with django_assert_num_queries(8):
response = client.put(
f"/api/v1.0/documents/{document.id!s}/",
{"content": get_ydoc_with_mages(image_keys[:2])},
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_ydoc_with_images(image_keys[:2])},
format="json",
)
assert response.status_code == 200
assert response.status_code == 204
document.refresh_from_db()
assert len(document.attachments) == 4
@@ -135,19 +135,19 @@ def test_api_documents_update_new_attachment_keys_duplicate():
image_key1 = f"{uuid4()!s}/attachments/{uuid4()!s}.png"
image_key2 = f"{uuid4()!s}/attachments/{uuid4()!s}.png"
document = factories.DocumentFactory(
content=get_ydoc_with_mages([image_key1]),
content=get_ydoc_with_images([image_key1]),
attachments=[image_key1],
users=[(user, "editor")],
)
factories.DocumentFactory(attachments=[image_key2], users=[user])
response = client.put(
f"/api/v1.0/documents/{document.id!s}/",
{"content": get_ydoc_with_mages([image_key1, image_key2, image_key2])},
response = client.patch(
f"/api/v1.0/documents/{document.id!s}/content/",
{"content": get_ydoc_with_images([image_key1, image_key2, image_key2])},
format="json",
)
assert response.status_code == 200
assert response.status_code == 204
document.refresh_from_db()
assert len(document.attachments) == 2


@@ -165,13 +165,15 @@ def test_models_documents_get_abilities_forbidden(
"collaboration_auth": False,
"descendants": False,
"cors_proxy": False,
"content": False,
"formatted_content": False,
"destroy": False,
"duplicate": False,
"favorite": False,
"comment": False,
"invite_owner": False,
"mask": False,
"content_patch": False,
"content_retrieve": False,
"media_auth": False,
"media_check": False,
"move": False,
@@ -233,7 +235,7 @@ def test_models_documents_get_abilities_reader(
"comment": False,
"descendants": True,
"cors_proxy": True,
"content": True,
"formatted_content": True,
"destroy": False,
"duplicate": is_authenticated,
"favorite": is_authenticated,
@@ -245,6 +247,8 @@ def test_models_documents_get_abilities_reader(
"restricted": None,
},
"mask": is_authenticated,
"content_patch": False,
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": False,
@@ -303,7 +307,7 @@ def test_models_documents_get_abilities_commenter(
"children_list": True,
"collaboration_auth": True,
"comment": True,
"content": True,
"formatted_content": True,
"descendants": True,
"cors_proxy": True,
"destroy": False,
@@ -317,6 +321,8 @@ def test_models_documents_get_abilities_commenter(
"restricted": None,
},
"mask": is_authenticated,
"content_patch": False,
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": False,
@@ -374,7 +380,7 @@ def test_models_documents_get_abilities_editor(
"comment": True,
"descendants": True,
"cors_proxy": True,
"content": True,
"formatted_content": True,
"destroy": False,
"duplicate": is_authenticated,
"favorite": is_authenticated,
@@ -386,6 +392,8 @@ def test_models_documents_get_abilities_editor(
"restricted": None,
},
"mask": is_authenticated,
"content_patch": True,
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": False,
@@ -432,7 +440,7 @@ def test_models_documents_get_abilities_owner(django_assert_num_queries):
"comment": True,
"descendants": True,
"cors_proxy": True,
"content": True,
"formatted_content": True,
"destroy": True,
"duplicate": True,
"favorite": True,
@@ -444,6 +452,8 @@ def test_models_documents_get_abilities_owner(django_assert_num_queries):
"restricted": None,
},
"mask": True,
"content_patch": True,
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": True,
@@ -476,7 +486,7 @@ def test_models_documents_get_abilities_owner(django_assert_num_queries):
"comment": False,
"descendants": False,
"cors_proxy": False,
"content": False,
"formatted_content": False,
"destroy": False,
"duplicate": False,
"favorite": False,
@@ -488,6 +498,8 @@ def test_models_documents_get_abilities_owner(django_assert_num_queries):
"restricted": None,
},
"mask": False,
"content_patch": False,
"content_retrieve": True,
"media_auth": False,
"media_check": False,
"move": False,
@@ -524,7 +536,7 @@ def test_models_documents_get_abilities_administrator(django_assert_num_queries)
"comment": True,
"descendants": True,
"cors_proxy": True,
"content": True,
"formatted_content": True,
"destroy": False,
"duplicate": True,
"favorite": True,
@@ -536,6 +548,8 @@ def test_models_documents_get_abilities_administrator(django_assert_num_queries)
"restricted": None,
},
"mask": True,
"content_patch": True,
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": True,
@@ -582,7 +596,7 @@ def test_models_documents_get_abilities_editor_user(django_assert_num_queries):
"comment": True,
"descendants": True,
"cors_proxy": True,
"content": True,
"formatted_content": True,
"destroy": False,
"duplicate": True,
"favorite": True,
@@ -594,6 +608,8 @@ def test_models_documents_get_abilities_editor_user(django_assert_num_queries):
"restricted": None,
},
"mask": True,
"content_patch": True,
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": False,
@@ -648,7 +664,7 @@ def test_models_documents_get_abilities_reader_user(
and document.link_role in ["commenter", "editor"],
"descendants": True,
"cors_proxy": True,
"content": True,
"formatted_content": True,
"destroy": False,
"duplicate": True,
"favorite": True,
@@ -660,6 +676,8 @@ def test_models_documents_get_abilities_reader_user(
"restricted": None,
},
"mask": True,
"content_patch": access_from_link,
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": False,
@@ -713,7 +731,7 @@ def test_models_documents_get_abilities_commenter_user(
"children_list": True,
"collaboration_auth": True,
"comment": True,
"content": True,
"formatted_content": True,
"descendants": True,
"cors_proxy": True,
"destroy": False,
@@ -727,6 +745,8 @@ def test_models_documents_get_abilities_commenter_user(
"restricted": None,
},
"mask": True,
"content_patch": access_from_link,
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": False,
@@ -778,7 +798,7 @@ def test_models_documents_get_abilities_preset_role(django_assert_num_queries):
"comment": False,
"descendants": True,
"cors_proxy": True,
"content": True,
"formatted_content": True,
"destroy": False,
"duplicate": True,
"favorite": True,
@@ -790,6 +810,8 @@ def test_models_documents_get_abilities_preset_role(django_assert_num_queries):
"restricted": None,
},
"mask": True,
"content_patch": False,
"content_retrieve": True,
"media_auth": True,
"media_check": True,
"move": False,
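The pattern these ability fixtures encode is that the old single `content` ability splits in two: anyone who can see the document gets `content_retrieve`, while `content_patch` is reserved for edit-capable roles. An illustrative sketch (role names assumed from the fixtures, not the project's actual `get_abilities` code):

```python
# Roles that may write content, per the editor/administrator/owner fixtures.
EDIT_ROLES = {"editor", "administrator", "owner"}


def content_abilities(role, can_view):
    """Derive the two content abilities from a role and view permission."""
    return {
        "content_retrieve": bool(can_view),
        "content_patch": bool(can_view) and role in EDIT_ROLES,
    }
```

Note the trashbin fixture above keeps `content_retrieve` True even when everything else is False, so the real implementation has more cases than this sketch.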


@@ -110,8 +110,11 @@ def test_docspec_convert_success(mock_post, settings):
# Verify the request was made correctly
mock_post.assert_called_once_with(
"http://docspec.test/convert",
headers={"Accept": mime_types.BLOCKNOTE},
files={"file": ("document.docx", docx_data, mime_types.DOCX)},
headers={
"Content-Type": mime_types.DOCX,
"Accept": mime_types.BLOCKNOTE,
},
data=docx_data,
timeout=5,
verify=False,
)


@@ -18,6 +18,7 @@ from django.utils.translation import gettext_lazy as _
import sentry_sdk
from configurations import Configuration, values
from corsheaders.defaults import default_headers
from csp.constants import NONE
from lasuite.configuration.values import SecretFileValue
from sentry_sdk.integrations.django import DjangoIntegration
@@ -1048,6 +1049,10 @@ class Base(Configuration):
),
}
CONTENT_METADATA_CACHE_TIMEOUT = values.IntegerValue(
60 * 60 * 24, environ_name="CONTENT_METADATA_CACHE_TIMEOUT", environ_prefix=None
)
# pylint: disable=invalid-name
@property
def ENVIRONMENT(self):
@@ -1170,6 +1175,12 @@ class Development(Base):
ALLOWED_HOSTS = ["*"]
CORS_ALLOW_ALL_ORIGINS = True
CSRF_TRUSTED_ORIGINS = ["http://localhost:8072", "http://localhost:3000"]
CORS_ALLOW_HEADERS = (
*default_headers,
"if-none-match",
"if-modified-since",
)
CORS_EXPOSE_HEADERS = ["ETag"]
DEBUG = True
USE_SWAGGER = True
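The new CORS settings let browsers send `If-None-Match` / `If-Modified-Since` and read the `ETag` the backend now emits, enabling conditional requests. A self-contained sketch of that flow (the hashing scheme is an assumption for illustration, not the server's actual ETag algorithm):

```python
import hashlib


def make_etag(content: bytes) -> str:
    # Strong ETag: a quoted digest of the body (illustrative choice).
    return f'"{hashlib.sha256(content).hexdigest()}"'


def conditional_get(content: bytes, if_none_match=None):
    """Return (status, body, etag), answering 304 when the validator matches."""
    etag = make_etag(content)
    if if_none_match == etag:
        # Client cache is still fresh: no body, just 304 Not Modified.
        return 304, b"", etag
    return 200, content, etag
```

A client (here, the service worker) stores the `ETag` from the first 200 response, replays it in `If-None-Match`, and reuses its cached body whenever it gets a 304 back.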


@@ -1,165 +0,0 @@
import { Page, expect, test } from '@playwright/test';
import { createDoc, goToGridDoc, mockedDocument } from './utils-common';
import { openSuggestionMenu, writeInEditor } from './utils-editor';
const openPresenter = async (page: Page) => {
await page.getByLabel('Open the document options').click();
await page.getByRole('menuitem', { name: 'Present' }).click();
const overlay = page.getByRole('dialog', { name: 'Presenter mode' });
await expect(overlay).toBeVisible();
return overlay;
};
const insertDivider = async (page: Page) => {
const { suggestionMenu } = await openSuggestionMenu({ page });
await suggestionMenu.getByText('Divider', { exact: true }).click();
};
const writeMultiSlideDoc = async (page: Page) => {
const editor = await writeInEditor({ page, text: 'Slide one' });
await editor.press('Enter');
await insertDivider(page);
await editor.press('Enter');
await writeInEditor({ page, text: 'Slide two' });
await editor.press('Enter');
await insertDivider(page);
await editor.press('Enter');
await writeInEditor({ page, text: 'Slide three' });
};
test.beforeEach(async ({ page }) => {
await page.goto('/');
});
test.describe('Presenter Mode', () => {
test('opens the presenter overlay from the doc options menu and closes with Escape', async ({
page,
browserName,
}) => {
await createDoc(page, 'presenter-open', browserName, 1);
await writeInEditor({ page, text: 'Hello presenter' });
const overlay = await openPresenter(page);
await expect(
overlay.getByRole('toolbar', { name: 'Presenter controls' }),
).toBeVisible();
await expect(overlay.getByText('Hello presenter')).toBeVisible();
await page.keyboard.press('Escape');
await expect(overlay).toBeHidden();
});
test('renders a single-slide doc with counter 1/1 and disabled nav buttons', async ({
page,
browserName,
}) => {
await createDoc(page, 'presenter-single', browserName, 1);
await writeInEditor({ page, text: 'Slide A' });
const overlay = await openPresenter(page);
await expect(overlay.getByText('1 / 1')).toBeVisible();
await expect(
overlay.getByRole('button', { name: 'Previous slide' }),
).toBeDisabled();
await expect(
overlay.getByRole('button', { name: 'Next slide' }),
).toBeDisabled();
await expect(overlay.getByText('Slide A')).toBeVisible();
await overlay.getByRole('button', { name: 'Close presenter' }).click();
await expect(overlay).toBeHidden();
});
test('navigates between slides via the floating bar buttons', async ({
page,
browserName,
}) => {
await createDoc(page, 'presenter-nav-bar', browserName, 1);
await writeMultiSlideDoc(page);
const overlay = await openPresenter(page);
const prev = overlay.getByRole('button', { name: 'Previous slide' });
const next = overlay.getByRole('button', { name: 'Next slide' });
await expect(overlay.getByText('1 / 3')).toBeVisible();
await expect(overlay.getByText('Slide one')).toBeVisible();
await expect(prev).toBeDisabled();
await expect(next).toBeEnabled();
await next.click();
await expect(overlay.getByText('2 / 3')).toBeVisible();
await expect(overlay.getByText('Slide two')).toBeVisible();
await next.click();
await expect(overlay.getByText('3 / 3')).toBeVisible();
await expect(overlay.getByText('Slide three')).toBeVisible();
await expect(next).toBeDisabled();
await expect(prev).toBeEnabled();
await prev.click();
await expect(overlay.getByText('2 / 3')).toBeVisible();
await expect(overlay.getByText('Slide two')).toBeVisible();
});
test('navigates between slides via keyboard shortcuts', async ({
page,
browserName,
}) => {
await createDoc(page, 'presenter-nav-keyboard', browserName, 1);
await writeMultiSlideDoc(page);
const overlay = await openPresenter(page);
await expect(overlay.getByText('1 / 3')).toBeVisible();
await page.keyboard.press('ArrowRight');
await expect(overlay.getByText('2 / 3')).toBeVisible();
await page.keyboard.press('End');
await expect(overlay.getByText('3 / 3')).toBeVisible();
await page.keyboard.press('Home');
await expect(overlay.getByText('1 / 3')).toBeVisible();
// ArrowLeft on the first slide is clamped — counter stays at 1 / 3.
await page.keyboard.press('ArrowLeft');
await expect(overlay.getByText('1 / 3')).toBeVisible();
});
});
test.describe('Presenter Mode mobile', () => {
test.use({ viewport: { width: 500, height: 1200 } });
test.beforeEach(async ({ page }) => {
await page.goto('/');
});
test('hides the Present option on small mobile viewports', async ({
page,
}) => {
await mockedDocument(page, {
abilities: {
destroy: true,
link_configuration: true,
versions_destroy: true,
versions_list: true,
versions_retrieve: true,
accesses_manage: true,
accesses_view: true,
update: true,
partial_update: true,
retrieve: true,
},
});
await goToGridDoc(page);
await page.getByLabel('Open the document options').click();
await expect(page.getByRole('menuitem', { name: 'Present' })).toBeHidden();
});
});


@@ -250,22 +250,16 @@ export const waitForResponseCreateDoc = (page: Page) => {
};
export const mockedDocument = async (page: Page, data: object) => {
await page.route(/\**\/documents\/\**/, async (route) => {
// document/[ID]/ or document/[ID]/tree/ routes
await page.route(/.*\/documents\/[^/]+\/(?:$|tree\/.*)/, async (route) => {
const request = route.request();
if (
request.method().includes('GET') &&
!request.url().includes('page=') &&
!request.url().includes('versions') &&
!request.url().includes('accesses') &&
!request.url().includes('invitations')
) {
if (request.method().includes('GET') && !request.url().includes('page=')) {
const { abilities, ...doc } = data as unknown as {
abilities?: Record<string, unknown>;
};
await route.fulfill({
json: {
id: 'mocked-document-id',
content: '',
title: 'Mocked document',
path: '000000',
abilities: {
@@ -299,6 +293,17 @@ export const mockedDocument = async (page: Page, data: object) => {
await route.continue();
}
});
await page.route(/.*\/documents\/[^/]+\/content\/$/, async (route) => {
const request = route.request();
if (request.method().includes('GET')) {
await route.fulfill({
body: '',
});
} else {
await route.continue();
}
});
};
export const mockedListDocs = async (page: Page, data: object[] = []) => {

View File
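The route split above can be checked in isolation. A minimal sketch (the sample URLs are illustrative, not taken from the test suite) showing that the new metadata/tree regex no longer captures the content endpoint, which now has its own dedicated mock:

```typescript
// Regexes copied from the updated Playwright helpers above.
const DOC_OR_TREE_ROUTE = /.*\/documents\/[^/]+\/(?:$|tree\/.*)/;
const DOC_CONTENT_ROUTE = /.*\/documents\/[^/]+\/content\/$/;

// The document/tree mock matches the document root...
const matchesRoot = DOC_OR_TREE_ROUTE.test(
  'https://app.test/api/v1.0/documents/abc/',
);
// ...but not the content endpoint, so the two handlers no longer overlap.
const matchesContent = DOC_OR_TREE_ROUTE.test(
  'https://app.test/api/v1.0/documents/abc/content/',
);
const contentRouteMatches = DOC_CONTENT_ROUTE.test(
  'https://app.test/api/v1.0/documents/abc/content/',
);
```

With the old catch-all pattern, both requests hit the same handler, which is why the method/URL filtering inside it could be simplified once the routes were separated.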

@@ -27,25 +27,16 @@ export const overrideDocContent = async ({
browserName: BrowserName;
}) => {
// Override the content endpoint response with assets/base-content-test-pdf.txt
await page.route(/\**\/documents\/\**/, async (route) => {
await page.route(/.*\/documents\/[^/]+\/content\/$/, async (route) => {
const request = route.request();
if (
request.method().includes('GET') &&
!request.url().includes('page=') &&
!request.url().includes('versions') &&
!request.url().includes('accesses') &&
!request.url().includes('invitations')
) {
if (request.method() === 'GET') {
const response = await route.fetch();
const json = await response.json();
// eslint-disable-next-line @typescript-eslint/no-unsafe-member-access
json.content = fs.readFileSync(
path.join(__dirname, 'assets/base-content-test-pdf.txt'),
'utf-8',
);
void route.fulfill({
response,
body: JSON.stringify(json),
body: fs.readFileSync(
path.join(__dirname, 'assets/base-content-test-pdf.txt'),
'utf-8',
),
});
} else {
await route.continue();

View File

@@ -40,9 +40,9 @@
"@fontsource-variable/inter": "5.2.8",
"@fontsource-variable/material-symbols-outlined": "5.2.38",
"@fontsource/material-icons": "5.2.7",
"@gouvfr-lasuite/cunningham-react": "4.3.0",
"@gouvfr-lasuite/cunningham-react": "4.2.0",
"@gouvfr-lasuite/integration": "1.0.3",
"@gouvfr-lasuite/ui-kit": "0.20.1",
"@gouvfr-lasuite/ui-kit": "0.19.10",
"@hocuspocus/provider": "3.4.4",
"@mantine/core": "8.3.18",
"@mantine/hooks": "8.3.18",

View File

@@ -2,7 +2,7 @@ import {
Button,
ButtonProps,
Modal,
ModalDefaultVariantProps,
ModalProps,
ModalSize,
} from '@gouvfr-lasuite/cunningham-react';
import { ReactNode, useEffect } from 'react';
@@ -20,7 +20,7 @@ export type AlertModalProps = {
title: string;
cancelLabel?: string;
confirmLabel?: string;
} & Partial<ModalDefaultVariantProps>;
} & Partial<ModalProps>;
export const AlertModal = ({
cancelLabel,

View File

@@ -1,9 +1,5 @@
import {
Modal,
ModalDefaultVariantProps,
ModalSize,
} from '@gouvfr-lasuite/cunningham-react';
import { PropsWithChildren } from 'react';
import { Modal, ModalSize } from '@gouvfr-lasuite/cunningham-react';
import { ComponentPropsWithRef, PropsWithChildren } from 'react';
import { createGlobalStyle } from 'styled-components';
interface SideModalStyleProps {
@@ -39,7 +35,7 @@ const SideModalStyle = createGlobalStyle<SideModalStyleProps>`
}
`;
type SideModalType = Omit<ModalDefaultVariantProps, 'size'>;
type SideModalType = Omit<ComponentPropsWithRef<typeof Modal>, 'size'>;
type SideModalProps = SideModalType & Partial<SideModalStyleProps>;

View File

@@ -361,7 +361,7 @@
--c--globals--font--weights--medium: 500;
--c--globals--font--weights--bold: 600;
--c--globals--font--weights--extrabold: 800;
--c--globals--font--weights--black: 800;
--c--globals--font--weights--black: 900;
--c--globals--font--families--base:
inter variable, roboto flex variable, sans-serif;
--c--globals--font--families--accent:
@@ -849,18 +849,6 @@
--c--components--forms-checkbox--font-size: var(
--c--globals--font--sizes--sm
);
--c--components--forms-input--border-radius: 4px;
--c--components--forms-input--border-radius--hover: 4px;
--c--components--forms-input--border-radius--focus: 4px;
--c--components--forms-select--border-radius: 4px;
--c--components--forms-select--border-radius--hover: 4px;
--c--components--forms-select--border-radius--focus: 4px;
--c--components--forms-textarea--border-radius: 4px;
--c--components--forms-textarea--border-radius--hover: 4px;
--c--components--forms-textarea--border-radius--focus: 4px;
--c--components--forms-datepicker--border-radius: 4px;
--c--components--forms-datepicker--border-radius--hover: 4px;
--c--components--forms-datepicker--border-radius--focus: 4px;
--c--components--badge--font-size: var(--c--globals--font--sizes--xs);
--c--components--badge--border-radius: 12px;
--c--components--badge--padding-inline: var(--c--globals--spacings--xs);
@@ -1743,6 +1731,7 @@
--c--globals--font--sizes--xs-alt: 3rem;
--c--globals--font--weights--thin: 100;
--c--globals--font--weights--extrabold: 800;
--c--globals--font--weights--black: 900;
--c--globals--font--families--accent:
marianne, inter variable, roboto flex variable, sans-serif;
--c--globals--font--families--base:
@@ -2550,18 +2539,6 @@
--c--components--forms-checkbox--font-size: var(
--c--globals--font--sizes--sm
);
--c--components--forms-input--border-radius: 4px;
--c--components--forms-input--border-radius--hover: 4px;
--c--components--forms-input--border-radius--focus: 4px;
--c--components--forms-select--border-radius: 4px;
--c--components--forms-select--border-radius--hover: 4px;
--c--components--forms-select--border-radius--focus: 4px;
--c--components--forms-textarea--border-radius: 4px;
--c--components--forms-textarea--border-radius--hover: 4px;
--c--components--forms-textarea--border-radius--focus: 4px;
--c--components--forms-datepicker--border-radius: 4px;
--c--components--forms-datepicker--border-radius--hover: 4px;
--c--components--forms-datepicker--border-radius--focus: 4px;
--c--components--badge--font-size: var(--c--globals--font--sizes--xs);
--c--components--badge--border-radius: 12px;
--c--components--badge--padding-inline: var(--c--globals--spacings--xs);

View File

@@ -372,7 +372,7 @@ export const tokens = {
medium: 500,
bold: 600,
extrabold: 800,
black: 800,
black: 900,
},
families: {
base: 'Inter Variable, Roboto Flex Variable, sans-serif',
@@ -664,26 +664,6 @@ export const tokens = {
'body--background-color-hover': '#F0F0F3',
},
'forms-checkbox': { 'font-size': '0.875rem' },
'forms-input': {
'border-radius': '4px',
'border-radius--hover': '4px',
'border-radius--focus': '4px',
},
'forms-select': {
'border-radius': '4px',
'border-radius--hover': '4px',
'border-radius--focus': '4px',
},
'forms-textarea': {
'border-radius': '4px',
'border-radius--hover': '4px',
'border-radius--focus': '4px',
},
'forms-datepicker': {
'border-radius': '4px',
'border-radius--hover': '4px',
'border-radius--focus': '4px',
},
badge: {
'font-size': '0.75rem',
'border-radius': '12px',
@@ -1354,7 +1334,7 @@ export const tokens = {
'sm-alt': '3.5rem',
'xs-alt': '3rem',
},
weights: { thin: 100, extrabold: 800 },
weights: { thin: 100, extrabold: 800, black: 900 },
families: {
accent:
'Marianne, Inter Variable, Roboto Flex Variable, sans-serif',
@@ -1968,26 +1948,6 @@ export const tokens = {
'body--background-color-hover': '#F0F0F3',
},
'forms-checkbox': { 'font-size': '0.875rem' },
'forms-input': {
'border-radius': '4px',
'border-radius--hover': '4px',
'border-radius--focus': '4px',
},
'forms-select': {
'border-radius': '4px',
'border-radius--hover': '4px',
'border-radius--focus': '4px',
},
'forms-textarea': {
'border-radius': '4px',
'border-radius--hover': '4px',
'border-radius--focus': '4px',
},
'forms-datepicker': {
'border-radius': '4px',
'border-radius--hover': '4px',
'border-radius--focus': '4px',
},
badge: {
'font-size': '0.75rem',
'border-radius': '12px',

View File

@@ -22,7 +22,7 @@ import * as Y from 'yjs';
import { Box, TextErrors } from '@/components';
import { useConfig } from '@/core';
import { useCunninghamTheme } from '@/cunningham';
import { Doc, useProviderStore } from '@/docs/doc-management';
import { Doc } from '@/docs/doc-management';
import { avatarUrlFromName, useAuth } from '@/features/auth';
import { useAnalytics } from '@/libs/Analytics';
@@ -88,13 +88,12 @@ export const BlockNoteEditor = ({ doc, provider }: BlockNoteEditorProps) => {
const { user } = useAuth();
const { setEditor } = useEditorStore();
const { themeTokens } = useCunninghamTheme();
const { isSynced: isConnectedToCollabServer } = useProviderStore();
const refEditorContainer = useRef<HTMLDivElement>(null);
const canSeeComment = doc.abilities.comment;
// Determine if comments should be visible in the UI
const showComments = canSeeComment;
useSaveDoc(doc.id, provider.document, isConnectedToCollabServer);
useSaveDoc(doc.id, provider.document);
const { i18n, t } = useTranslation();
const langLocalesBN =
!i18n.resolvedLanguage || !(i18n.resolvedLanguage in localesBN)

View File

@@ -7,6 +7,7 @@ import {
Doc,
LinkReach,
getDocLinkReach,
useCollaboration,
useIsCollaborativeEditable,
useProviderStore,
} from '@/docs/doc-management';
@@ -79,6 +80,7 @@ interface DocEditorProps {
}
export const DocEditor = ({ doc }: DocEditorProps) => {
useCollaboration(doc.id);
const { isDesktop } = useResponsiveStore();
const { provider, isReady } = useProviderStore();
const { isEditable, isLoading } = useIsCollaborativeEditable(doc);

View File

@@ -18,7 +18,7 @@ export const LinkSelected = ({
isEditable,
onUpdateTitle,
}: LinkSelectedProps) => {
const { data: doc } = useDoc({ id: docId, withoutContent: true });
const { data: doc } = useDoc({ id: docId });
/**
* Update the content title if the referenced doc title changes

View File

@@ -43,7 +43,7 @@ describe('useSaveDoc', () => {
const addEventListenerSpy = vi.spyOn(window, 'addEventListener');
renderHook(() => useSaveDoc(docId, yDoc, true), {
renderHook(() => useSaveDoc(docId, yDoc), {
wrapper: AppWrapper,
});
@@ -65,17 +65,16 @@ describe('useSaveDoc', () => {
it('should save when there are local changes', async () => {
vi.useFakeTimers();
const yDoc = new Y.Doc();
const docId = 'test-doc-id';
const docId = self.crypto.randomUUID();
fetchMock.patch('http://test.jest/api/v1.0/documents/test-doc-id/', {
fetchMock.patch(`http://test.jest/api/v1.0/documents/${docId}/content/`, {
body: JSON.stringify({
id: 'test-doc-id',
id: docId,
content: 'test-content',
title: 'test-title',
}),
});
renderHook(() => useSaveDoc(docId, yDoc, true), {
renderHook(() => useSaveDoc(docId, yDoc), {
wrapper: AppWrapper,
});
@@ -94,7 +93,7 @@ describe('useSaveDoc', () => {
await waitFor(() => {
expect(fetchMock.lastCall()?.[0]).toBe(
'http://test.jest/api/v1.0/documents/test-doc-id/',
`http://test.jest/api/v1.0/documents/${docId}/content/`,
);
});
});
@@ -104,15 +103,17 @@ describe('useSaveDoc', () => {
const yDoc = new Y.Doc();
const docId = 'test-doc-id';
fetchMock.patch('http://test.jest/api/v1.0/documents/test-doc-id/', {
body: JSON.stringify({
id: 'test-doc-id',
content: 'test-content',
title: 'test-title',
}),
});
fetchMock.patch(
'http://test.jest/api/v1.0/documents/test-doc-id/content/',
{
body: JSON.stringify({
id: 'test-doc-id',
content: 'test-content',
}),
},
);
renderHook(() => useSaveDoc(docId, yDoc, true), {
renderHook(() => useSaveDoc(docId, yDoc), {
wrapper: AppWrapper,
});
@@ -132,7 +133,7 @@ describe('useSaveDoc', () => {
const docId = 'test-doc-id';
const removeEventListenerSpy = vi.spyOn(window, 'removeEventListener');
const { unmount } = renderHook(() => useSaveDoc(docId, yDoc, true), {
const { unmount } = renderHook(() => useSaveDoc(docId, yDoc), {
wrapper: AppWrapper,
});

View File

@@ -1,24 +1,36 @@
import { useRouter } from 'next/router';
import { useCallback, useEffect, useState } from 'react';
import { useCallback, useEffect, useRef, useState } from 'react';
import * as Y from 'yjs';
import { useUpdateDoc } from '@/docs/doc-management/';
import { useDocContentUpdate } from '@/docs/doc-management/api/useDocContentUpdate';
import { useProviderStore } from '@/docs/doc-management/stores/useProviderStore';
import { KEY_LIST_DOC_VERSIONS } from '@/docs/doc-versioning/api/useDocVersions';
import { useIsOffline } from '@/features/service-worker';
import { toBase64 } from '@/utils/string';
import { isFirefox } from '@/utils/userAgent';
const SAVE_INTERVAL = 60000;
export const useSaveDoc = (
docId: string,
yDoc: Y.Doc,
isConnectedToCollabServer: boolean,
) => {
const { mutate: updateDoc } = useUpdateDoc({
export const useSaveDoc = (docId: string, yDoc: Y.Doc) => {
/**
* isSynced is more reliable than isConnected in this case
* because it indicates that the content is fully synchronised
* with the yjs server
*/
const { isSynced: isConnectedToCollabServer } = useProviderStore();
const { isOffline } = useIsOffline();
const isSavingRef = useRef(false);
const { mutate: updateDocContent } = useDocContentUpdate({
listInvalidQueries: [KEY_LIST_DOC_VERSIONS],
isOptimistic: isOffline, // Enable optimistic updates when offline, to update the cache immediately
onSuccess: () => {
isSavingRef.current = false;
setIsLocalChange(false);
},
onError: () => {
isSavingRef.current = false;
},
});
const [isLocalChange, setIsLocalChange] = useState<boolean>(false);
@@ -64,18 +76,19 @@ export const useSaveDoc = (
}, [yDoc]);
const saveDoc = useCallback(() => {
if (!isLocalChange) {
if (!isLocalChange || isSavingRef.current) {
return false;
}
updateDoc({
isSavingRef.current = true;
updateDocContent({
id: docId,
content: toBase64(Y.encodeStateAsUpdate(yDoc)),
websocket: isConnectedToCollabServer,
});
return true;
}, [isLocalChange, updateDoc, docId, yDoc, isConnectedToCollabServer]);
}, [isLocalChange, updateDocContent, docId, yDoc, isConnectedToCollabServer]);
const router = useRouter();

View File
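useSaveDoc serialises the Yjs state with `toBase64(Y.encodeStateAsUpdate(yDoc))` before sending it to the content endpoint. The helper itself is not shown in this diff; a plausible sketch, assuming an environment where the global `btoa` is available (browsers, Node >= 16):

```typescript
// Hypothetical sketch of the toBase64 helper assumed by useSaveDoc: it turns
// a Uint8Array (a Yjs state update) into a base64 string suitable for JSON.
function toBase64(bytes: Uint8Array): string {
  let binary = '';
  for (const byte of bytes) {
    binary += String.fromCharCode(byte);
  }
  // btoa encodes a binary string as base64.
  return btoa(binary);
}
```

For large documents a chunked conversion avoids call-stack and performance pitfalls, but the simple loop shows the idea.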

@@ -1,5 +1,5 @@
import { Button, useModal } from '@gouvfr-lasuite/cunningham-react';
import { Present, useTreeContext } from '@gouvfr-lasuite/ui-kit';
import { useTreeContext } from '@gouvfr-lasuite/ui-kit';
import dynamic from 'next/dynamic';
import { useRouter } from 'next/router';
import { useState } from 'react';
@@ -79,14 +79,6 @@ const ModalExport =
)
: null;
const PresenterOverlay = dynamic(
() =>
import('@/docs/doc-presenter').then((mod) => ({
default: mod.PresenterOverlay,
})),
{ ssr: false },
);
interface DocToolBoxProps {
doc: Doc;
}
@@ -101,7 +93,6 @@ export const DocToolBox = ({ doc }: DocToolBoxProps) => {
const [isModalRemoveOpen, setIsModalRemoveOpen] = useState(false);
const [isModalExportOpen, setIsModalExportOpen] = useState(false);
const [isPresenterOpen, setIsPresenterOpen] = useState(false);
const selectHistoryModal = useModal();
const modalShare = useModal();
@@ -185,15 +176,6 @@ export const DocToolBox = ({ doc }: DocToolBoxProps) => {
showSeparator: true,
show: !emoji && doc.abilities.partial_update && !isTopRoot,
},
{
label: t('Present'),
icon: <Present />,
callback: () => {
setIsPresenterOpen(true);
},
show: !doc.deleted_at && !isSmallMobile,
testId: `docs-actions-present-${doc.id}`,
},
{
label: t('Copy link'),
icon: <AddLinkSVG width={24} height={24} aria-hidden="true" />,
@@ -338,15 +320,6 @@ export const DocToolBox = ({ doc }: DocToolBoxProps) => {
doc={doc}
/>
)}
{isPresenterOpen && (
<PresenterOverlay
doc={doc}
onClose={() => {
setIsPresenterOpen(false);
restoreFocus();
}}
/>
)}
</Box>
);
};

View File

@@ -6,15 +6,10 @@ import { Doc } from '../types';
export type DocParams = {
id: string;
withoutContent?: boolean;
};
export const getDoc = async ({
id,
withoutContent,
}: DocParams): Promise<Doc> => {
const params = withoutContent ? '?without_content=true' : '';
const response = await fetchAPI(`documents/${id}/${params}`);
export const getDoc = async ({ id }: DocParams): Promise<Doc> => {
const response = await fetchAPI(`documents/${id}/`);
if (!response.ok) {
throw new APIError('Failed to get the doc', await errorCauses(response));
@@ -24,7 +19,6 @@ export const getDoc = async ({
};
export const KEY_DOC = 'doc';
export const KEY_DOC_VISIBILITY = 'doc-visibility';
export function useDoc(
param: DocParams,

View File

@@ -0,0 +1,41 @@
import { UseQueryOptions, useQuery } from '@tanstack/react-query';
import { validate as uuidValidate } from 'uuid';
import { APIError, errorCauses, fetchAPI } from '@/api';
export type DocContentParams = {
id: string;
};
export const getDocContent = async ({
id,
}: DocContentParams): Promise<string> => {
if (!uuidValidate(id)) {
throw new Error(`Invalid doc id in getDocContent: ${id}`);
}
const response = await fetchAPI(`documents/${id}/content/`, {
headers: {
accept: 'text/plain,application/json',
},
});
if (!response.ok) {
throw new APIError('Failed to get the doc', await errorCauses(response));
}
return response.text();
};
export const KEY_DOC_CONTENT = 'doc-content';
export function useDocContent(
param: DocContentParams,
queryConfig?: UseQueryOptions<string, APIError, string>,
) {
return useQuery<string, APIError, string>({
queryKey: queryConfig?.queryKey ?? [KEY_DOC_CONTENT, param],
queryFn: () => getDocContent(param),
...queryConfig,
});
}

View File
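getDocContent guards against non-UUID ids before hitting the network, which is why the useSaveDoc unit tests above switched from the literal `'test-doc-id'` to `self.crypto.randomUUID()`. The real check is the `validate` function from the `uuid` package; a rough regex approximation (restricted to RFC 4122 versions 1-5, an assumption on my part) behaves like this:

```typescript
// Approximate stand-in for uuid's validate(): accepts RFC 4122 v1-v5 UUIDs.
const UUID_RE =
  /^[0-9a-f]{8}-[0-9a-f]{4}-[1-5][0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;

function isLikelyUuid(id: string): boolean {
  return UUID_RE.test(id);
}
```

Failing fast on a malformed id turns a confusing 404 (or an accidental match on another route) into an immediate, descriptive error.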

@@ -0,0 +1,126 @@
import {
UseMutationOptions,
useMutation,
useQueryClient,
} from '@tanstack/react-query';
import { validate as uuidValidate } from 'uuid';
import { APIError, errorCauses, fetchAPI } from '@/api';
import { Doc } from '../types';
import { KEY_CAN_EDIT } from './useDocCanEdit';
import { KEY_DOC_CONTENT } from './useDocContent';
export interface UpdateDocContentParams {
id: Doc['id'];
content: string; // Base64 encoded content
websocket?: boolean;
}
export const updateDocContent = async ({
id,
content,
websocket,
}: UpdateDocContentParams): Promise<void> => {
if (!uuidValidate(id)) {
throw new Error(`Invalid doc id in updateDocContent: ${id}`);
}
const response = await fetchAPI(`documents/${id}/content/`, {
method: 'PATCH',
body: JSON.stringify({
content,
websocket,
}),
});
if (!response.ok) {
throw new APIError(
'Failed to update the doc content',
await errorCauses(response),
);
}
};
type UseDocContentUpdate = UseMutationOptions<
void,
APIError,
UpdateDocContentParams
> & {
isOptimistic?: boolean;
listInvalidQueries?: string[];
};
export function useDocContentUpdate(queryConfig?: UseDocContentUpdate) {
const queryClient = useQueryClient();
return useMutation<void, APIError, UpdateDocContentParams>({
mutationFn: updateDocContent,
...queryConfig,
onMutate: (variables) => {
/**
* If optimistic, we update the content cache immediately with the new content
* This is useful in offline mode, where onSuccess is not always triggered.
*/
if (queryConfig?.isOptimistic) {
const previousContent = queryClient.getQueryData([
KEY_DOC_CONTENT,
{ id: variables.id },
]);
queryClient.setQueryData(
[KEY_DOC_CONTENT, { id: variables.id }],
variables.content,
);
return { previousContent };
}
},
onSuccess: (data, variables, onMutateResult, context) => {
if (!queryConfig?.isOptimistic) {
/**
* If not optimistic, we need to update the content cache with the new content returned
* from the server
*/
queryClient.setQueryData(
[KEY_DOC_CONTENT, { id: variables.id }],
variables.content,
);
}
queryConfig?.listInvalidQueries?.forEach((queryKey) => {
void queryClient.resetQueries({
queryKey: [queryKey],
});
});
if (queryConfig?.onSuccess) {
void queryConfig.onSuccess(data, variables, onMutateResult, context);
}
},
onError: (error, variables, onMutateResult, context) => {
if (
queryConfig?.isOptimistic &&
(onMutateResult as { previousContent: unknown })?.previousContent
) {
const previousContent = (onMutateResult as { previousContent: unknown })
.previousContent;
queryClient.setQueryData(
[KEY_DOC_CONTENT, { id: variables.id }],
previousContent,
);
}
// If there is an error, the user is probably not allowed to edit the doc,
// so we invalidate the canEdit query to update the UI accordingly
void queryClient.invalidateQueries({
queryKey: [KEY_CAN_EDIT],
});
if (queryConfig?.onError) {
queryConfig.onError(error, variables, onMutateResult, context);
}
},
});
}

View File
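The onMutate/onError pair above implements the classic optimistic-update-with-rollback pattern: snapshot the cached value, write the new one immediately, and restore the snapshot if the mutation later fails. Stripped of TanStack Query, the pattern looks like this (names are illustrative, not the real API):

```typescript
// Framework-free sketch of the optimistic update used in useDocContentUpdate.
type Cache = Map<string, string>;

function optimisticUpdate(
  cache: Cache,
  key: string,
  next: string,
): { rollback: () => void } {
  // Snapshot the previous value (onMutate)...
  const previous = cache.get(key);
  // ...write the optimistic value immediately...
  cache.set(key, next);
  return {
    rollback: () => {
      // ...and restore the snapshot if the mutation errors (onError).
      if (previous === undefined) {
        cache.delete(key);
      } else {
        cache.set(key, previous);
      }
    },
  };
}
```

Returning the rollback closure (rather than the raw snapshot) keeps the restore logic next to the write, mirroring how the hook threads `previousContent` from onMutate to onError.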

@@ -17,8 +17,8 @@ import { toBase64 } from '@/utils/string';
import { useProviderStore } from '../stores';
import { Doc } from '../types';
import { useDocContentUpdate } from './useDocContentUpdate';
import { KEY_LIST_DOC } from './useDocs';
import { useUpdateDoc } from './useUpdateDoc';
interface DuplicateDocPayload {
docId: string;
@@ -62,7 +62,7 @@ export function useDuplicateDoc(options?: DuplicateDocOptions) {
const { t } = useTranslation();
const { provider } = useProviderStore();
const { mutateAsync: updateDoc } = useUpdateDoc({
const { mutateAsync: updateDocContent } = useDocContentUpdate({
listInvalidQueries: [KEY_LIST_DOC_VERSIONS],
});
@@ -75,7 +75,7 @@ export function useDuplicateDoc(options?: DuplicateDocOptions) {
provider.document.guid === variables.docId;
if (canSave) {
await updateDoc({
await updateDocContent({
id: variables.docId,
content: toBase64(Y.encodeStateAsUpdate(provider.document)),
});

View File

@@ -8,12 +8,10 @@ import { APIError, errorCauses, fetchAPI } from '@/api';
import { Doc } from '../types';
import { KEY_CAN_EDIT } from './useDocCanEdit';
export type UpdateDocParams = Pick<Doc, 'id'> &
Partial<Pick<Doc, 'content' | 'title'>> & {
websocket?: boolean;
};
export interface UpdateDocParams {
id: Doc['id'];
title?: string;
}
export const updateDoc = async ({
id,
@@ -33,7 +31,7 @@ export const updateDoc = async ({
return response.json() as Promise<Doc>;
};
type UseUpdateDoc = UseMutationOptions<Doc, APIError, Partial<Doc>> & {
type UseUpdateDoc = UseMutationOptions<Doc, APIError, UpdateDocParams> & {
listInvalidQueries?: string[];
};
@@ -54,12 +52,6 @@ export function useUpdateDoc(queryConfig?: UseUpdateDoc) {
}
},
onError: (error, variables, onMutateResult, context) => {
// If error it means the user is probably not allowed to edit the doc
// so we invalidate the canEdit query to update the UI accordingly
void queryClient.invalidateQueries({
queryKey: [KEY_CAN_EDIT],
});
if (queryConfig?.onError) {
queryConfig.onError(error, variables, onMutateResult, context);
}

View File

@@ -15,6 +15,7 @@ import { useConfig } from '@/core';
import { KEY_LIST_DOC_TRASHBIN } from '@/docs/docs-grid';
import { useKeyboardAction } from '@/hooks';
import { KEY_DOC } from '../api';
import { KEY_LIST_DOC } from '../api/useDocs';
import { useRemoveDoc } from '../api/useRemoveDoc';
import { useDocUtils } from '../hooks';
@@ -44,7 +45,7 @@ export const ModalRemoveDoc = ({
isError,
error,
} = useRemoveDoc({
listInvalidQueries: [KEY_LIST_DOC, KEY_LIST_DOC_TRASHBIN],
listInvalidQueries: [KEY_LIST_DOC, KEY_LIST_DOC_TRASHBIN, KEY_DOC],
options: {
onSuccess: () => {
if (onSuccess) {

View File

@@ -1,29 +1,97 @@
import { useQueryClient } from '@tanstack/react-query';
import { useEffect } from 'react';
import { useCollaborationUrl } from '@/core/config';
import {
KEY_DOC_CONTENT,
useDocContent,
} from '@/docs/doc-management/api/useDocContent';
import { useProviderStore } from '@/docs/doc-management/stores/useProviderStore';
import { useIsOffline } from '@/features/service-worker/hooks/useOffline';
import { useBroadcastStore } from '@/stores/useBroadcastStore';
import { useProviderStore } from '../stores/useProviderStore';
import { Base64 } from '../types';
import { KEY_DOC } from '../api';
export const useCollaboration = (room?: string, initialContent?: Base64) => {
export const useCollaboration = (room: string) => {
const collaborationUrl = useCollaborationUrl(room);
const { addTask } = useBroadcastStore();
const queryClient = useQueryClient();
const { setBroadcastProvider, cleanupBroadcast } = useBroadcastStore();
const { provider, createProvider, destroyProvider } = useProviderStore();
const {
provider,
createProvider,
destroyProvider,
setReady,
isReady,
hasLostConnection,
resetLostConnection,
} = useProviderStore();
const isOffline = useIsOffline((state) => state.isOffline);
const { data: docContent } = useDocContent(
{ id: room },
{
staleTime: 30000, // 30 seconds - keep the data fresh because this is a highly collaborative page
queryKey: [KEY_DOC_CONTENT, { id: room }],
},
);
/**
* When offline, the WebSocket never connects so the provider would stay
* in a non-ready state for a long time. Immediately mark it as ready so
* the editor can render with the cached content.
*/
useEffect(() => {
if (!room || !collaborationUrl || provider) {
if (isOffline && provider && !isReady) {
setReady(true);
}
}, [isOffline, isReady, provider, setReady]);
/**
* When the provider detects a lost connection, we invalidate the document query to trigger a refetch,
* because the disconnection may mean the user's access to the document has changed
* (e.g., permissions changed, document deleted, user removed).
*/
useEffect(() => {
if (hasLostConnection && room) {
void queryClient.invalidateQueries({
queryKey: [KEY_DOC, { id: room }],
});
resetLostConnection();
}
}, [hasLostConnection, room, queryClient, resetLostConnection]);
/**
* We add a broadcast task to reset the query cache
* when the document visibility changes.
*/
useEffect(() => {
if (!room || !isReady) {
return;
}
const newProvider = createProvider(collaborationUrl, room, initialContent);
addTask(`${KEY_DOC}-${room}`, () => {
void queryClient.invalidateQueries({
queryKey: [KEY_DOC, { id: room }],
});
});
}, [addTask, room, queryClient, isReady]);
/**
* Set the provider when the collaboration URL and the document content are available.
*/
useEffect(() => {
if (!room || !collaborationUrl || provider || docContent === undefined) {
return;
}
const newProvider = createProvider(collaborationUrl, room, docContent);
setBroadcastProvider(newProvider);
}, [
provider,
collaborationUrl,
room,
initialContent,
createProvider,
docContent,
room,
setBroadcastProvider,
]);

View File
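The last effect above only creates the Hocuspocus provider once everything it needs is in hand. Note that the `docContent === undefined` check gates on a missing fetch, while an empty string (a blank document) is allowed through. The condition, extracted as a pure predicate for clarity (illustrative names, not part of the hook's API):

```typescript
// Mirror of the guard in useCollaboration's provider-creation effect.
function canCreateProvider(opts: {
  room?: string;
  collaborationUrl?: string;
  hasProvider: boolean;
  docContent: string | undefined;
}): boolean {
  const { room, collaborationUrl, hasProvider, docContent } = opts;
  // A fetched-but-empty document ('') is valid; only undefined blocks creation.
  return (
    Boolean(room && collaborationUrl) &&
    !hasProvider &&
    docContent !== undefined
  );
}
```

This is why the effect depends on `docContent` coming from useDocContent: the provider must be seeded with the cached content before the editor renders, especially offline.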

@@ -12,6 +12,7 @@ export interface UseCollaborationStore {
initialDoc?: Base64,
) => HocuspocusProvider;
destroyProvider: () => void;
setReady: (value: boolean) => void;
provider: HocuspocusProvider | undefined;
isConnected: boolean;
isReady: boolean;
@@ -161,5 +162,6 @@ export const useProviderStore = create<UseCollaborationStore>((set, get) => ({
set(defaultValues);
},
setReady: (value: boolean) => set({ isReady: value }),
resetLostConnection: () => set({ hasLostConnection: false }),
}));

View File

@@ -53,7 +53,6 @@ export interface Doc {
title?: string;
children?: Doc[];
childrenCount?: number;
content?: Base64;
created_at: string;
creator: string;
deleted_at: string | null;
@@ -82,9 +81,12 @@ export interface Doc {
children_list: boolean;
collaboration_auth: boolean;
comment: boolean;
content_patch: boolean;
content_retrieve: boolean;
destroy: boolean;
duplicate: boolean;
favorite: boolean;
formatted_content: boolean;
invite_owner: boolean;
link_configuration: boolean;
media_auth: boolean;

View File

@@ -1,130 +0,0 @@
import { act, renderHook } from '@testing-library/react';
import { afterEach, beforeEach, describe, expect, test, vi } from 'vitest';
import { useBrowserFullscreen } from '../hooks/useBrowserFullscreen';
describe('useBrowserFullscreen', () => {
let fullscreenElement: Element | null = null;
const requestFullscreen = vi.fn(async () => {
fullscreenElement = document.documentElement;
document.dispatchEvent(new Event('fullscreenchange'));
});
const exitFullscreen = vi.fn(async () => {
fullscreenElement = null;
document.dispatchEvent(new Event('fullscreenchange'));
});
beforeEach(() => {
fullscreenElement = null;
Object.defineProperty(document, 'fullscreenElement', {
configurable: true,
get: () => fullscreenElement,
});
Object.defineProperty(document.documentElement, 'requestFullscreen', {
configurable: true,
value: requestFullscreen,
});
Object.defineProperty(document, 'exitFullscreen', {
configurable: true,
value: exitFullscreen,
});
requestFullscreen.mockClear();
exitFullscreen.mockClear();
});
afterEach(() => {
fullscreenElement = null;
});
test('initial state reflects current fullscreen state', () => {
const { result } = renderHook(() => useBrowserFullscreen());
expect(result.current.isFullscreen).toBe(false);
});
test('enter() requests fullscreen and updates state', async () => {
const { result } = renderHook(() => useBrowserFullscreen());
await act(async () => {
await result.current.enter();
});
expect(requestFullscreen).toHaveBeenCalledTimes(1);
expect(result.current.isFullscreen).toBe(true);
});
test('enter() is a no-op if already fullscreen', async () => {
fullscreenElement = document.documentElement;
const { result } = renderHook(() => useBrowserFullscreen());
await act(async () => {
await result.current.enter();
});
expect(requestFullscreen).not.toHaveBeenCalled();
});
test('exit() leaves fullscreen and updates state', async () => {
const { result } = renderHook(() => useBrowserFullscreen());
await act(async () => {
await result.current.enter();
});
await act(async () => {
await result.current.exit();
});
expect(exitFullscreen).toHaveBeenCalledTimes(1);
expect(result.current.isFullscreen).toBe(false);
});
test('toggle() flips state', async () => {
const { result } = renderHook(() => useBrowserFullscreen());
await act(async () => {
await result.current.toggle();
});
expect(result.current.isFullscreen).toBe(true);
await act(async () => {
await result.current.toggle();
});
expect(result.current.isFullscreen).toBe(false);
});
test('reacts to external fullscreenchange events', () => {
const { result } = renderHook(() => useBrowserFullscreen());
act(() => {
fullscreenElement = document.documentElement;
document.dispatchEvent(new Event('fullscreenchange'));
});
expect(result.current.isFullscreen).toBe(true);
});
test('exitIfOwned() exits when we initiated the fullscreen', async () => {
const { result } = renderHook(() => useBrowserFullscreen());
await act(async () => {
await result.current.enter();
});
await act(async () => {
await result.current.exitIfOwned();
});
expect(exitFullscreen).toHaveBeenCalledTimes(1);
});
test('exitIfOwned() is a no-op when fullscreen pre-exists', async () => {
fullscreenElement = document.documentElement;
const { result } = renderHook(() => useBrowserFullscreen());
await act(async () => {
await result.current.exitIfOwned();
});
expect(exitFullscreen).not.toHaveBeenCalled();
});
test('exitIfOwned() is a no-op after user exits fullscreen externally', async () => {
const { result } = renderHook(() => useBrowserFullscreen());
await act(async () => {
await result.current.enter();
});
// User presses Esc — fullscreen ends outside of our control.
act(() => {
fullscreenElement = null;
document.dispatchEvent(new Event('fullscreenchange'));
});
await act(async () => {
await result.current.exitIfOwned();
});
expect(exitFullscreen).not.toHaveBeenCalled();
});
});

View File

@@ -1,109 +0,0 @@
import { renderHook } from '@testing-library/react';
import { describe, expect, test, vi } from 'vitest';
import { usePresenterShortcuts } from '../hooks/usePresenterShortcuts';
const renderShortcuts = (
overrides: Partial<Parameters<typeof usePresenterShortcuts>[0]> = {},
) => {
const handlers = {
onPrev: vi.fn(),
onNext: vi.fn(),
onFirst: vi.fn(),
onLast: vi.fn(),
onToggleFullscreen: vi.fn(),
onClose: vi.fn(),
isFullscreen: false,
...overrides,
};
renderHook(() => usePresenterShortcuts(handlers));
return handlers;
};
const press = (init: KeyboardEventInit) => {
const event = new KeyboardEvent('keydown', { ...init, cancelable: true });
window.dispatchEvent(event);
return event;
};
describe('usePresenterShortcuts', () => {
test('ArrowLeft and PageUp call onPrev', () => {
const h = renderShortcuts();
press({ code: 'ArrowLeft' });
press({ code: 'PageUp' });
expect(h.onPrev).toHaveBeenCalledTimes(2);
});
test('ArrowRight, PageDown and Space call onNext', () => {
const h = renderShortcuts();
press({ code: 'ArrowRight' });
press({ code: 'PageDown' });
press({ code: 'Space' });
expect(h.onNext).toHaveBeenCalledTimes(3);
});
test('Home calls onFirst, End calls onLast', () => {
const h = renderShortcuts();
press({ code: 'Home' });
press({ code: 'End' });
expect(h.onFirst).toHaveBeenCalledTimes(1);
expect(h.onLast).toHaveBeenCalledTimes(1);
});
test('KeyF toggles fullscreen but ignores modifiers', () => {
const h = renderShortcuts();
press({ code: 'KeyF' });
press({ code: 'KeyF', metaKey: true });
press({ code: 'KeyF', ctrlKey: true });
expect(h.onToggleFullscreen).toHaveBeenCalledTimes(1);
});
test('Escape calls onClose only when not fullscreen', () => {
const h1 = renderShortcuts({ isFullscreen: false });
press({ code: 'Escape' });
expect(h1.onClose).toHaveBeenCalledTimes(1);
const h2 = renderShortcuts({ isFullscreen: true });
press({ code: 'Escape' });
expect(h2.onClose).not.toHaveBeenCalled();
});
test('Space prevents default to avoid page scroll', () => {
renderShortcuts();
const event = press({ code: 'Space' });
expect(event.defaultPrevented).toBe(true);
});
test('Arrow keys prevent default', () => {
renderShortcuts();
expect(press({ code: 'ArrowLeft' }).defaultPrevented).toBe(true);
expect(press({ code: 'ArrowRight' }).defaultPrevented).toBe(true);
});
test('non-arrow repeat events are ignored', () => {
const h = renderShortcuts();
press({ code: 'Space', repeat: true });
expect(h.onNext).not.toHaveBeenCalled();
});
test('arrow repeat events are accepted', () => {
const h = renderShortcuts();
press({ code: 'ArrowRight', repeat: true });
expect(h.onNext).toHaveBeenCalledTimes(1);
});
test('Space on a button is ignored to avoid native click double-trigger', () => {
const h = renderShortcuts();
const button = document.createElement('button');
document.body.appendChild(button);
button.dispatchEvent(
new KeyboardEvent('keydown', {
code: 'Space',
bubbles: true,
cancelable: true,
}),
);
expect(h.onNext).not.toHaveBeenCalled();
document.body.removeChild(button);
});
});


@@ -1,128 +0,0 @@
import { describe, expect, test } from 'vitest';
import { isEmptyBlock, splitBlocksIntoSlides } from '../hooks/useSlides';
const para = (text = 'hello') => ({
type: 'paragraph',
content: text === '' ? [] : [{ type: 'text', text }],
});
const heading = (text = 'Title', level = 1) => ({
type: 'heading',
content: [{ type: 'text', text }],
props: { level },
});
const divider = () => ({ type: 'divider' });
const image = () => ({ type: 'image', props: { url: 'x' } });
describe('isEmptyBlock', () => {
test('empty paragraph (no content array entries) is empty', () => {
expect(isEmptyBlock(para(''))).toBe(true);
});
test('whitespace-only paragraph is empty', () => {
expect(isEmptyBlock(para(' '))).toBe(true);
});
test('paragraph with text is not empty', () => {
expect(isEmptyBlock(para('hi'))).toBe(false);
});
test('heading with whitespace is empty', () => {
expect(isEmptyBlock(heading(' '))).toBe(true);
});
test('image is never empty', () => {
expect(isEmptyBlock(image() as any)).toBe(false);
});
test('divider is not "empty" (it is filtered separately)', () => {
expect(isEmptyBlock(divider() as any)).toBe(false);
});
test('block with children is not empty', () => {
const b = { type: 'paragraph', content: [], children: [para()] };
expect(isEmptyBlock(b as any)).toBe(false);
});
});
describe('splitBlocksIntoSlides', () => {
test('no divider yields one slide', () => {
const result = splitBlocksIntoSlides([para('a'), para('b')]);
expect(result).toHaveLength(1);
expect(result[0]).toHaveLength(2);
});
test('one divider yields two slides', () => {
const result = splitBlocksIntoSlides([para('a'), divider(), para('b')]);
expect(result).toHaveLength(2);
expect(result[0]).toHaveLength(1);
expect(result[1]).toHaveLength(1);
});
test('leading divider does not produce an empty slide', () => {
const result = splitBlocksIntoSlides([divider(), para('a')]);
expect(result).toHaveLength(1);
});
test('trailing divider does not produce an empty slide', () => {
const result = splitBlocksIntoSlides([para('a'), divider()]);
expect(result).toHaveLength(1);
});
test('consecutive dividers do not produce empty slides', () => {
const result = splitBlocksIntoSlides([
para('a'),
divider(),
divider(),
divider(),
para('b'),
]);
expect(result).toHaveLength(2);
});
test('empty doc yields one empty slide', () => {
const result = splitBlocksIntoSlides([]);
expect(result).toHaveLength(1);
expect(result[0]).toHaveLength(0);
});
test('divider-only doc yields one empty slide', () => {
const result = splitBlocksIntoSlides([divider(), divider()]);
expect(result).toHaveLength(1);
expect(result[0]).toHaveLength(0);
});
test('group of only empty paragraphs is dropped', () => {
const result = splitBlocksIntoSlides([
para('a'),
divider(),
para(''),
para(' '),
divider(),
para('b'),
]);
expect(result).toHaveLength(2);
expect(result[0][0]).toMatchObject({ content: [{ text: 'a' }] });
expect(result[1][0]).toMatchObject({ content: [{ text: 'b' }] });
});
test('group with one empty + one non-empty paragraph keeps only the non-empty', () => {
const result = splitBlocksIntoSlides([para(''), para('hi'), para(' ')]);
expect(result).toHaveLength(1);
expect(result[0]).toHaveLength(1);
expect(result[0][0]).toMatchObject({ content: [{ text: 'hi' }] });
});
test('image-only group is kept', () => {
const result = splitBlocksIntoSlides([para('a'), divider(), image()]);
expect(result).toHaveLength(2);
expect(result[1]).toHaveLength(1);
});
test('heading with whitespace is filtered', () => {
const result = splitBlocksIntoSlides([heading(' '), para('body')]);
expect(result).toHaveLength(1);
expect(result[0]).toHaveLength(1);
expect(result[0][0]).toMatchObject({ type: 'paragraph' });
});
});


@@ -1,118 +0,0 @@
import { Button } from '@gouvfr-lasuite/cunningham-react';
import {
ChevronLeft,
ChevronRight,
Maximize,
XMark,
} from '@gouvfr-lasuite/ui-kit';
import { useTranslation } from 'react-i18next';
import { css } from 'styled-components';
import { Box, Text } from '@/components';
interface PresenterFloatingBarProps {
index: number;
total: number;
isFullscreen: boolean;
onPrev: () => void;
onNext: () => void;
onToggleFullscreen: () => void;
onClose: () => void;
}
const barCss = css`
position: fixed;
bottom: 1.5rem;
left: 50%;
transform: translateX(-50%);
z-index: 1;
flex-direction: row !important;
align-items: center;
gap: 0.25rem;
padding: var(--c--globals--spacings--3xs, 4px);
border-radius: 8px;
font-variant-numeric: tabular-nums;
white-space: nowrap;
color: var(--c--contextuals--content--semantic--neutral--secondary);
border: 1px solid var(--c--contextuals--border--surface--primary);
background: var(--c--contextuals--background--surface--primary);
box-shadow: 0 2px 4px 0 rgba(0, 0, 0, 0.05);
`;
const separatorCss = css`
width: 1px;
height: 1.25rem;
background: var(--c--theme--colors--greyscale-200, #e5e5e5);
margin: 0 0.25rem;
`;
export const PresenterFloatingBar = ({
index,
total,
isFullscreen,
onPrev,
onNext,
onToggleFullscreen,
onClose,
}: PresenterFloatingBarProps) => {
const { t } = useTranslation();
const isFirst = index <= 0;
const isLast = index >= total - 1;
return (
<Box
$direction="row"
$align="center"
$css={barCss}
role="toolbar"
aria-label={t('Presenter controls')}
>
<Button
size="small"
color="neutral"
variant="tertiary"
disabled={isFirst}
onClick={onPrev}
aria-label={t('Previous slide')}
icon={<ChevronLeft />}
/>
<Text
as="span"
$size="sm"
$color="neutral"
aria-label={t('Slide {{current}} of {{total}}', {
current: index + 1,
total,
})}
>
{index + 1} / {total}
</Text>
<Button
size="small"
color="neutral"
variant="tertiary"
disabled={isLast}
onClick={onNext}
aria-label={t('Next slide')}
icon={<ChevronRight />}
/>
<Box $css={separatorCss} aria-hidden />
<Button
size="small"
color="neutral"
variant="tertiary"
onClick={onToggleFullscreen}
aria-label={isFullscreen ? t('Exit fullscreen') : t('Enter fullscreen')}
icon={<Maximize />}
/>
<Button
size="small"
color="neutral"
variant="tertiary"
onClick={onClose}
aria-label={t('Close presenter')}
icon={<XMark />}
/>
</Box>
);
};


@@ -1,188 +0,0 @@
import { useCallback, useEffect, useMemo, useRef, useState } from 'react';
import { createPortal } from 'react-dom';
import { useTranslation } from 'react-i18next';
import { css } from 'styled-components';
import { Box } from '@/components';
import { useEditorStore } from '@/docs/doc-editor/stores';
import { Doc } from '@/docs/doc-management';
import { useFocusStore } from '@/stores';
import { PRESENTER_WINDOW_RADIUS } from '../constants';
import { useBrowserFullscreen } from '../hooks/useBrowserFullscreen';
import { usePresenterShortcuts } from '../hooks/usePresenterShortcuts';
import { useSlides } from '../hooks/useSlides';
import { PresenterFloatingBar } from './PresenterFloatingBar';
import { PresenterSlide } from './PresenterSlide';
interface PresenterOverlayProps {
doc: Doc;
onClose: () => void;
}
const overlayCss = css`
position: fixed;
inset: 0;
z-index: 1000;
background: white;
display: flex;
flex-direction: column;
`;
const slideAreaCss = css`
flex: 1;
display: flex;
align-items: center;
justify-content: center;
overflow: hidden;
`;
const slideFrameCss = css`
width: min(80%, 1400px);
height: 100%;
background: white;
overflow-y: auto;
overflow-x: hidden;
justify-content: center;
align-items: center;
@media (max-width: 1000px) {
width: 95%;
}
`;
const slideWrapperCss = css`
width: 100%;
max-height: 100%;
box-sizing: border-box;
display: flex;
align-items: center;
justify-content: center;
`;
export const PresenterOverlay = ({
doc: _doc,
onClose,
}: PresenterOverlayProps) => {
const { t } = useTranslation();
const editor = useEditorStore((state) => state.editor);
const { addLastFocus } = useFocusStore();
// Snapshot the editor's blocks once at mount. Subsequent collaborator
// edits do not affect the ongoing presentation (by design).
const snapshotRef = useRef<unknown[] | null>(null);
if (snapshotRef.current === null) {
snapshotRef.current = editor ? [...editor.document] : [];
}
const snapshotBlocks = snapshotRef.current;
// The presenter is opened from a dropdown menu item which doesn't expose
// its trigger to the click handler — so we capture the previously focused
// element here, on mount, after the dropdown has restored focus to its
// trigger button. `restoreFocus()` is then called by the parent on close.
useEffect(() => {
if (typeof document === 'undefined') {
return;
}
addLastFocus(document.activeElement as HTMLElement | null);
}, [addLastFocus]);
const slides = useSlides(snapshotBlocks as { type: string }[]);
const [currentIndex, setCurrentIndex] = useState(0);
const total = slides.length;
const clamp = useCallback(
(i: number) => Math.max(0, Math.min(i, total - 1)),
[total],
);
const goPrev = useCallback(
() => setCurrentIndex((i) => clamp(i - 1)),
[clamp],
);
const goNext = useCallback(
() => setCurrentIndex((i) => clamp(i + 1)),
[clamp],
);
const goFirst = useCallback(() => setCurrentIndex(0), []);
const goLast = useCallback(
() => setCurrentIndex(clamp(total - 1)),
[clamp, total],
);
const { isFullscreen, enter, exitIfOwned, toggle } = useBrowserFullscreen();
useEffect(() => {
void enter();
return () => {
void exitIfOwned();
};
}, [enter, exitIfOwned]);
usePresenterShortcuts({
onPrev: goPrev,
onNext: goNext,
onFirst: goFirst,
onLast: goLast,
onToggleFullscreen: () => void toggle(),
onClose,
isFullscreen,
});
const mountedIndices = useMemo(() => {
const from = Math.max(0, currentIndex - PRESENTER_WINDOW_RADIUS);
const to = Math.min(total - 1, currentIndex + PRESENTER_WINDOW_RADIUS);
const indices: number[] = [];
for (let i = from; i <= to; i += 1) {
indices.push(i);
}
return indices;
}, [currentIndex, total]);
if (typeof document === 'undefined') {
return null;
}
return createPortal(
<Box
$css={overlayCss}
role="dialog"
aria-modal="true"
aria-label={t('Presenter mode')}
>
<Box $css={slideAreaCss}>
<Box $css={slideFrameCss}>
{mountedIndices.map((i) => (
<Box
key={i}
$css={css`
${slideWrapperCss};
${i === currentIndex ? '' : 'display: none;'}
`}
>
<PresenterSlide
blocks={slides[i] as unknown[]}
ariaLabel={t('Slide {{current}} of {{total}}', {
current: i + 1,
total,
})}
/>
</Box>
))}
</Box>
</Box>
<PresenterFloatingBar
index={currentIndex}
total={total}
isFullscreen={isFullscreen}
onPrev={goPrev}
onNext={goNext}
onToggleFullscreen={() => void toggle()}
onClose={onClose}
/>
</Box>,
document.body,
);
};


@@ -1,59 +0,0 @@
import { BlockNoteView } from '@blocknote/mantine';
import { useCreateBlockNote } from '@blocknote/react';
import { useTranslation } from 'react-i18next';
import { css } from 'styled-components';
import { Box } from '@/components';
import { blockNoteSchema } from '@/docs/doc-editor/components/BlockNoteEditor';
import { cssEditor } from '@/docs/doc-editor/styles';
interface PresenterSlideProps {
blocks: unknown[];
ariaLabel?: string;
}
const slideCss = css`
${cssEditor};
width: fit-content;
max-width: 100%;
margin: 0 auto;
padding: 0 1.5rem;
/* Hide editor chrome that may leak through despite editable={false} */
.bn-side-menu,
.bn-formatting-toolbar,
.bn-slash-menu {
display: none !important;
}
`;
export const PresenterSlide = ({ blocks, ariaLabel }: PresenterSlideProps) => {
const { t } = useTranslation();
const editor = useCreateBlockNote({
initialContent:
// BlockNote rejects an empty initialContent array; passing undefined lets it fall back to its default single empty paragraph.
blocks.length > 0
? (blocks as NonNullable<
Parameters<typeof useCreateBlockNote>[0]
>['initialContent'])
: undefined,
schema: blockNoteSchema,
});
return (
<Box
$css={slideCss}
role="group"
className="titi-presenter-slide"
aria-label={ariaLabel ?? t('Presenter slide')}
>
<BlockNoteView
editor={editor}
editable={false}
theme="light"
formattingToolbar={false}
slashMenu={false}
comments={false}
/>
</Box>
);
};


@@ -1,7 +0,0 @@
/**
* Half-window of slide renderers mounted around the current slide.
* Total mounted = 2 * PRESENTER_WINDOW_RADIUS + 1.
* 1 = three slides mounted (prev, current, next) — sweet spot between
* memory and navigation flash. Tune freely.
*/
export const PRESENTER_WINDOW_RADIUS = 1;
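As a standalone illustration of the windowing math described in the comment above (hypothetical helper; the real computation lives in the `mountedIndices` memo of `PresenterOverlay`):

```typescript
// Hypothetical helper mirroring PresenterOverlay's mountedIndices memo:
// mount the slides in [current - radius, current + radius], clamped to
// the deck bounds, so 2 * radius + 1 slides are mounted away from edges.
const windowIndices = (current: number, total: number, radius = 1): number[] => {
  const from = Math.max(0, current - radius);
  const to = Math.min(total - 1, current + radius);
  const indices: number[] = [];
  for (let i = from; i <= to; i += 1) {
    indices.push(i);
  }
  return indices;
};
```

With radius 1 and a 5-slide deck, slide 2 mounts [1, 2, 3] while slide 0 mounts only [0, 1].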


@@ -1,89 +0,0 @@
import { useCallback, useEffect, useRef, useState } from 'react';
const isCurrentlyFullscreen = () =>
typeof document !== 'undefined' && !!document.fullscreenElement;
export const useBrowserFullscreen = () => {
const [isFullscreen, setIsFullscreen] = useState<boolean>(
isCurrentlyFullscreen,
);
// Tracks whether the *current* fullscreen session was started by us.
// Prevents tearing down a fullscreen the user (or OS) had already
// entered before this hook was mounted.
const ownedRef = useRef(false);
useEffect(() => {
if (typeof document === 'undefined') {
return;
}
const handleChange = () => {
const fs = isCurrentlyFullscreen();
// Anytime fullscreen ends — Esc, our exit(), OS — release ownership.
if (!fs) {
ownedRef.current = false;
}
setIsFullscreen(fs);
};
document.addEventListener('fullscreenchange', handleChange);
return () => {
document.removeEventListener('fullscreenchange', handleChange);
};
}, []);
const enter = useCallback(async () => {
if (typeof document === 'undefined') {
return;
}
if (isCurrentlyFullscreen()) {
return;
}
if (!document.documentElement.requestFullscreen) {
return;
}
try {
await document.documentElement.requestFullscreen();
ownedRef.current = true;
} catch {
// Browsers reject the request when not triggered by a user gesture
// or when the API is unavailable. The presenter remains usable
// without fullscreen, so we swallow the rejection silently.
}
}, []);
const exit = useCallback(async () => {
if (typeof document === 'undefined') {
return;
}
if (!isCurrentlyFullscreen()) {
return;
}
if (!document.exitFullscreen) {
return;
}
try {
await document.exitFullscreen();
} catch {
// Ignore: nothing actionable if exit fails.
}
}, []);
// Same as exit() but bails out if we didn't initiate the fullscreen.
// Use this for cleanup-on-unmount so we don't yank a user out of a
// session they opened themselves before the presenter mounted.
const exitIfOwned = useCallback(async () => {
if (!ownedRef.current) {
return;
}
await exit();
}, [exit]);
const toggle = useCallback(async () => {
if (isCurrentlyFullscreen()) {
await exit();
} else {
await enter();
}
}, [enter, exit]);
return { isFullscreen, enter, exit, exitIfOwned, toggle };
};


@@ -1,99 +0,0 @@
import { useEffect } from 'react';
interface ShortcutHandlers {
onPrev: () => void;
onNext: () => void;
onFirst: () => void;
onLast: () => void;
onToggleFullscreen: () => void;
onClose: () => void;
isFullscreen: boolean;
}
const ARROW_CODES = new Set(['ArrowLeft', 'ArrowRight']);
export const usePresenterShortcuts = ({
onPrev,
onNext,
onFirst,
onLast,
onToggleFullscreen,
onClose,
isFullscreen,
}: ShortcutHandlers) => {
useEffect(() => {
const handleKeyDown = (event: KeyboardEvent) => {
if (event.repeat && !ARROW_CODES.has(event.code)) {
return;
}
switch (event.code) {
case 'ArrowLeft':
case 'PageUp':
event.preventDefault();
onPrev();
return;
case 'Space': {
// A focused button activates on `keyup` (native click). If we
// also call onNext() here on `keydown`, Space on the toolbar's
// Next button fires twice. Skip when the event target handles
// Space natively.
const target = event.target;
if (
target instanceof Element &&
target.closest(
'button, [role="button"], a, input, textarea, select, [contenteditable="true"]',
)
) {
return;
}
event.preventDefault();
onNext();
return;
}
case 'ArrowRight':
case 'PageDown':
event.preventDefault();
onNext();
return;
case 'Home':
event.preventDefault();
onFirst();
return;
case 'End':
event.preventDefault();
onLast();
return;
case 'KeyF':
if (event.ctrlKey || event.metaKey || event.altKey) {
return;
}
event.preventDefault();
onToggleFullscreen();
return;
case 'Escape':
// While fullscreen, the browser handles Esc natively (exits
// fullscreen) and we deliberately stay open. Once out of
// fullscreen, Esc closes the presenter.
if (!isFullscreen) {
event.preventDefault();
onClose();
}
return;
}
};
window.addEventListener('keydown', handleKeyDown);
return () => {
window.removeEventListener('keydown', handleKeyDown);
};
}, [
onPrev,
onNext,
onFirst,
onLast,
onToggleFullscreen,
onClose,
isFullscreen,
]);
};


@@ -1,81 +0,0 @@
import { useMemo } from 'react';
type Block = {
type: string;
content?: unknown;
children?: Block[];
};
const TEXT_BEARING_TYPES = new Set([
'paragraph',
'heading',
'bulletListItem',
'numberedListItem',
'checkListItem',
'quote',
]);
const extractText = (content: unknown): string => {
if (!content) {
return '';
}
if (typeof content === 'string') {
return content;
}
if (Array.isArray(content)) {
return content.map(extractText).join('');
}
if (typeof content === 'object') {
const obj = content as Record<string, unknown>;
if (typeof obj.text === 'string') {
return obj.text;
}
if ('content' in obj) {
return extractText(obj.content);
}
}
return '';
};
export const isEmptyBlock = (block: Block): boolean => {
if (!TEXT_BEARING_TYPES.has(block.type)) {
return false;
}
if (block.children && block.children.length > 0) {
return false;
}
return extractText(block.content).trim() === '';
};
/**
* Split a flat list of top-level blocks into slide groups.
*
* - Each `divider` block separates two slides; the divider itself is dropped.
* - Empty text-bearing blocks (paragraph, heading, ...) are filtered out.
* - Groups that are empty after filtering are removed entirely.
* - The returned array is never empty: an empty doc yields one empty group.
*/
export const splitBlocksIntoSlides = <T extends Block>(blocks: T[]): T[][] => {
const groups: T[][] = [];
let current: T[] = [];
for (const block of blocks) {
if (block.type === 'divider') {
groups.push(current);
current = [];
continue;
}
current.push(block);
}
groups.push(current);
const cleaned = groups
.map((group) => group.filter((b) => !isEmptyBlock(b)))
.filter((group) => group.length > 0);
return cleaned.length > 0 ? cleaned : [[]];
};
export const useSlides = <T extends Block>(blocks: T[]): T[][] => {
return useMemo(() => splitBlocksIntoSlides(blocks), [blocks]);
};
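A minimal standalone sketch of the divider-splitting rule documented above (it deliberately omits the empty-block filtering for brevity):

```typescript
// Simplified re-statement of splitBlocksIntoSlides: dividers separate
// groups and are dropped; empty groups vanish; an all-divider or empty
// input still yields exactly one (empty) slide.
type SketchBlock = { type: string };

const splitSketch = (blocks: SketchBlock[]): SketchBlock[][] => {
  const groups: SketchBlock[][] = [[]];
  for (const block of blocks) {
    if (block.type === 'divider') {
      groups.push([]);
    } else {
      groups[groups.length - 1].push(block);
    }
  }
  const cleaned = groups.filter((group) => group.length > 0);
  return cleaned.length > 0 ? cleaned : [[]];
};
```

For example, a paragraph, a divider, and a paragraph yield two one-block slides, while a divider-only input yields `[[]]`.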


@@ -1 +0,0 @@
export { PresenterOverlay } from './components/PresenterOverlay';


@@ -10,12 +10,8 @@ import { createGlobalStyle } from 'styled-components';
import { Box, Text } from '@/components';
import { useEditorStore } from '@/docs/doc-editor/stores';
import {
Doc,
base64ToYDoc,
useProviderStore,
useUpdateDoc,
} from '@/docs/doc-management/';
import { Doc, base64ToYDoc, useProviderStore } from '@/docs/doc-management/';
import { useDocContentUpdate } from '@/docs/doc-management/api/useDocContentUpdate';
import { useDocVersion } from '../api';
import { KEY_LIST_DOC_VERSIONS } from '../api/useDocVersions';
@@ -49,7 +45,7 @@ export const ModalConfirmationVersion = ({
const { toast } = useToastProvider();
const { provider } = useProviderStore();
const { threadStore } = useEditorStore();
const { mutate: updateDoc } = useUpdateDoc({
const { mutate: updateDocContent } = useDocContentUpdate({
listInvalidQueries: [KEY_LIST_DOC_VERSIONS],
onSuccess: () => {
const onDisplaySuccess = () => {
@@ -104,7 +100,7 @@ export const ModalConfirmationVersion = ({
return;
}
updateDoc({
updateDocContent({
id: docId,
content: version.content,
});


@@ -1,5 +1,3 @@
import { Doc } from '../doc-management/types';
export interface APIListVersions {
count: number;
is_truncated: boolean;
@@ -15,7 +13,7 @@ export interface Versions {
}
export interface Version {
content: Doc['content'];
content: string; // Base64 encoded content
last_modified: string;
id: string;
}


@@ -11,6 +11,12 @@ export type DBRequest = {
key: string;
};
export interface DocContentCacheEntry {
etag: string;
lastModified: string;
content: string;
}
interface IDocsDB extends DBSchema {
'doc-list': {
key: string;
@@ -28,9 +34,13 @@ interface IDocsDB extends DBSchema {
key: 'version';
value: number;
};
'doc-content': {
key: string;
value: DocContentCacheEntry;
};
}
type TableName = 'doc-list' | 'doc-item' | 'doc-mutation';
type TableName = 'doc-list' | 'doc-item' | 'doc-mutation' | 'doc-content';
/**
 * IndexedDB prefers incremental versioning when upgrading the database,
@@ -78,6 +88,9 @@ export class DocsDB {
if (!db.objectStoreNames.contains('doc-version')) {
db.createObjectStore('doc-version');
}
if (!db.objectStoreNames.contains('doc-content')) {
db.createObjectStore('doc-content');
}
},
});
} catch (error) {
@@ -127,20 +140,35 @@ export class DocsDB {
*/
public static async cacheResponse(
key: string,
body: DocsResponse | Doc | DBRequest,
body: DocsResponse | Doc | DBRequest | DocContentCacheEntry,
tableName: TableName,
isRetry = false,
): Promise<void> {
const db = await DocsDB.open();
try {
await db.put(tableName, body, key);
} catch (error) {
console.error(
'SW: Failed to save response in IndexedDB',
error,
key,
body,
);
db.close();
// If the store is missing and we haven't retried yet, reset the DB once
// (handles a PR that added a store without a version bump).
// The isRetry guard prevents an infinite loop if the store name is invalid.
if (!isRetry && !db.objectStoreNames.contains(tableName)) {
console.warn(
'SW: Missing object store, resetting IndexedDB and retrying',
tableName,
);
await deleteDB(DocsDB.DBNAME);
await DocsDB.cacheResponse(key, body, tableName, true);
} else {
console.error(
'SW: Failed to save response in IndexedDB',
error,
key,
body,
);
}
return;
}
db.close();
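The reset-and-retry guard in `cacheResponse` above can be sketched in isolation (hypothetical helper names; the real code additionally checks `objectStoreNames` before deciding to reset):

```typescript
// Retry-once pattern: a failed write triggers one reset plus one retry;
// the isRetry flag guarantees a second failure surfaces instead of looping.
const writeWithReset = async (
  write: () => Promise<void>,
  reset: () => Promise<void>,
  isRetry = false,
): Promise<void> => {
  try {
    await write();
  } catch (error) {
    if (!isRetry) {
      await reset();
      return writeWithReset(write, reset, true);
    }
    throw error;
  }
};
```

The recursion depth is bounded at two attempts, so a persistent failure (e.g. an invalid store name) is reported rather than retried forever.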


@@ -1,6 +1,7 @@
import { afterEach, describe, expect, it, vi } from 'vitest';
import { RequestSerializer } from '../RequestSerializer';
import { SyncManager } from '../SyncManager';
import { ApiPlugin } from '../plugins/ApiPlugin';
const mockedGet = vi.fn().mockResolvedValue({});
@@ -108,6 +109,7 @@ describe('ApiPlugin', () => {
{ type: 'create', withClone: true },
{ type: 'list', withClone: false },
{ type: 'item', withClone: false },
{ type: 'content', withClone: false },
].forEach(({ type, withClone }) => {
it(`calls requestWillFetch with type ${type}`, async () => {
const mockedSync = vi.fn().mockResolvedValue({});
@@ -137,6 +139,60 @@ describe('ApiPlugin', () => {
});
});
it(`calls requestWillFetch with type content and sets If-None-Match when etag is cached`, async () => {
const mockedSync = vi.fn().mockResolvedValue({});
const apiPlugin = new ApiPlugin({
type: 'content',
tableName: 'doc-content',
syncManager: { sync: () => mockedSync() } as SyncManager,
});
mockedGet.mockResolvedValue({
etag: '"abc123"',
lastModified: '',
content: 'hello',
});
const requestInit = {
request: new Request('http://test.jest/documents/123456/content/'),
} as any;
const request = await apiPlugin.requestWillFetch?.(requestInit);
expect(mockedGet).toHaveBeenCalledWith(
'doc-content',
'http://test.jest/documents/123456/content/',
);
expect(request?.headers.get('If-None-Match')).toBe('"abc123"');
});
it(`calls requestWillFetch with type content and sets If-Modified-Since when only lastModified is cached`, async () => {
const mockedSync = vi.fn().mockResolvedValue({});
const apiPlugin = new ApiPlugin({
type: 'content',
tableName: 'doc-content',
syncManager: { sync: () => mockedSync() } as SyncManager,
});
mockedGet.mockResolvedValue({
etag: '',
lastModified: 'Mon, 14 Apr 2026 00:00:00 GMT',
content: 'hello',
});
const requestInit = {
request: new Request('http://test.jest/documents/123456/content/'),
} as any;
const request = await apiPlugin.requestWillFetch?.(requestInit);
expect(mockedGet).toHaveBeenCalledWith(
'doc-content',
'http://test.jest/documents/123456/content/',
);
expect(request?.headers.get('If-Modified-Since')).toBe(
'Mon, 14 Apr 2026 00:00:00 GMT',
);
});
it(`checks getApiCatchHandler`, async () => {
const response = ApiPlugin.getApiCatchHandler();
expect(await response.json()).toEqual({ error: 'Network is unavailable.' });
@@ -145,6 +201,7 @@ describe('ApiPlugin', () => {
[
{ type: 'list', tableName: 'doc-list' },
{ type: 'item', tableName: 'doc-item' },
{ type: 'content', tableName: 'doc-content' },
].forEach(({ type, tableName }) => {
it(`checks handlerDidError with type ${type}`, async () => {
const requestInit = {
@@ -156,7 +213,7 @@ describe('ApiPlugin', () => {
const apiPlugin = new ApiPlugin({
type: type as 'list' | 'item' | 'content',
tableName: tableName as 'doc-list' | 'doc-item' | 'doc-content',
syncManager: {} as any,
syncManager: {} as SyncManager,
});
await apiPlugin.fetchDidFail?.({} as any);
@@ -242,6 +299,72 @@ describe('ApiPlugin', () => {
expect(response?.status).toBe(200);
});
it(`checks handlerDidError with type content-update`, async () => {
const requestInit = {
request: {
url: 'http://test.jest/documents/123456/content/',
clone: () => mockedClone(),
headers: new Headers({
'Content-Type': 'application/json',
}),
arrayBuffer: () =>
RequestSerializer.objectToArrayBuffer({
content: 'test',
}),
json: () => ({
content: 'test',
}),
} as unknown as Request,
} as any;
const mockedClone = vi.fn().mockReturnValue(requestInit.request);
const mockedSync = vi.fn().mockResolvedValue({});
const apiPlugin = new ApiPlugin({
type: 'content-update',
syncManager: {
sync: () => mockedSync(),
} as any,
});
mockedGet.mockResolvedValue({
etag: '',
lastModified: '',
content: '',
});
await apiPlugin.requestWillFetch?.(requestInit);
await apiPlugin.fetchDidFail?.({} as any);
const response = await apiPlugin.handlerDidError?.(requestInit);
expect(mockedGet).toHaveBeenCalledWith(
'doc-content',
'http://test.jest/documents/123456/content/',
);
expect(mockedPut).toHaveBeenCalledWith(
'doc-mutation',
expect.objectContaining({
key: expect.any(String),
requestData: expect.objectContaining({
url: 'http://test.jest/documents/123456/content/',
headers: {
'content-type': 'application/json',
},
}),
}),
expect.any(String),
);
expect(mockedPut).toHaveBeenCalledWith(
'doc-content',
{ etag: '', lastModified: '', content: 'test' },
'http://test.jest/documents/123456/content/',
);
expect(mockedPut).toHaveBeenCalledTimes(2);
expect(mockedClose).toHaveBeenCalled();
expect(response?.status).toBe(204);
});
it(`checks handlerDidError with type delete`, async () => {
const requestInit = {
request: {
@@ -291,6 +414,10 @@ describe('ApiPlugin', () => {
'doc-item',
'http://test.jest/documents/123456/',
);
expect(mockedDelete).toHaveBeenCalledWith(
'doc-content',
'http://test.jest/documents/123456/content/',
);
expect(mockedGetAllKeys).toHaveBeenCalledWith('doc-list');
expect(mockedGet).toHaveBeenCalledWith(
'doc-list',
@@ -382,6 +509,15 @@ describe('ApiPlugin', () => {
expect.objectContaining({}),
'http://test.jest/documents/444555/',
);
expect(mockedPut).toHaveBeenCalledWith(
'doc-content',
expect.objectContaining({
content: '',
etag: '',
lastModified: '',
}),
'http://test.jest/documents/444555/content/',
);
expect(mockedPut).toHaveBeenCalledWith(
'doc-list',
expect.objectContaining({
@@ -398,7 +534,7 @@ describe('ApiPlugin', () => {
'doc-list',
'http://test.jest/documents/?page=1',
);
expect(mockedPut).toHaveBeenCalledTimes(3);
expect(mockedPut).toHaveBeenCalledTimes(4);
expect(mockedClose).toHaveBeenCalled();
expect(response?.status).toBe(201);
});
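The header-selection rule exercised by the two conditional-request tests above can be sketched standalone (hypothetical helper; it mirrors the precedence applied in `ApiPlugin.requestWillFetch`):

```typescript
// ETag takes precedence: send If-None-Match when an etag is cached,
// fall back to If-Modified-Since when only lastModified is cached,
// and send no conditional header when neither validator is available.
interface CachedValidators {
  etag: string;
  lastModified: string;
}

const conditionalHeaders = (entry?: CachedValidators): Record<string, string> => {
  if (!entry) {
    return {};
  }
  if (entry.etag) {
    return { 'If-None-Match': entry.etag };
  }
  if (entry.lastModified) {
    return { 'If-Modified-Since': entry.lastModified };
  }
  return {};
};
```

Preferring the ETag matches HTTP semantics: entity tags are the stronger validator, while Last-Modified only has one-second resolution.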


@@ -2,18 +2,19 @@ import { WorkboxPlugin } from 'workbox-core';
import { Doc, DocsResponse } from '@/docs/doc-management';
import { LinkReach, LinkRole, Role } from '@/docs/doc-management/types';
import { UpdateDocContentParams } from '@/features/docs/doc-management/api/useDocContentUpdate';
import { DBRequest, DocsDB } from '../DocsDB';
import { RequestSerializer } from '../RequestSerializer';
import { SyncManager } from '../SyncManager';
interface OptionsReadonly {
tableName: 'doc-list' | 'doc-item';
type: 'list' | 'item';
tableName: 'doc-list' | 'doc-item' | 'doc-content';
type: 'list' | 'item' | 'content';
}
interface OptionsMutate {
type: 'update' | 'delete' | 'create';
type: 'update' | 'delete' | 'create' | 'content-update';
}
interface OptionsSync {
@@ -51,34 +52,68 @@ export class ApiPlugin implements WorkboxPlugin {
request,
response,
}) => {
if (response.status !== 200) {
return response;
}
try {
// For content requests, a 304 means the document hasn't changed:
// transparently serve the cached version from IDB.
if (this.options.type === 'content' && response.status === 304) {
const db = await DocsDB.open();
const entry = await db.get('doc-content', request.url);
db.close();
if (entry) {
return new Response(entry.content, {
status: 200,
statusText: 'OK',
headers: {
'Content-Type': 'text/plain',
...(entry.etag && { ETag: entry.etag }),
...(entry.lastModified && {
'Last-Modified': entry.lastModified,
}),
},
});
}
}
if (this.options.type === 'list' || this.options.type === 'item') {
const tableName = this.options.tableName;
const body = (await response.clone().json()) as DocsResponse | Doc;
await DocsDB.cacheResponse(request.url, body, tableName);
}
if (this.options.type === 'update') {
const db = await DocsDB.open();
const storedResponse = await db.get('doc-item', request.url);
if (!storedResponse || !this.initialRequest) {
if (response.status !== 200) {
return response;
}
const bodyMutate = (await this.initialRequest
.clone()
.json()) as Partial<Doc>;
if (this.options.type === 'list' || this.options.type === 'item') {
const tableName = this.options.tableName;
const body = (await response.clone().json()) as DocsResponse | Doc;
await DocsDB.cacheResponse(request.url, body, tableName);
} else if (this.options.type === 'content') {
// Cache the content response with its ETag / Last-Modified to be
// able to use it for conditional requests and offline access.
const content = await response.clone().text();
const etag = response.headers.get('ETag') ?? '';
const lastModified = response.headers.get('Last-Modified') ?? '';
await DocsDB.cacheResponse(
request.url,
{ etag, lastModified, content },
'doc-content',
);
} else if (this.options.type === 'update') {
const db = await DocsDB.open();
const storedResponse = await db.get('doc-item', request.url);
const newResponse = {
...storedResponse,
...bodyMutate,
};
if (!storedResponse || !this.initialRequest) {
return response;
}
await DocsDB.cacheResponse(request.url, newResponse, 'doc-item');
const bodyMutate = (await this.initialRequest
.clone()
.json()) as Partial<Doc>;
const newResponse = {
...storedResponse,
...bodyMutate,
};
await DocsDB.cacheResponse(request.url, newResponse, 'doc-item');
}
} catch (error) {
console.error('SW: ApiPlugin fetchDidSucceed DB error', error);
}
return response;
@@ -100,6 +135,7 @@ export class ApiPlugin implements WorkboxPlugin {
requestWillFetch: WorkboxPlugin['requestWillFetch'] = async ({ request }) => {
if (
this.options.type === 'update' ||
this.options.type === 'content-update' ||
this.options.type === 'create' ||
this.options.type === 'delete'
) {
@@ -108,6 +144,27 @@ export class ApiPlugin implements WorkboxPlugin {
await this.options.syncManager.sync();
// For content requests, add If-None-Match / If-Modified-Since from IDB
// so the backend can return a 304 when the document hasn't changed.
if (this.options.type === 'content') {
try {
const db = await DocsDB.open();
const entry = await db.get('doc-content', request.url);
db.close();
if (entry?.etag || entry?.lastModified) {
const headers = new Headers(request.headers);
if (entry.etag) {
headers.set('If-None-Match', entry.etag);
} else {
headers.set('If-Modified-Since', entry.lastModified);
}
return new Request(request, { headers });
}
} catch (error) {
console.error('SW: ApiPlugin requestWillFetch content error', error);
}
}
return Promise.resolve(request);
};
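The conditional-request branch in `requestWillFetch` above can be sketched as a pure function. This is an illustration under assumptions, not the plugin's API: `conditionalHeaders` is a hypothetical name, and `CachedContent` mirrors the `{ etag, lastModified, content }` records the plugin keeps in the `doc-content` store.

```typescript
// Sketch: given a cached entry, pick the validator header for a conditional
// GET. ETag (If-None-Match) wins over Last-Modified (If-Modified-Since),
// mirroring the plugin's if/else above.
interface CachedContent {
  etag: string;
  lastModified: string;
  content: string;
}

function conditionalHeaders(
  entry: CachedContent | undefined,
): Record<string, string> {
  if (!entry || (!entry.etag && !entry.lastModified)) {
    return {}; // nothing cached: plain request, server returns full content
  }
  if (entry.etag) {
    return { 'If-None-Match': entry.etag };
  }
  return { 'If-Modified-Since': entry.lastModified };
}
```

With these headers attached, the backend can answer `304 Not Modified` and the service worker can serve the cached body, avoiding a re-download of unchanged content.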
@@ -116,7 +173,12 @@ export class ApiPlugin implements WorkboxPlugin {
*/
handlerDidError: WorkboxPlugin['handlerDidError'] = async ({ request }) => {
if (!this.isFetchDidFailed) {
// It could be a plugin error rather than a network error, so we retry the request without the plugin.
try {
return await fetch(request);
} catch {
return ApiPlugin.getApiCatchHandler();
}
}
switch (this.options.type) {
@@ -126,14 +188,33 @@ export class ApiPlugin implements WorkboxPlugin {
return this.handlerDidErrorDelete(request);
case 'update':
return this.handlerDidErrorUpdate(request);
case 'content-update':
return this.handlerDidErrorContentUpdate(request);
case 'list':
case 'item':
return this.handlerDidErrorRead(this.options.tableName, request.url);
case 'content':
return this.handlerDidErrorContent(request);
}
return Promise.resolve(ApiPlugin.getApiCatchHandler());
};
private queueMutation = async (request: Request): Promise<void> => {
const requestData = (
await RequestSerializer.fromRequest(request)
).toObject();
const serializeRequest: DBRequest = {
requestData,
key: `${Date.now()}`,
};
await DocsDB.cacheResponse(
serializeRequest.key,
serializeRequest,
'doc-mutation',
);
};
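`queueMutation` above stores serialized requests in the `doc-mutation` store under timestamp keys. A sketch of how a sync pass might replay them, under assumptions: the in-memory `Map` stands in for the IndexedDB store, and `replayQueued` is a hypothetical function, not the actual `SyncManager` implementation.

```typescript
// Hypothetical replay loop for queued offline mutations. Keys are
// Date.now() timestamps (as in queueMutation), so sorting them roughly
// preserves the order mutations were queued in.
type StoredRequest = {
  key: string;
  requestData: { url: string; method: string; body?: string };
};

async function replayQueued(
  store: Map<string, StoredRequest>,
  send: (r: StoredRequest['requestData']) => Promise<boolean>,
): Promise<number> {
  let replayed = 0;
  for (const key of [...store.keys()].sort()) {
    const entry = store.get(key)!;
    const ok = await send(entry.requestData);
    if (!ok) {
      break; // still offline: keep this and later mutations queued
    }
    store.delete(key); // only drop a mutation once the server accepted it
    replayed++;
  }
  return replayed;
}
```

Stopping at the first failure keeps mutations ordered: a later update is never applied before an earlier one it may depend on.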
private handlerDidErrorCreate = async (request: Request) => {
if (!this.initialRequest) {
return new Response('Request not found', { status: 404 });
@@ -169,7 +250,6 @@ export class ApiPlugin implements WorkboxPlugin {
const newResponse: Doc = {
title: '',
id: uuid,
created_at: new Date().toISOString(),
creator: 'dummy-id',
deleted_at: null,
@@ -190,9 +270,12 @@ export class ApiPlugin implements WorkboxPlugin {
children_list: true,
collaboration_auth: true,
comment: true,
content_patch: true,
content_retrieve: true,
destroy: true,
duplicate: true,
favorite: true,
formatted_content: true,
invite_owner: true,
link_configuration: true,
media_auth: true,
@@ -220,12 +303,26 @@ export class ApiPlugin implements WorkboxPlugin {
ancestors_link_role: undefined,
};
/**
 * Create a new document in the cache with the new id, so the client can use
 * it while offline; it will be updated later when the request is synced.
 */
await DocsDB.cacheResponse(
`${request.url}${uuid}/`,
newResponse,
'doc-item',
);
/**
 * Create an empty content entry for the new document in the cache, so the
 * client can use it while offline; it will be updated later when the request
 * is synced.
 */
await DocsDB.cacheResponse(
`${request.url}${uuid}/content/`,
{ etag: '', lastModified: '', content: '' },
'doc-content',
);
/**
* Add the new entry to the cache list.
*/
@@ -261,26 +358,14 @@ export class ApiPlugin implements WorkboxPlugin {
/**
* Queue the request in the cache 'doc-mutation' to sync it later.
*/
await this.queueMutation(this.initialRequest);
/**
* Delete item in the cache
*/
const db = await DocsDB.open();
await db.delete('doc-item', request.url);
await db.delete('doc-content', `${request.url}content/`);
/**
* Delete entry from the cache list.
@@ -327,20 +412,7 @@ export class ApiPlugin implements WorkboxPlugin {
/**
* Queue the request in the cache 'doc-mutation' to sync it later.
*/
await this.queueMutation(this.initialRequest);
/**
* Update the cache item with the new data.
@@ -418,4 +490,56 @@ export class ApiPlugin implements WorkboxPlugin {
},
});
};
private handlerDidErrorContent = async (request: Request) => {
const db = await DocsDB.open();
const entry = await db.get('doc-content', request.url);
db.close();
if (!entry) {
return Promise.resolve(ApiPlugin.getApiCatchHandler());
}
return new Response(entry.content, {
status: 200,
statusText: 'OK',
headers: {
'Content-Type': 'text/plain',
...(entry.etag && { ETag: entry.etag }),
...(entry.lastModified && { 'Last-Modified': entry.lastModified }),
},
});
};
/**
 * When the content update fails, we save the new content in the cache and sync it later with the
 * SyncManager. We return a 204 to tell the client the update succeeded, and we update the cached
 * content so the client can see the new content while offline.
 */
private handlerDidErrorContentUpdate = async (request: Request) => {
const db = await DocsDB.open();
const entry = await db.get('doc-content', request.url);
db.close();
if (!entry || !this.initialRequest) {
return new Response('Not found', { status: 404 });
}
await this.queueMutation(this.initialRequest);
const bodyMutate = (await this.initialRequest
.clone()
.json()) as Partial<UpdateDocContentParams>;
const newContent = bodyMutate.content ?? entry.content;
await DocsDB.cacheResponse(
request.url,
{ etag: '', lastModified: '', content: newContent },
'doc-content',
);
return new Response(null, {
status: 204,
statusText: 'No Content',
});
};
}
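The offline-merge rule in `handlerDidErrorContentUpdate` above can be isolated as a pure function. This is a sketch: `applyOfflineContentUpdate` and `ContentEntry` are illustrative names, assuming the entry shape the plugin stores in `doc-content`.

```typescript
// Sketch of the offline content merge: the PATCH body wins when it carries
// content, otherwise the cached content is kept. Clearing the validators
// (etag / lastModified) marks the entry as locally modified and not yet
// validated against the server.
interface ContentEntry {
  etag: string;
  lastModified: string;
  content: string;
}

function applyOfflineContentUpdate(
  entry: ContentEntry,
  patch: { content?: string },
): ContentEntry {
  return {
    etag: '',
    lastModified: '',
    content: patch.content ?? entry.content,
  };
}
```

Blanking the validators matters: a later conditional GET must not send the stale `ETag`, or the server could answer `304` and the locally modified content would never be reconciled.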


@@ -62,6 +62,47 @@ registerRoute(
'GET',
);
registerRoute(
({ url }) =>
isApiUrl(url.href) && /\/documents\/[a-z0-9-]+\/content\/$/.test(url.href),
new NetworkOnly({
plugins: [
new ApiPlugin({
tableName: 'doc-content',
type: 'content',
syncManager,
}),
new OfflinePlugin(),
],
}),
'GET',
);
/**
 * Mutate route for content updates.
 * If the content update fails, the request is saved in the cache and retried
 * later via the SyncManager.
 */
registerRoute(
({ url }) =>
isApiUrl(url.href) && /\/documents\/[a-z0-9-]+\/content\/$/.test(url.href),
new NetworkOnly({
plugins: [
new ApiPlugin({
type: 'content-update',
syncManager,
}),
new OfflinePlugin(),
],
}),
'PATCH',
);
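The route predicates above pair an API-origin check with a regex on the document-content path. A minimal sketch of that matcher, with the assumption that `isApiUrl` reduces to an origin prefix check (the real helper may differ):

```typescript
// Same path regex as the registerRoute predicates above: a lowercase
// id segment followed by a trailing /content/.
const CONTENT_PATH = /\/documents\/[a-z0-9-]+\/content\/$/;

function isContentRoute(href: string, apiOrigin: string): boolean {
  // Simplified stand-in for isApiUrl: same-origin check only.
  return href.startsWith(apiOrigin) && CONTENT_PATH.test(href);
}
```

Because the GET and PATCH registrations share the same predicate, the HTTP-method argument to `registerRoute` is what routes reads to the `content` plugin and writes to the `content-update` plugin.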
/**
 * Mutate route for document updates.
 * If the document update fails, the request is saved in the cache and retried
 * later via the SyncManager.
 */
registerRoute(
({ url }) => isDocumentApiUrl(url),
new NetworkOnly({


@@ -12,10 +12,8 @@ import {
Doc,
DocPage403,
KEY_DOC,
useDoc,
useDocStore,
useTrans,
} from '@/docs/doc-management/';
import { KEY_AUTH, setAuthUrl, useAuth } from '@/features/auth';
@@ -24,7 +22,6 @@ import { getDocChildren, subPageToTree } from '@/features/docs/doc-tree/';
import { DocEditorSkeleton, useSkeletonStore } from '@/features/skeletons';
import { MainLayout } from '@/layouts';
import { MAIN_LAYOUT_ID } from '@/layouts/conf';
import { NextPageWithLayout } from '@/types/next';
const DocEditor = dynamic(
@@ -78,7 +75,6 @@ interface DocProps {
}
const DocPage = ({ id }: DocProps) => {
const { isSkeletonVisible, setIsSkeletonVisible } = useSkeletonStore();
const {
data: docQuery,
@@ -88,7 +84,7 @@ const DocPage = ({ id }: DocProps) => {
} = useDoc(
{ id },
{
staleTime: 30000, // 30 seconds - We keep the data fresh as it is a highly collaborative page
queryKey: [KEY_DOC, { id }],
retryDelay: 1000,
retry: (failureCount, error) => {
@@ -103,10 +99,8 @@ const DocPage = ({ id }: DocProps) => {
const [doc, setDoc] = useState<Doc>();
const { setCurrentDoc } = useDocStore();
const queryClient = useQueryClient();
const { replace, asPath } = useRouter();
const { t } = useTranslation();
const { authenticated } = useAuth();
const { untitledDocument } = useTrans();
@@ -144,16 +138,6 @@ const DocPage = ({ id }: DocProps) => {
};
}, [id]);
useEffect(() => {
if (!docQuery || isFetching) {
return;
@@ -174,22 +158,6 @@ const DocPage = ({ id }: DocProps) => {
};
}, [setCurrentDoc, setIsSkeletonVisible]);
useEffect(() => {
if (!isError || !error?.status || [403].includes(error.status)) {
return;


@@ -19,13 +19,10 @@ describe('CollaborationBackend', () => {
const { fetchDocument } = await import('@/api/collaborationBackend');
const documentId = 'test-document-123';
await fetchDocument({ name: documentId }, { cookie: 'test-cookie' });
expect(axiosGetSpy).toHaveBeenCalledWith(
`http://app-dev:8000/api/v1.0/documents/${documentId}/`,
expect.objectContaining({
headers: expect.objectContaining({
'X-Y-Provider-Key': 'test-yprovider-key',

View File

@@ -228,7 +228,7 @@ describe('Server Tests', () => {
wsHocus.stopConnectionAttempt();
expect(data.reason).toBe('permission-denied');
expect(fetchDocumentMock).toHaveBeenCalledExactlyOnceWith(
{ name: room },
expect.any(Object),
);
wsHocus.webSocket?.close();
@@ -273,7 +273,7 @@ describe('Server Tests', () => {
wsHocus.stopConnectionAttempt();
expect(data.reason).toBe('permission-denied');
expect(fetchDocumentMock).toHaveBeenCalledExactlyOnceWith(
{ name: room },
expect.any(Object),
);
wsHocus.webSocket?.close();
@@ -322,7 +322,7 @@ describe('Server Tests', () => {
wsHocus.destroy();
expect(fetchDocumentMock).toHaveBeenCalledWith(
{ name: room },
expect.any(Object),
);
@@ -371,7 +371,7 @@ describe('Server Tests', () => {
wsHocus.destroy();
expect(fetchDocumentMock).toHaveBeenCalledWith(
{ name: room },
expect.any(Object),
);

View File

@@ -75,11 +75,10 @@ async function fetch<T>(
}
export function fetchDocument(
{ name }: { name: string },
requestHeaders: IncomingHttpHeaders,
): Promise<Doc> {
return fetch<Doc>(`/api/v1.0/documents/${name}/`, requestHeaders);
}
export function fetchCurrentUser(

View File

@@ -18,3 +18,6 @@ export const PORT = Number(process.env.PORT || 4444);
export const SENTRY_DSN = process.env.SENTRY_DSN || '';
export const COLLABORATION_BACKEND_BASE_URL =
process.env.COLLABORATION_BACKEND_BASE_URL || 'http://app-dev:8000';
export const COLLABORATION_INACTIVITY_TIMEOUT = Number(
process.env.COLLABORATION_INACTIVITY_TIMEOUT || 0,
);

View File

@@ -1,9 +1,14 @@
import { Request } from 'express';
import * as ws from 'ws';
import { COLLABORATION_INACTIVITY_TIMEOUT } from '@/env';
import { hocuspocusServer } from '@/servers/hocuspocusServer';
import { setupInactivityTimeout } from '@/utils';
export const collaborationWSHandler = (ws: ws.WebSocket, req: Request) => {
if (COLLABORATION_INACTIVITY_TIMEOUT > 0) {
setupInactivityTimeout(ws, COLLABORATION_INACTIVITY_TIMEOUT);
}
try {
hocuspocusServer.hocuspocus.handleConnection(ws, req);
} catch (error) {

View File

@@ -40,7 +40,7 @@ export const hocuspocusServer = new Server({
try {
const document = await fetchDocument(
{ name: documentName },
requestHeaders,
);


@@ -1,3 +1,5 @@
import * as ws from 'ws';
import { COLLABORATION_LOGGING } from './env';
export function logger(...args: unknown[]) {
@@ -9,3 +11,25 @@ export function logger(...args: unknown[]) {
export const toBase64 = function (str: Uint8Array) {
return Buffer.from(str).toString('base64');
};
export function setupInactivityTimeout(
socket: ws.WebSocket,
delayMs: number,
): void {
const closeInactive = () => {
logger('Closing inactive WebSocket connection after', delayMs, 'ms');
socket.close();
};
let timer = setTimeout(closeInactive, delayMs);
socket.on('message', () => {
logger('clear closeInactive timer');
clearTimeout(timer);
timer = setTimeout(closeInactive, delayMs);
});
socket.on('close', () => {
clearTimeout(timer);
});
}
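The inactivity-timeout pattern above can be exercised without a real WebSocket. A sketch under assumptions: `FakeSocket` is a hypothetical test double exposing only the `on`/`close` surface the helper touches, and the helper is restated here against that double rather than `ws.WebSocket`.

```typescript
import { EventEmitter } from 'node:events';

// Minimal test double: just the event surface setupInactivityTimeout uses.
class FakeSocket extends EventEmitter {
  closed = false;
  close(): void {
    this.closed = true;
    this.emit('close');
  }
}

// Same logic as the helper above: close after delayMs of silence, and let
// every incoming message reset the clock.
function setupInactivityTimeout(socket: FakeSocket, delayMs: number): void {
  const closeInactive = () => socket.close();
  let timer = setTimeout(closeInactive, delayMs);
  socket.on('message', () => {
    clearTimeout(timer);
    timer = setTimeout(closeInactive, delayMs);
  });
  socket.on('close', () => clearTimeout(timer)); // avoid a dangling timer
}
```

Clearing the timer on `close` is the important detail: without it, a socket closed by the client would still fire `closeInactive` later against a dead connection.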

File diff suppressed because it is too large


@@ -100,10 +100,9 @@ backend:
- uvicorn
- --app-dir=/app
- --host=0.0.0.0
- --timeout-graceful-shutdown=300
- --limit-max-requests=20000
- --lifespan=off
- --reload
- --reload-dir=/app
- "impress.asgi:application"
createsuperuser:
@@ -179,7 +178,7 @@ docSpec:
image:
repository: ghcr.io/docspecio/api
pullPolicy: IfNotPresent
tag: "3.0.1"
probes:
liveness:


@@ -31,9 +31,8 @@ backend:
DJANGO_EMAIL_URL_APP: https://{{ .Values.feature }}-docs.{{ .Values.domain }}
DJANGO_EMAIL_USE_SSL: False
FRONTEND_SILENT_LOGIN_ENABLED: True
LOGGING_LEVEL_LOGGERS_ROOT: DEBUG
LOGGING_LEVEL_LOGGERS_APP: DEBUG
OIDC_USERINFO_SHORTNAME_FIELD: "first_name"
OIDC_USERINFO_FULLNAME_FIELDS: "name"
OIDC_OP_JWKS_ENDPOINT: https://{{ .Values.feature }}-docs-keycloak.{{ .Values.domain }}/realms/docs/protocol/openid-connect/certs
@@ -154,7 +153,7 @@ docSpec:
image:
repository: ghcr.io/docspecio/api
pullPolicy: IfNotPresent
tag: "3.0.1"
probes:
liveness:


@@ -768,7 +768,7 @@ docSpec:
image:
repository: ghcr.io/docspecio/api
pullPolicy: IfNotPresent
tag: "3.0.1"
## @param docSpec.command Override the docSpec container command
command: []