Mirror of https://github.com/suitenumerique/docs.git, synced 2026-05-06 23:22:15 +02:00

Compare commits: improve-in... → refacto/su... (1 commit, SHA1 c89909bd1a)

CHANGELOG.md (31 lines changed)
@@ -8,35 +8,7 @@ and this project adheres to

### Changed

- ✨(backend) improve indexing command
  - checkpoint recovery
  - asynchronicity
  - admin command trigger
- 💄(frontend) improve comments highlights #1961

## [v4.8.3] - 2026-03-23

### Changed

- ♿️(frontend) improve version history list accessibility #2033
- ♿(frontend) focus skip link on headings and skip grid dropzone #1983
- ♿️(frontend) add sr-only format to export download button #2088
- ♿️(frontend) announce formatting shortcuts for screen readers #2070
- ✨(frontend) add markdown copy icon for Copy as Markdown option #2096
- ♻️(backend) skip saving in database a document when payload is empty #2062
- ♻️(frontend) refacto Version modal to fit with the design system #2091
- ⚡️(frontend) add debounce WebSocket reconnect #2104

### Fixed

- ♿️(frontend) fix more options menu feedback for screen readers #2071
- 💫(frontend) fix the help button to the bottom in tree #2073
- ♿️(frontend) fix aria-labels for table of contents #2065
- 🐛(backend) allow using search endpoint without refresh token enabled #2097
- 🐛(frontend) fix close panel when click on subdoc #2094
- 🐛(frontend) fix leftpanel button in doc version #9238
- 🐛(y-provider) fix loop when no cookies #2101

## [v4.8.2] - 2026-03-19

@@ -1181,8 +1153,7 @@ and this project adheres to

- ✨(frontend) Coming Soon page (#67)
- 🚀 Impress, project to manage your documents easily and collaboratively.

[unreleased]: https://github.com/suitenumerique/docs/compare/v4.8.3...main
[v4.8.3]: https://github.com/suitenumerique/docs/releases/v4.8.3
[unreleased]: https://github.com/suitenumerique/docs/compare/v4.8.2...main
[v4.8.2]: https://github.com/suitenumerique/docs/releases/v4.8.2
[v4.8.1]: https://github.com/suitenumerique/docs/releases/v4.8.1
[v4.8.0]: https://github.com/suitenumerique/docs/releases/v4.8.0
Makefile (2 lines changed)
@@ -260,7 +260,7 @@ demo: ## flush db then create a demo for load testing purpose
.PHONY: demo

index: ## index all documents to remote search
	@$(MANAGE) index $(args)
	@$(MANAGE) index
.PHONY: index

# Nota bene: Black should come after isort just in case they don't agree...
@@ -1,63 +0,0 @@
# Index Command

The `index` management command is used to index documents to the remote search indexer.

## Usage

### Make Command

```bash
# Basic usage with defaults
make index

# With custom parameters
make index args="--batch-size 100 --lower-time-bound 2024-01-01T00:00:00 --upper-time-bound 2026-01-01T00:00:00"
```

### Command line

```bash
python manage.py index \
  --lower-time-bound "2024-01-01T00:00:00" \
  --upper-time-bound "2024-01-31T23:59:59" \
  --batch-size 200 \
  --async
```

### Django Admin

The command is available in the Django admin interface:

1. Go to `/admin/run-indexing/` to reach the "Run Indexing Command" page
2. Fill in the form with the desired parameters
3. Click **"Run Indexing Command"**

## Parameters

### `--batch-size`
- **type:** Integer
- **default:** `settings.SEARCH_INDEXER_BATCH_SIZE`
- **description:** Number of documents to process per batch. Higher values may improve performance but use more memory.

### `--lower-time-bound`
- **optional:** true
- **type:** ISO 8601 datetime string
- **default:** `None`
- **description:** Only documents updated after this date will be indexed.

### `--upper-time-bound`
- **optional:** true
- **type:** ISO 8601 datetime string
- **default:** `None`
- **description:** Only documents updated before this date will be indexed.

### `--async`
- **type:** Boolean flag
- **default:** `False`
- **description:** When set, dispatches the indexing job to a Celery worker instead of running it synchronously.

## Crash Safe Mode

The command saves the `updated_at` of the last document of each successful batch into the `bulk-indexer-checkpoint` cache variable.
If the process crashes, this value can be used as `lower-time-bound` to resume from the last successfully indexed document.
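The checkpoint cycle described above can be sketched as follows. Only the `bulk-indexer-checkpoint` key name comes from the text; `fetch_batch` and `push_batch` are hypothetical stand-ins for the real queryset and push logic:

```python
# Sketch of the crash-safe checkpoint described above (assumptions:
# `fetch_batch` returns docs ordered by updated_at strictly after the bound,
# `push_batch` sends them to the search backend).
CHECKPOINT_KEY = "bulk-indexer-checkpoint"


def index_with_checkpoints(cache, fetch_batch, push_batch, batch_size=50):
    """Index in batches, saving the last document's updated_at after each one."""
    lower_bound = cache.get(CHECKPOINT_KEY)  # resume point after a crash
    while True:
        batch = fetch_batch(lower_bound, batch_size)
        if not batch:
            return
        push_batch(batch)
        # Persist the checkpoint only after the batch succeeded.
        lower_bound = batch[-1]["updated_at"]
        cache.set(CHECKPOINT_KEY, lower_bound)
```

If the process dies mid-run, the next invocation reads the checkpoint and skips everything already pushed.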
@@ -61,8 +61,8 @@ OIDC_RS_AUDIENCE_CLAIM="client_id" # The claim used to identify the audience
OIDC_RS_ALLOWED_AUDIENCES=""

# Store OIDC tokens in the session. Needed by search/ endpoint.
# OIDC_STORE_ACCESS_TOKEN=True
# OIDC_STORE_REFRESH_TOKEN=True # Store the encrypted refresh token in the session.
OIDC_STORE_ACCESS_TOKEN=True
OIDC_STORE_REFRESH_TOKEN=True # Store the encrypted refresh token in the session.

# Must be a valid Fernet key (32 url-safe base64-encoded bytes)
# To create one, use the bin/fernetkey command.
@@ -1,54 +1,15 @@
"""Admin classes and registrations for core app."""

from django.contrib import admin, messages
from django.contrib.admin.views.decorators import staff_member_required
from django.contrib.auth import admin as auth_admin
from django.core.management import call_command
from django.http import HttpRequest
from django.shortcuts import redirect, render
from django.shortcuts import redirect
from django.utils.translation import gettext_lazy as _

from treebeard.admin import TreeAdmin

from core import models
from core.forms import RunIndexingForm
from core.tasks.user_reconciliation import user_reconciliation_csv_import_job

# Customize the default admin site's get_app_list method
_original_get_app_list = admin.site.get_app_list


def custom_get_app_list(_self, request, app_label=None):
    """Add custom commands to the app list."""
    app_list = _original_get_app_list(request, app_label)

    # Add Commands app with Run Indexing command
    commands_app = {
        "name": _("Commands"),
        "app_label": "commands",
        "app_url": "#",
        "has_module_perms": True,
        "models": [
            {
                "name": _("Run indexing"),
                "object_name": "RunIndexing",
                "admin_url": "/admin/run-indexing/",
                "view_only": False,
                "add_url": None,
                "change_url": None,
            }
        ],
    }

    app_list.append(commands_app)
    return app_list


# Monkey-patch the admin site
admin.site.get_app_list = lambda request, app_label=None: custom_get_app_list(
    admin.site, request, app_label
)


@admin.register(models.User)
class UserAdmin(auth_admin.UserAdmin):
@@ -266,39 +227,3 @@ class InvitationAdmin(admin.ModelAdmin):
    def save_model(self, request, obj, form, change):
        obj.issuer = request.user
        obj.save()


@staff_member_required
def run_indexing_view(request: HttpRequest):
    """Custom admin view for running indexing commands."""
    if request.method == "POST":
        form = RunIndexingForm(request.POST)
        if form.is_valid():
            lower_time_bound = form.cleaned_data.get("lower_time_bound")
            upper_time_bound = form.cleaned_data.get("upper_time_bound")
            call_command(
                "index",
                batch_size=form.cleaned_data["batch_size"],
                lower_time_bound=lower_time_bound.isoformat()
                if lower_time_bound
                else None,
                upper_time_bound=upper_time_bound.isoformat()
                if upper_time_bound
                else None,
                async_mode=True,
            )
            messages.success(request, _("Indexing triggered!"))
            return redirect("run_indexing")
        messages.error(request, _("Please correct the errors below."))
    else:
        form = RunIndexingForm()

    return render(
        request=request,
        template_name="runindexing.html",
        context={
            **admin.site.each_context(request),
            "title": "Run Indexing Command",
            "form": form,
        },
    )
@@ -300,15 +300,6 @@ class DocumentSerializer(ListDocumentSerializer):

        return file

    def update(self, instance, validated_data):
        """
        When no data is sent on the update, skip the database update and
        return the instance unchanged.
        """
        if not validated_data:
            return instance  # No data provided, skip the update
        return super().update(instance, validated_data)

    def save(self, **kwargs):
        """
        Process the content field to extract attachment keys and update the document's
@@ -6,10 +6,8 @@ from abc import ABC, abstractmethod
from django.conf import settings
from django.core.cache import cache
from django.core.files.storage import default_storage
from django.utils.decorators import method_decorator

import botocore
from lasuite.oidc_login.decorators import refresh_oidc_access_token
from rest_framework.throttling import BaseThrottle

@@ -93,19 +91,6 @@ def generate_s3_authorization_headers(key):
    return request


def conditional_refresh_oidc_token(func):
    """
    Conditionally apply the refresh_oidc_access_token decorator.

    The decorator is only applied if OIDC_STORE_REFRESH_TOKEN is True, meaning
    we can actually refresh something. Broader settings checks are done in settings.py.
    """
    if settings.OIDC_STORE_REFRESH_TOKEN:
        return method_decorator(refresh_oidc_access_token)(func)

    return func


class AIBaseRateThrottle(BaseThrottle, ABC):
    """Base throttle class for AI-related rate limiting with backoff."""
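`conditional_refresh_oidc_token` applies a decorator only when a settings flag is on. A minimal standalone sketch of that pattern, with illustrative names rather than the real OIDC decorator:

```python
# Conditional decoration: wrap the function only when a flag is set,
# otherwise return it untouched. `FEATURE_ENABLED` and `audit` are
# illustrative assumptions, not names from the codebase.
FEATURE_ENABLED = True


def audit(func):
    """Example wrapper standing in for refresh_oidc_access_token."""
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper


def conditional_audit(func):
    """Apply `audit` only when FEATURE_ENABLED is True; else pass through."""
    if FEATURE_ENABLED:
        return audit(func)
    return func


@conditional_audit
def handler():
    return "ok"
```

Because the decision happens at import time, flipping the flag at runtime has no effect on already-decorated functions, which is why the real code checks a settings value fixed at startup.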
@@ -25,6 +25,7 @@ from django.db.models.functions import Greatest, Left, Length
from django.http import Http404, StreamingHttpResponse
from django.urls import reverse
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.utils.functional import cached_property
from django.utils.http import content_disposition_header
from django.utils.text import capfirst, slugify
@@ -37,6 +38,7 @@ from botocore.exceptions import ClientError
from csp.constants import NONE
from csp.decorators import csp_update
from lasuite.malware_detection import malware_detection
from lasuite.oidc_login.decorators import refresh_oidc_access_token
from lasuite.tools.email import get_domain_from_email
from pydantic import ValidationError as PydanticValidationError
from rest_framework import filters, status, viewsets
@@ -1413,7 +1415,7 @@ class DocumentViewSet(
        return duplicated_document

    @drf.decorators.action(detail=False, methods=["get"], url_path="search")
    @utils.conditional_refresh_oidc_token
    @method_decorator(refresh_oidc_access_token)
    def search(self, request, *args, **kwargs):
        """
        Returns an ordered list of documents best matching the search query parameter 'q'.
@@ -1424,6 +1426,7 @@ class DocumentViewSet(
        params = serializers.SearchDocumentSerializer(data=request.query_params)
        params.is_valid(raise_exception=True)
        search_type = self._get_search_type()

        if search_type == SearchType.TITLE:
            return self._title_search(request, params.validated_data, *args, **kwargs)
@@ -6,7 +6,6 @@ from django.conf import settings
from django.contrib.auth.hashers import make_password

import factory.fuzzy
from factory import post_generation
from faker import Faker

from core import models
@@ -160,20 +159,6 @@ class DocumentFactory(factory.django.DjangoModelFactory):
            document=self, user=item, defaults={"is_masked": True}
        )

    @post_generation
    def updated_at(self, create, extracted, **kwargs):
        """
        The BaseModel.updated_at field has auto_now=True, which prevents
        setting a specific updated_at value with the factory.

        This post_generation method bypasses this behavior.
        """
        if not create or not extracted:
            return

        self.__class__.objects.filter(pk=self.pk).update(updated_at=extracted)
        self.refresh_from_db()


class UserDocumentAccessFactory(factory.django.DjangoModelFactory):
    """Create fake document user accesses for testing."""
@@ -1,42 +0,0 @@
"""Forms for the core app."""

from django import forms
from django.conf import settings
from django.utils.translation import gettext_lazy as _


class RunIndexingForm(forms.Form):
    """Form for running the indexing process."""

    batch_size = forms.IntegerField(
        min_value=1,
        initial=settings.SEARCH_INDEXER_BATCH_SIZE,
    )
    lower_time_bound = forms.DateTimeField(
        required=False, widget=forms.TextInput(attrs={"type": "datetime-local"})
    )
    upper_time_bound = forms.DateTimeField(
        required=False, widget=forms.TextInput(attrs={"type": "datetime-local"})
    )

    def clean(self):
        """Override clean to validate time bounds."""
        cleaned_data = super().clean()
        self.check_time_bounds()
        return cleaned_data

    def check_time_bounds(self):
        """Validate that lower_time_bound is before upper_time_bound."""
        lower_time_bound = self.cleaned_data.get("lower_time_bound")
        upper_time_bound = self.cleaned_data.get("upper_time_bound")
        if (
            lower_time_bound
            and upper_time_bound
            and lower_time_bound > upper_time_bound
        ):
            self.add_error(
                "upper_time_bound",
                _("Upper time bound must be after lower time bound."),
            )
@@ -4,16 +4,12 @@ Handle search setup that needs to be done at bootstrap time.

import logging
import time
from datetime import datetime

from django.conf import settings
from django.core.management.base import BaseCommand, CommandError

from core import models
from core.services.search_indexers import get_document_indexer
from core.tasks.search import batch_document_indexer_task

logger = logging.getLogger(__name__)
logger = logging.getLogger("docs.search.bootstrap_search")


class Command(BaseCommand):
@@ -28,32 +24,9 @@ class Command(BaseCommand):
            action="store",
            dest="batch_size",
            type=int,
            default=settings.SEARCH_INDEXER_BATCH_SIZE,
            default=50,
            help="Indexation query batch size",
        )
        parser.add_argument(
            "--lower-time-bound",
            action="store",
            dest="lower_time_bound",
            type=datetime.fromisoformat,
            default=None,
            help="DateTime in ISO format. Only documents updated after this date will be indexed",
        )
        parser.add_argument(
            "--upper-time-bound",
            action="store",
            dest="upper_time_bound",
            type=datetime.fromisoformat,
            default=None,
            help="DateTime in ISO format. Only documents updated before this date will be indexed",
        )
        parser.add_argument(
            "--async",
            action="store_true",
            dest="async_mode",
            default=False,
            help="Whether to execute indexing asynchronously in a Celery task (default: False)",
        )

    def handle(self, *args, **options):
        """Launch and log search index generation."""
@@ -62,38 +35,18 @@ class Command(BaseCommand):
        if not indexer:
            raise CommandError("The indexer is not enabled or properly configured.")

        if options["async_mode"]:
            try:
                batch_document_indexer_task.apply_async(
                    kwargs={
                        "lower_time_bound": options["lower_time_bound"],
                        "upper_time_bound": options["upper_time_bound"],
                        "batch_size": options["batch_size"],
                        "crash_safe_mode": True,
                    },
                )
            except Exception as err:
                raise CommandError("Unable to dispatch indexing task") from err
            logger.info("Document indexing task sent to worker")
        else:
            logger.info("Starting to regenerate Find index...")
            start = time.perf_counter()
        logger.info("Starting to regenerate Find index...")
        start = time.perf_counter()
        batch_size = options["batch_size"]

            try:
                count = indexer.index(
                    queryset=models.Document.objects.filter_updated_at(
                        lower_time_bound=options["lower_time_bound"],
                        upper_time_bound=options["upper_time_bound"],
                    ),
                    batch_size=options["batch_size"],
                    crash_safe_mode=True,
                )
            except Exception as err:
                raise CommandError("Unable to regenerate index") from err
        try:
            count = indexer.index(batch_size=batch_size)
        except Exception as err:
            raise CommandError("Unable to regenerate index") from err

            duration = time.perf_counter() - start
            logger.info(
                "Search index regenerated from %d document(s) in %.2f seconds.",
                count,
                duration,
            )
        duration = time.perf_counter() - start
        logger.info(
            "Search index regenerated from %d document(s) in %.2f seconds.",
            count,
            duration,
        )
@@ -859,32 +859,6 @@ class DocumentQuerySet(MP_NodeQuerySet):
            user_roles=models.Value([], output_field=output_field),
        )

    def filter_updated_at(self, lower_time_bound=None, upper_time_bound=None):
        """
        Filter documents by updated_at.

        Args:
            lower_time_bound (datetime, optional):
                Keep documents updated after this timestamp.
            upper_time_bound (datetime, optional):
                Keep documents updated before this timestamp.

        Returns:
            QuerySet: Filtered queryset ready for indexation.
        """
        conditions = models.Q()
        if lower_time_bound and upper_time_bound:
            conditions = models.Q(
                updated_at__gte=lower_time_bound,
                updated_at__lte=upper_time_bound,
            )
        elif lower_time_bound:
            conditions = models.Q(updated_at__gte=lower_time_bound)
        elif upper_time_bound:
            conditions = models.Q(updated_at__lte=upper_time_bound)

        return self.filter(conditions)


class DocumentManager(MP_NodeManager.from_queryset(DocumentQuerySet)):
    """
@@ -1485,16 +1459,13 @@ class Document(MP_Node, BaseModel):
            .first()
        )
        self.ancestors_deleted_at = ancestors_deleted_at
        self.save(update_fields=["deleted_at", "ancestors_deleted_at", "updated_at"])
        self.save(update_fields=["deleted_at", "ancestors_deleted_at"])
        self.invalidate_nb_accesses_cache()

        self.get_descendants().exclude(
            models.Q(deleted_at__isnull=False)
            | models.Q(ancestors_deleted_at__lt=current_deleted_at)
        ).update(
            ancestors_deleted_at=self.ancestors_deleted_at,
            updated_at=self.updated_at,
        )
        ).update(ancestors_deleted_at=self.ancestors_deleted_at)

        if self.depth > 1:
            self._meta.model.objects.filter(pk=self.get_parent().pk).update(
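The bound-combination logic of `filter_updated_at` (both bounds, one bound, or no filter at all) can be mirrored outside the ORM as a plain predicate. This is a sketch of the semantics, not the actual queryset method:

```python
# Pure-Python analogue of filter_updated_at: build a predicate from two
# optional datetime bounds; with no bounds everything passes (empty Q()).
from datetime import datetime


def updated_at_predicate(lower_time_bound=None, upper_time_bound=None):
    """Keep items updated within [lower_time_bound, upper_time_bound]."""
    def predicate(updated_at):
        if lower_time_bound and updated_at < lower_time_bound:
            return False  # mirrors updated_at__gte
        if upper_time_bound and updated_at > upper_time_bound:
            return False  # mirrors updated_at__lte
        return True
    return predicate
```

Both bounds are inclusive, matching the `__gte`/`__lte` lookups in the queryset version.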
@@ -1,6 +1,5 @@
"""Document search index management utilities and indexers"""

import itertools
import logging
from abc import ABC, abstractmethod
from collections import defaultdict
@@ -126,44 +125,44 @@ class BaseDocumentIndexer(ABC):
        if not self.search_url:
            raise ImproperlyConfigured("SEARCH_URL must be set in Django settings.")

    def index(self, queryset, batch_size=None, crash_safe_mode=False):
    def index(self, queryset=None, batch_size=None):
        """
        Fetch documents in batches, serialize them, and push to the search backend.

        Args:
            queryset: Document queryset
            queryset (optional): Document queryset.
                Defaults to all documents without filter.
            batch_size (int, optional): Number of documents per batch.
                Defaults to settings.SEARCH_INDEXER_BATCH_SIZE.
            crash_safe_mode (bool, optional): If True, order documents by updated_at.
                This allows resuming indexing from the last successful batch in case of a crash,
                but is more computationally expensive due to sorting.
        """
        last_id = 0
        count = 0
        queryset = queryset or models.Document.objects.all()
        batch_size = batch_size or self.batch_size

        if crash_safe_mode:
            queryset = queryset.order_by("updated_at")
        while True:
            documents_batch = list(
                queryset.filter(
                    id__gt=last_id,
                ).order_by("id")[:batch_size]
            )

            if not documents_batch:
                break

        for documents_batch in itertools.batched(queryset.iterator(), batch_size):
            doc_paths = [doc.path for doc in documents_batch]
            last_id = documents_batch[-1].id
            accesses_by_document_path = get_batch_accesses_by_users_and_teams(doc_paths)

            serialized_batch = [
                self.serialize_document(document, accesses_by_document_path)
                for document in documents_batch
                if document.content or document.title
            ]

            if not serialized_batch:
                continue

            self.push(serialized_batch)
            count += len(serialized_batch)

            if crash_safe_mode:
                logger.info(
                    "Indexing checkpoint: %s.",
                    serialized_batch[-1]["updated_at"],
                )
            if serialized_batch:
                self.push(serialized_batch)
                count += len(serialized_batch)

        return count
@@ -4,6 +4,7 @@ from logging import getLogger

from django.conf import settings
from django.core.cache import cache
from django.db.models import Q

from django_redis.cache import RedisCache

@@ -19,12 +20,7 @@ logger = getLogger(__file__)

@app.task
def document_indexer_task(document_id):
    """
    Celery Task: Indexes a single document by its ID.

    Args:
        document_id: Primary key of the document to index.
    """
    """Celery Task: Sends indexation query for a document."""
    indexer = get_document_indexer()

    if indexer:
@@ -34,17 +30,8 @@ def document_indexer_task(document_id):

def batch_indexer_throttle_acquire(timeout: int = 0, atomic: bool = True):
    """
    Acquire a throttle lock to prevent multiple batch indexation tasks during countdown.

    Implements a debouncing pattern: only the first call during the timeout period
    will succeed; subsequent calls are skipped until the timeout expires.

    Args:
        timeout (int): Lock duration in seconds (countdown period).
        atomic (bool): Use Redis locks for atomic operations if available.

    Returns:
        bool: True if lock acquired (first call), False if already held (subsequent calls).
    Enable the task throttle flag for a delay.
    Uses redis locks if available to ensure atomic changes.
    """
    key = "document-batch-indexer-throttle"

@@ -54,65 +41,44 @@ def batch_indexer_throttle_acquire(timeout: int = 0, atomic: bool = True):
        with cache.locks(key):
            return batch_indexer_throttle_acquire(timeout, atomic=False)

    # cache.add() is an atomic test-and-set operation:
    # - If the key doesn't exist: creates it with timeout and returns True
    # - If the key already exists: does nothing and returns False
    # The key expires after timeout seconds, releasing the lock.
    # The value 1 is irrelevant; only the key's presence/absence matters.
    # Use add() here:
    # - sets the flag and returns True if it does not exist
    # - does nothing and returns False if it exists
    return cache.add(key, 1, timeout=timeout)


@app.task
def batch_document_indexer_task(lower_time_bound=None, upper_time_bound=None, **kwargs):
    """
    Celery Task: Batch indexes all documents modified since timestamp.

    Args:
        lower_time_bound (datetime, optional):
            indexes documents updated or deleted after this timestamp.
        upper_time_bound (datetime, optional):
            indexes documents updated or deleted before this timestamp.
    """
def batch_document_indexer_task(timestamp):
    """Celery Task: Sends indexation query for a batch of documents."""
    indexer = get_document_indexer()

    if not indexer:
        logger.warning("Indexing task triggered but no indexer configured: skipping")
        return
    if indexer:
        queryset = models.Document.objects.filter(
            Q(updated_at__gte=timestamp)
            | Q(deleted_at__gte=timestamp)
            | Q(ancestors_deleted_at__gte=timestamp)
        )

    count = indexer.index(
        queryset=models.Document.objects.filter_updated_at(
            lower_time_bound=lower_time_bound, upper_time_bound=upper_time_bound
        ),
        **kwargs,
    )
    logger.info("Indexed %d documents", count)
        count = indexer.index(queryset)
        logger.info("Indexed %d documents", count)


def trigger_batch_document_indexer(document):
    """
    Trigger document indexation with optional debounce mechanism.

    Behavior depends on the SEARCH_INDEXER_COUNTDOWN setting:
    - if countdown > 0 sec (async batch mode):
        * schedules a batch indexation task after countdown seconds
        * uses a throttle mechanism to ensure only ONE batch task runs per countdown period
        * all documents modified since the first trigger are indexed together
    - if countdown == 0 sec (sync mode):
        * executes indexation synchronously in the current thread
        * no batching, no throttling, no Celery task queuing
    Trigger indexation task with a debounce delay set by the SEARCH_INDEXER_COUNTDOWN setting.

    Args:
        document (Document): the document instance that triggered the indexation.
        document (Document): The document instance.
    """
    countdown = int(settings.SEARCH_INDEXER_COUNTDOWN)

    # DO NOT create a task if indexation is disabled
    # DO NOT create a task if indexation if disabled
    if not settings.SEARCH_INDEXER_CLASS:
        return

    if countdown > 0:
        # Use throttle to ensure only one task is scheduled per countdown period:
        # if the throttle is acquired, schedule the batch task; otherwise skip.
        # Each time this method is called during a countdown, we increment the
        # counter and each task decreases it, so the index runs only once.
        if batch_indexer_throttle_acquire(timeout=countdown):
            logger.info(
                "Add task for batch document indexation from updated_at=%s in %d seconds",
@@ -121,7 +87,7 @@ def trigger_batch_document_indexer(document):
            )

            batch_document_indexer_task.apply_async(
                kwargs={"lower_time_bound": document.updated_at}, countdown=countdown
                args=[document.updated_at], countdown=countdown
            )
        else:
            logger.info("Skip task for batch document %s indexation", document.pk)
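`batch_indexer_throttle_acquire` relies on `cache.add()` being an atomic test-and-set: only the first caller within the timeout window gets `True`, so only one batch task is scheduled per countdown period. A self-contained sketch of that debounce, with a dict-backed stand-in for the Django cache:

```python
# Debounce via add(): set the key only if absent (or expired); the return
# value tells the caller whether it was first. FakeCache is an assumption
# standing in for the real Django cache backend.
import time


class FakeCache:
    def __init__(self):
        self._expiry = {}

    def add(self, key, value, timeout):
        """Atomic test-and-set: True on first set, False while key is live."""
        now = time.monotonic()
        expires = self._expiry.get(key)
        if expires is not None and expires > now:
            return False  # key still held: debounced
        self._expiry[key] = now + timeout
        return True


cache = FakeCache()
first = cache.add("document-batch-indexer-throttle", 1, timeout=60)   # acquires
second = cache.add("document-batch-indexer-throttle", 1, timeout=60)  # debounced
```

When the timeout elapses the key expires on its own, so the next trigger acquires again and schedules a fresh batch task.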
@@ -1,22 +0,0 @@
{% extends "admin/base_site.html" %}
{% load i18n %}

{% block content %}

<form method="POST">
    {% csrf_token %}

    <hr style="margin-bottom: 10px;">

    <p>
        {% translate "This command triggers the indexing of all documents within the specified time bound." %}
    </p>

    <hr style="margin-bottom: 20px;">

    {{ form.as_p }}

    <input type="submit" value="{% translate 'Run Indexing' %}" style="margin-top: 20px;">
</form>

{% endblock %}
@@ -2,25 +2,21 @@
|
||||
Unit test for `index` command.
|
||||
"""
|
||||
|
||||
import logging
|
||||
from datetime import datetime, timedelta, timezone
|
||||
from operator import itemgetter
|
||||
from unittest import mock
|
||||
|
||||
from django.core.cache import cache
|
||||
from django.core.management import CommandError, call_command
|
||||
from django.db import transaction
|
||||
|
||||
import pytest
|
||||
|
||||
from core import factories
|
||||
from core.factories import DocumentFactory
|
||||
from core.services.search_indexers import FindDocumentIndexer
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
@pytest.mark.usefixtures("indexer_settings")
|
||||
def test_index_without_bound_success():
|
||||
def test_index():
|
||||
"""Test the command `index` that run the Find app indexer for all the available documents."""
|
||||
user = factories.UserFactory()
|
||||
indexer = FindDocumentIndexer()
|
||||
@@ -43,152 +39,18 @@ def test_index_without_bound_success():
|
||||
with mock.patch.object(FindDocumentIndexer, "push") as mock_push:
|
||||
call_command("index")
|
||||
|
||||
push_call_args = [call.args[0] for call in mock_push.call_args_list]
|
||||
push_call_args = [call.args[0] for call in mock_push.call_args_list]
|
||||
|
||||
# called once but with a batch of docs
|
||||
mock_push.assert_called_once()
|
||||
# called once but with a batch of docs
|
||||
mock_push.assert_called_once()
|
||||
|
||||
assert sorted(push_call_args[0], key=itemgetter("id")) == sorted(
|
||||
[
|
||||
indexer.serialize_document(doc, accesses),
|
||||
indexer.serialize_document(no_title_doc, accesses),
|
||||
],
|
||||
key=itemgetter("id"),
|
||||
)
|
||||
|
||||
|
||||
@pytest.mark.django_db
@pytest.mark.usefixtures("indexer_settings")
def test_index_with_both_bounds_success():
    """Test the command `index` for all documents within the time bounds."""
    cache.clear()
    lower_time_bound = datetime(2024, 2, 1, tzinfo=timezone.utc)
    upper_time_bound = lower_time_bound + timedelta(days=30)

    document_too_early = DocumentFactory(
        updated_at=lower_time_bound - timedelta(days=10)
    )
    document_in_window_1 = DocumentFactory(
        updated_at=lower_time_bound + timedelta(days=5)
    )
    document_in_window_2 = DocumentFactory(
        updated_at=lower_time_bound + timedelta(days=15)
    )
    document_too_late = DocumentFactory(
        updated_at=upper_time_bound + timedelta(days=10)
    )

    with mock.patch.object(FindDocumentIndexer, "push") as mock_push:
        call_command(
            "index",
            lower_time_bound=lower_time_bound.isoformat(),
            upper_time_bound=upper_time_bound.isoformat(),
        )

    pushed_document_ids = [
        document["id"]
        for call_arg_list in mock_push.call_args_list
        for document in call_arg_list.args[0]
    ]

    # Only documents in the window should be indexed
    assert str(document_too_early.id) not in pushed_document_ids
    assert str(document_in_window_1.id) in pushed_document_ids
    assert str(document_in_window_2.id) in pushed_document_ids
    assert str(document_too_late.id) not in pushed_document_ids


@pytest.mark.django_db
@pytest.mark.usefixtures("indexer_settings")
def test_index_with_crash_recovery(caplog_with_propagate):
    """Test resuming indexing from a checkpoint after a crash."""
    cache.clear()
    lower_time_bound = datetime(2024, 2, 1, tzinfo=timezone.utc)
    upper_time_bound = lower_time_bound + timedelta(days=60)

    batch_size = 2
    documents = [
        # batch 0
        factories.DocumentFactory(updated_at=lower_time_bound + timedelta(days=5)),
        factories.DocumentFactory(updated_at=lower_time_bound + timedelta(days=10)),
        # batch 1
        factories.DocumentFactory(updated_at=lower_time_bound + timedelta(days=20)),
        factories.DocumentFactory(updated_at=lower_time_bound + timedelta(days=25)),
        # batch 2 - will crash here
        factories.DocumentFactory(updated_at=lower_time_bound + timedelta(days=30)),
        factories.DocumentFactory(updated_at=lower_time_bound + timedelta(days=35)),
        # batch 3
        factories.DocumentFactory(updated_at=lower_time_bound + timedelta(days=40)),
        factories.DocumentFactory(updated_at=lower_time_bound + timedelta(days=45)),
    ]

    def push_with_failure_on_batch_2(data):
        # Crash when encountering the document at index 4 (batch 2 with batch_size=2)
        if str(documents[4].id) in [document["id"] for document in data]:
            raise ConnectionError("Simulated indexing error")

    # First run: simulate a crash on batch 2
    with mock.patch.object(FindDocumentIndexer, "push") as mock_push:
        mock_push.side_effect = push_with_failure_on_batch_2
        with pytest.raises(CommandError):
            with caplog_with_propagate.at_level(logging.INFO):
                call_command(
                    "index",
                    batch_size=batch_size,
                    lower_time_bound=lower_time_bound.isoformat(),
                    upper_time_bound=upper_time_bound.isoformat(),
                )
    pushed_document_ids = [
        document["id"]
        for call_arg_list in mock_push.call_args_list
        for document in call_arg_list.args[0]
    ]

    # the updated_at of the last document of each batch is logged as a checkpoint
    # -> documents[3].updated_at is the most advanced checkpoint
    for i in [1, 3]:
        assert any(
            f"Indexing checkpoint: {documents[i].updated_at.isoformat()}." in message
            for message in caplog_with_propagate.messages
        )
    for i in [0, 2, 4, 5, 6]:
        assert not any(
            f"Indexing checkpoint: {documents[i].updated_at.isoformat()}" in message
            for message in caplog_with_propagate.messages
        )
    # first 2 batches should be indexed successfully
    for i in range(0, 4):
        assert str(documents[i].id) in pushed_document_ids
    # next batch should have been attempted but failed
    for i in range(4, 6):
        assert str(documents[i].id) in pushed_document_ids
    # last batches should not have been attempted
    for i in range(6, 8):
        assert str(documents[i].id) not in pushed_document_ids

    # Second run: resume from the checkpoint
    with mock.patch.object(FindDocumentIndexer, "push") as mock_push:
        call_command(
            "index",
            batch_size=batch_size,
            lower_time_bound=documents[3].updated_at,
            upper_time_bound=upper_time_bound.isoformat(),
        )
    pushed_document_ids = [
        document["id"]
        for call_arg_list in mock_push.call_args_list
        for document in call_arg_list.args[0]
    ]

    # first 2 batches should NOT be re-indexed,
    # except the last document of the last batch, which is on the checkpoint boundary
    # -> docs 0, 1 and 2
    for i in range(0, 3):
        assert str(documents[i].id) not in pushed_document_ids
    # next batches should be indexed, including the document at the checkpoint boundary
    # which has already been indexed and is re-indexed
    # -> docs 3 to the end
    for i in range(3, 8):
        assert str(documents[i].id) in pushed_document_ids
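The checkpoint/resume behaviour this test exercises can be sketched outside Django. This is an illustrative model only: `index_batches`, the dict-shaped documents, and the `push` callable are hypothetical stand-ins, not the project's real indexer API. It shows why the document sitting on the checkpoint boundary is re-indexed: the resume filter is inclusive (`>=`).

```python
# Hypothetical sketch of the checkpoint pattern tested above; the names are
# illustrative, not the project's actual implementation.
def index_batches(documents, batch_size, push, start_from=None):
    """Push documents in batches and return the last successful checkpoint."""
    if start_from is not None:
        # ">=" re-includes the document on the checkpoint boundary, matching
        # the re-indexing observed in the test above.
        documents = [d for d in documents if d["updated_at"] >= start_from]
    checkpoint = start_from
    for i in range(0, len(documents), batch_size):
        batch = documents[i : i + batch_size]
        push(batch)  # may raise; checkpoint then keeps the last good batch
        checkpoint = batch[-1]["updated_at"]
    return checkpoint
```

Under this sketch, a crash in batch 2 leaves the checkpoint at the last document of batch 1, and a resume from that value re-pushes the boundary document plus everything after it.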


@pytest.mark.django_db
@@ -201,57 +63,3 @@ def test_index_improperly_configured(indexer_settings):
        call_command("index")

    assert str(err.value) == "The indexer is not enabled or properly configured."


@pytest.mark.django_db
@pytest.mark.usefixtures("indexer_settings")
def test_index_with_async_flag(settings):
    """Test that the command `index` with --async=True runs the task asynchronously."""
    cache.clear()
    lower_time_bound = datetime(2024, 2, 1, tzinfo=timezone.utc)

    with mock.patch(
        "core.management.commands.index.batch_document_indexer_task"
    ) as mock_task:
        with mock.patch.object(FindDocumentIndexer, "push") as mock_push:
            call_command(
                "index", async_mode=True, lower_time_bound=lower_time_bound.isoformat()
            )
    # push should not be called synchronously
    mock_push.assert_not_called()
    # the task is called asynchronously
    mock_task.apply_async.assert_called_once_with(
        kwargs={
            "lower_time_bound": lower_time_bound.isoformat(),
            "upper_time_bound": None,
            "batch_size": settings.SEARCH_INDEXER_BATCH_SIZE,
            "crash_safe_mode": True,
        }
    )


@pytest.mark.django_db
@pytest.mark.usefixtures("indexer_settings")
def test_index_without_async_flag():
    """Test that the command `index` with --async=False runs synchronously."""
    cache.clear()
    lower_time_bound = datetime(2024, 2, 1, tzinfo=timezone.utc)

    document = DocumentFactory(updated_at=lower_time_bound + timedelta(days=10))

    with mock.patch(
        "core.management.commands.index.batch_document_indexer_task"
    ) as mock_task:
        with mock.patch.object(FindDocumentIndexer, "push") as mock_push:
            call_command(
                "index", async_mode=False, lower_time_bound=lower_time_bound.isoformat()
            )
    # push is called synchronously to index the document
    pushed_document_ids = [
        document["id"]
        for call_arg_list in mock_push.call_args_list
        for document in call_arg_list.args[0]
    ]
    assert str(document.id) in pushed_document_ids
    # the async task is not called
    mock_task.apply_async.assert_not_called()
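The sync/async split these two tests verify reduces to a small dispatch rule: either enqueue a Celery-style task or run the indexing inline. A minimal sketch, with `dispatch_index`, `run_sync`, and `task` as hypothetical stand-ins for the command internals:

```python
# Minimal sketch of the dispatch the two tests above assert on; `run_sync`
# and `task` are stand-ins, not the project's real API.
def dispatch_index(run_sync, task, async_mode, **options):
    if async_mode:
        # Hand the whole option set to the background task.
        task.apply_async(kwargs=options)
    else:
        # Index inline, in the calling process.
        run_sync(**options)
```

This mirrors what the tests assert: with `async_mode=True` only `apply_async` fires; with `async_mode=False` the indexing runs in-process and the task is never enqueued.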


@@ -1,7 +1,6 @@
"""Fixtures for tests in the impress core application"""

import base64
import logging
from unittest import mock

from django.core.cache import cache
@@ -23,30 +22,6 @@ def clear_cache():
    cache.clear()


@pytest.fixture
def caplog_with_propagate(settings, caplog):
    """
    Loggers configured with propagate=False in settings.LOGGING prevent
    caplog from capturing their logs.

    This fixture enables propagation on all configured loggers.
    """
    # Save the original propagate state
    original_propagate = {}

    for logger_name in settings.LOGGING.get("loggers", {}):
        logger = logging.getLogger(logger_name)
        original_propagate[logger_name] = logger.propagate
        logger.propagate = True

    try:
        yield caplog
    finally:
        # Restore the original propagate states
        for logger_name, original_value in original_propagate.items():
            logging.getLogger(logger_name).propagate = original_value
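Why the fixture toggles `propagate`: pytest's `caplog` captures through a handler attached at the root logger, so a logger configured with `propagate=False` never reaches it. A standalone illustration (the `demo.indexer` logger name and the hand-rolled handler are arbitrary stand-ins for caplog's capture handler):

```python
import logging

# A handler standing in for the one caplog attaches to the root logger.
records = []
root_handler = logging.Handler()
root_handler.emit = lambda record: records.append(record.getMessage())
logging.getLogger().addHandler(root_handler)

logger = logging.getLogger("demo.indexer")
logger.propagate = False
logger.warning("hidden")   # never propagated to the root handler

logger.propagate = True    # what caplog_with_propagate does for each logger
logger.warning("visible")  # now reaches the root handler
```

Only the record emitted after propagation was re-enabled lands in `records`, which is exactly the situation the fixture works around.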


@pytest.fixture
def mock_user_teams():
    """Mock for the "teams" property on the User model."""

@@ -96,9 +96,8 @@ def test_api_documents_delete_authenticated_owner_of_ancestor(depth):
    )
    assert models.Document.objects.count() == depth

    document_to_delete = documents[-1]
    response = client.delete(
        f"/api/v1.0/documents/{document_to_delete.id}/",
    )

    assert response.status_code == 204
@@ -106,11 +105,7 @@ def test_api_documents_delete_authenticated_owner_of_ancestor(depth):
    # Make sure it is only a soft delete
    assert models.Document.objects.count() == depth
    assert models.Document.objects.filter(deleted_at__isnull=True).count() == depth - 1
    deleted_documents = models.Document.objects.filter(deleted_at__isnull=False)
    assert len(deleted_documents) == 1
    deleted_document = deleted_documents[0]
    # updated_at is updated by the soft delete
    assert deleted_document.updated_at > document_to_delete.updated_at


@pytest.mark.parametrize("via", VIA)
@@ -91,15 +91,11 @@ def test_api_documents_restore_authenticated_owner_ancestor_deleted():
    factories.UserDocumentAccessFactory(document=document, user=user, role="owner")

    document.soft_delete()
    document.refresh_from_db()
    document_deleted_at = document.deleted_at
    document_updated_at = document.updated_at

    assert document_deleted_at is not None

    grand_parent.soft_delete()
    grand_parent_deleted_at = grand_parent.deleted_at

    assert grand_parent_deleted_at is not None

    response = client.post(f"/api/v1.0/documents/{document.id!s}/restore/")
@@ -109,8 +105,6 @@ def test_api_documents_restore_authenticated_owner_ancestor_deleted():

    document.refresh_from_db()
    assert document.deleted_at is None
    # the document is updated by the restore
    assert document.updated_at > document_updated_at
    # ancestors are still marked as deleted
    assert document.ancestors_deleted_at == grand_parent_deleted_at
    assert grand_parent_deleted_at > document_deleted_at

@@ -70,20 +70,17 @@ def test_api_documents_search_anonymous(search_query, indexer_settings):


@mock.patch("core.api.viewsets.DocumentViewSet.list")
def test_api_documents_search_fall_back_on_search_list(mock_list, indexer_settings):
    """
    When indexer is not configured and no path is provided,
    should fall back on list method
    """
    indexer_settings.SEARCH_URL = None
    assert get_document_indexer() is None

    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    mocked_response = {
        "count": 0,
@@ -96,8 +93,6 @@ def test_api_documents_search_fall_back_on_search_list(mock_list, settings):
    q = "alpha"
    response = client.get("/api/v1.0/documents/search/", data={"q": q})

    assert response.status_code == 200

    assert mock_list.call_count == 1
    assert mock_list.call_args[0][0].GET.get("q") == q
    assert response.json() == mocked_response
@@ -105,21 +100,18 @@ def test_api_documents_search_fall_back_on_search_list(mock_list, settings):

@mock.patch("core.api.viewsets.DocumentViewSet._list_descendants")
def test_api_documents_search_fallback_on_search_list_sub_docs(
    mock_list_descendants, indexer_settings
):
    """
    When indexer is not configured and path parameter is provided,
    should call _list_descendants() method
    """
    indexer_settings.SEARCH_URL = "http://find/api/v1.0/search"
    assert get_document_indexer() is not None

    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    parent = factories.DocumentFactory(title="parent", users=[user])

@@ -136,9 +128,9 @@ def test_api_documents_search_fallback_on_search_list_sub_docs(
        "/api/v1.0/documents/search/", data={"q": q, "path": parent.path}
    )

    assert mock_list_descendants.call_count == 1
    assert mock_list_descendants.call_args[0][0].GET.get("q") == q
    assert mock_list_descendants.call_args[0][0].GET.get("path") == parent.path
    assert response.json() == mocked_response


@@ -160,9 +152,7 @@ def test_api_documents_search_indexer_crashes(mock_title_search, indexer_setting

    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    mocked_response = {
        "count": 0,
@@ -195,9 +185,7 @@ def test_api_documents_search_invalid_params(indexer_settings):

    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    response = client.get("/api/v1.0/documents/search/")

@@ -1,10 +1,8 @@
"""
Tests for Documents API endpoint in impress's core app: update
"""
# pylint: disable=too-many-lines

import random
from unittest.mock import patch

from django.contrib.auth.models import AnonymousUser
from django.core.cache import cache
@@ -19,25 +17,6 @@ from core.tests.conftest import TEAM, USER, VIA

pytestmark = pytest.mark.django_db

# A valid Yjs document derived from YDOC_HELLO_WORLD_BASE64 with "Hello" replaced by "World",
# used in PATCH tests to guarantee a real content change distinct from what DocumentFactory
# produces.
YDOC_UPDATED_CONTENT_BASE64 = (
    "AR717vLVDgAHAQ5kb2N1bWVudC1zdG9yZQMKYmxvY2tHcm91cAcA9e7y1Q4AAw5ibG9ja0NvbnRh"
    "aW5lcgcA9e7y1Q4BAwdoZWFkaW5nBwD17vLVDgIGBgD17vLVDgMGaXRhbGljAnt9hPXu8tUOBAVX"
    "b3JsZIb17vLVDgkGaXRhbGljBG51bGwoAPXu8tUOAg10ZXh0QWxpZ25tZW50AXcEbGVmdCgA9e7y"
    "1Q4CBWxldmVsAX0BKAD17vLVDgECaWQBdyQwNGQ2MjM0MS04MzI2LTQyMzYtYTA4My00ODdlMjZm"
    "YWQyMzAoAPXu8tUOAQl0ZXh0Q29sb3IBdwdkZWZhdWx0KAD17vLVDgEPYmFja2dyb3VuZENvbG9y"
    "AXcHZGVmYXVsdIf17vLVDgEDDmJsb2NrQ29udGFpbmVyBwD17vLVDhADDmJ1bGxldExpc3RJdGVt"
    "BwD17vLVDhEGBAD17vLVDhIBd4b17vLVDhMEYm9sZAJ7fYT17vLVDhQCb3KG9e7y1Q4WBGJvbGQE"
    "bnVsbIT17vLVDhcCbGQoAPXu8tUOEQ10ZXh0QWxpZ25tZW50AXcEbGVmdCgA9e7y1Q4QAmlkAXck"
    "ZDM1MWUwNjgtM2U1NS00MjI2LThlYTUtYWJiMjYzMTk4ZTJhKAD17vLVDhAJdGV4dENvbG9yAXcH"
    "ZGVmYXVsdCgA9e7y1Q4QD2JhY2tncm91bmRDb2xvcgF3B2RlZmF1bHSH9e7y1Q4QAw5ibG9ja0Nv"
    "bnRhaW5lcgcA9e7y1Q4eAwlwYXJhZ3JhcGgoAPXu8tUOHw10ZXh0QWxpZ25tZW50AXcEbGVmdCgA"
    "9e7y1Q4eAmlkAXckODk3MDBjMDctZTBlMS00ZmUwLWFjYTItODQ5MzIwOWE3ZTQyKAD17vLVDh4J"
    "dGV4dENvbG9yAXcHZGVmYXVsdCgA9e7y1Q4eD2JhY2tncm91bmRDb2xvcgF3B2RlZmF1bHQA"
)


@pytest.mark.parametrize("via_parent", [True, False])
@pytest.mark.parametrize(
@@ -351,7 +330,6 @@ def test_api_documents_update_authenticated_no_websocket(settings):
    ws_resp = responses.get(endpoint_url, json={"count": 0, "exists": False})

    assert cache.get(f"docs:no-websocket:{document.id}") is None
    old_path = document.path

    response = client.put(
        f"/api/v1.0/documents/{document.id!s}/",
@@ -360,8 +338,6 @@ def test_api_documents_update_authenticated_no_websocket(settings):
    )
    assert response.status_code == 200

    document.refresh_from_db()
    assert document.path == old_path
    assert cache.get(f"docs:no-websocket:{document.id}") == session_key
    assert ws_resp.call_count == 1


@@ -470,7 +446,6 @@ def test_api_documents_update_user_connected_to_websocket(settings):
    ws_resp = responses.get(endpoint_url, json={"count": 3, "exists": True})

    assert cache.get(f"docs:no-websocket:{document.id}") is None
    old_path = document.path

    response = client.put(
        f"/api/v1.0/documents/{document.id!s}/",
@@ -478,9 +453,6 @@ def test_api_documents_update_user_connected_to_websocket(settings):
        format="json",
    )
    assert response.status_code == 200

    document.refresh_from_db()
    assert document.path == old_path
    assert cache.get(f"docs:no-websocket:{document.id}") is None
    assert ws_resp.call_count == 1


@@ -514,7 +486,6 @@ def test_api_documents_update_websocket_server_unreachable_fallback_to_no_websoc
    ws_resp = responses.get(endpoint_url, status=500)

    assert cache.get(f"docs:no-websocket:{document.id}") is None
    old_path = document.path

    response = client.put(
        f"/api/v1.0/documents/{document.id!s}/",
@@ -523,8 +494,6 @@ def test_api_documents_update_websocket_server_unreachable_fallback_to_no_websoc
    )
    assert response.status_code == 200

    document.refresh_from_db()
    assert document.path == old_path
    assert cache.get(f"docs:no-websocket:{document.id}") == session_key
    assert ws_resp.call_count == 1


@@ -636,7 +605,6 @@ def test_api_documents_update_force_websocket_param_to_true(settings):
    ws_resp = responses.get(endpoint_url, status=500)

    assert cache.get(f"docs:no-websocket:{document.id}") is None
    old_path = document.path

    response = client.put(
        f"/api/v1.0/documents/{document.id!s}/",
@@ -645,8 +613,6 @@ def test_api_documents_update_force_websocket_param_to_true(settings):
    )
    assert response.status_code == 200

    document.refresh_from_db()
    assert document.path == old_path
    assert cache.get(f"docs:no-websocket:{document.id}") is None
    assert ws_resp.call_count == 0


@@ -677,7 +643,6 @@ def test_api_documents_update_feature_flag_disabled(settings):
    ws_resp = responses.get(endpoint_url, status=500)

    assert cache.get(f"docs:no-websocket:{document.id}") is None
    old_path = document.path

    response = client.put(
        f"/api/v1.0/documents/{document.id!s}/",
@@ -686,8 +651,6 @@ def test_api_documents_update_feature_flag_disabled(settings):
    )
    assert response.status_code == 200

    document.refresh_from_db()
    assert document.path == old_path
    assert cache.get(f"docs:no-websocket:{document.id}") is None
    assert ws_resp.call_count == 0

@@ -753,724 +716,3 @@ def test_api_documents_update_invalid_content():
    )
    assert response.status_code == 400
    assert response.json() == {"content": ["Invalid base64 content."]}


# =============================================================================
# PATCH tests
# =============================================================================


@pytest.mark.parametrize("via_parent", [True, False])
@pytest.mark.parametrize(
    "reach, role",
    [
        ("restricted", "reader"),
        ("restricted", "editor"),
        ("authenticated", "reader"),
        ("authenticated", "editor"),
        ("public", "reader"),
    ],
)
def test_api_documents_patch_anonymous_forbidden(reach, role, via_parent):
    """
    Anonymous users should not be allowed to patch a document when the link
    configuration does not allow it.
    """
    if via_parent:
        grand_parent = factories.DocumentFactory(link_reach=reach, link_role=role)
        parent = factories.DocumentFactory(parent=grand_parent, link_reach="restricted")
        document = factories.DocumentFactory(parent=parent, link_reach="restricted")
    else:
        document = factories.DocumentFactory(link_reach=reach, link_role=role)

    old_document_values = serializers.DocumentSerializer(instance=document).data
    new_content = YDOC_UPDATED_CONTENT_BASE64

    response = APIClient().patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content},
        format="json",
    )
    assert response.status_code == 401
    assert response.json() == {
        "detail": "Authentication credentials were not provided."
    }

    document.refresh_from_db()
    assert serializers.DocumentSerializer(instance=document).data == old_document_values


@pytest.mark.parametrize("via_parent", [True, False])
@pytest.mark.parametrize(
    "reach,role",
    [
        ("public", "reader"),
        ("authenticated", "reader"),
        ("restricted", "reader"),
        ("restricted", "editor"),
    ],
)
def test_api_documents_patch_authenticated_unrelated_forbidden(reach, role, via_parent):
    """
    Authenticated users should not be allowed to patch a document to which
    they are not related if the link configuration does not allow it.
    """
    user = factories.UserFactory(with_owned_document=True)

    client = APIClient()
    client.force_login(user)

    if via_parent:
        grand_parent = factories.DocumentFactory(link_reach=reach, link_role=role)
        parent = factories.DocumentFactory(parent=grand_parent, link_reach="restricted")
        document = factories.DocumentFactory(parent=parent, link_reach="restricted")
    else:
        document = factories.DocumentFactory(link_reach=reach, link_role=role)

    old_document_values = serializers.DocumentSerializer(instance=document).data
    new_content = YDOC_UPDATED_CONTENT_BASE64

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content},
        format="json",
    )

    assert response.status_code == 403
    assert response.json() == {
        "detail": "You do not have permission to perform this action."
    }

    document.refresh_from_db()
    assert serializers.DocumentSerializer(instance=document).data == old_document_values


@pytest.mark.parametrize("via_parent", [True, False])
@pytest.mark.parametrize(
    "is_authenticated,reach,role",
    [
        (False, "public", "editor"),
        (True, "public", "editor"),
        (True, "authenticated", "editor"),
    ],
)
def test_api_documents_patch_anonymous_or_authenticated_unrelated(
    is_authenticated, reach, role, via_parent
):
    """
    Anonymous and authenticated users should be able to patch a document to which
    they are not related if the link configuration allows it.
    """
    client = APIClient()

    if is_authenticated:
        user = factories.UserFactory(with_owned_document=True)
        client.force_login(user)

    if via_parent:
        grand_parent = factories.DocumentFactory(link_reach=reach, link_role=role)
        parent = factories.DocumentFactory(parent=grand_parent, link_reach="restricted")
        document = factories.DocumentFactory(parent=parent, link_reach="restricted")
    else:
        document = factories.DocumentFactory(link_reach=reach, link_role=role)

    old_document_values = serializers.DocumentSerializer(instance=document).data
    old_path = document.path
    new_content = YDOC_UPDATED_CONTENT_BASE64

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content, "websocket": True},
        format="json",
    )
    assert response.status_code == 200

    # Using document.refresh_from_db does not work because the content is cached.
    # Force reloading it by fetching the document from the database.
    document = models.Document.objects.get(id=document.id)
    assert document.path == old_path
    assert document.content == new_content
    document_values = serializers.DocumentSerializer(instance=document).data
    for key in [
        "id",
        "title",
        "link_reach",
        "link_role",
        "creator",
        "depth",
        "numchild",
        "path",
    ]:
        assert document_values[key] == old_document_values[key]


@pytest.mark.parametrize("via_parent", [True, False])
@pytest.mark.parametrize("via", VIA)
def test_api_documents_patch_authenticated_reader(via, via_parent, mock_user_teams):
    """Users who are reader of a document should not be allowed to patch it."""
    user = factories.UserFactory(with_owned_document=True)

    client = APIClient()
    client.force_login(user)

    if via_parent:
        grand_parent = factories.DocumentFactory(link_reach="restricted")
        parent = factories.DocumentFactory(parent=grand_parent, link_reach="restricted")
        document = factories.DocumentFactory(parent=parent, link_reach="restricted")
        access_document = grand_parent
    else:
        document = factories.DocumentFactory(link_reach="restricted")
        access_document = document

    if via == USER:
        factories.UserDocumentAccessFactory(
            document=access_document, user=user, role="reader"
        )
    elif via == TEAM:
        mock_user_teams.return_value = ["lasuite", "unknown"]
        factories.TeamDocumentAccessFactory(
            document=access_document, team="lasuite", role="reader"
        )

    old_document_values = serializers.DocumentSerializer(instance=document).data
    new_content = YDOC_UPDATED_CONTENT_BASE64

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content},
        format="json",
    )

    assert response.status_code == 403
    assert response.json() == {
        "detail": "You do not have permission to perform this action."
    }

    document.refresh_from_db()
    assert serializers.DocumentSerializer(instance=document).data == old_document_values


@pytest.mark.parametrize("via_parent", [True, False])
@pytest.mark.parametrize("role", ["editor", "administrator", "owner"])
@pytest.mark.parametrize("via", VIA)
def test_api_documents_patch_authenticated_editor_administrator_or_owner(
    via, role, via_parent, mock_user_teams
):
    """A user who is editor, administrator or owner of a document should be allowed to patch it."""
    user = factories.UserFactory(with_owned_document=True)

    client = APIClient()
    client.force_login(user)

    if via_parent:
        grand_parent = factories.DocumentFactory(link_reach="restricted")
        parent = factories.DocumentFactory(parent=grand_parent, link_reach="restricted")
        document = factories.DocumentFactory(parent=parent, link_reach="restricted")
        access_document = grand_parent
    else:
        document = factories.DocumentFactory(link_reach="restricted")
        access_document = document

    if via == USER:
        factories.UserDocumentAccessFactory(
            document=access_document, user=user, role=role
        )
    elif via == TEAM:
        mock_user_teams.return_value = ["lasuite", "unknown"]
        factories.TeamDocumentAccessFactory(
            document=access_document, team="lasuite", role=role
        )

    old_document_values = serializers.DocumentSerializer(instance=document).data
    old_path = document.path
    new_content = YDOC_UPDATED_CONTENT_BASE64

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content, "websocket": True},
        format="json",
    )
    assert response.status_code == 200

    # Using document.refresh_from_db does not work because the content is cached.
    # Force reloading it by fetching the document from the database.
    document = models.Document.objects.get(id=document.id)
    assert document.path == old_path
    assert document.content == new_content
    document_values = serializers.DocumentSerializer(instance=document).data
    for key in [
        "id",
        "title",
        "link_reach",
        "link_role",
        "creator",
        "depth",
        "numchild",
        "path",
        "nb_accesses_ancestors",
        "nb_accesses_direct",
    ]:
        assert document_values[key] == old_document_values[key]


@responses.activate
def test_api_documents_patch_authenticated_no_websocket(settings):
    """
    When a user patches the document while not connected to the websocket and is
    the first to update it, the document should be updated.
    """
    user = factories.UserFactory(with_owned_document=True)
    client = APIClient()
    client.force_login(user)
    session_key = client.session.session_key

    document = factories.DocumentFactory(users=[(user, "editor")])
    new_content = YDOC_UPDATED_CONTENT_BASE64

    settings.COLLABORATION_API_URL = "http://example.com/"
    settings.COLLABORATION_SERVER_SECRET = "secret-token"
    settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
    endpoint_url = (
        f"{settings.COLLABORATION_API_URL}get-connections/"
        f"?room={document.id}&sessionKey={session_key}"
    )
    ws_resp = responses.get(endpoint_url, json={"count": 0, "exists": False})

    assert cache.get(f"docs:no-websocket:{document.id}") is None
    old_path = document.path

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content},
        format="json",
    )
    assert response.status_code == 200

    # Using document.refresh_from_db does not work because the content is cached.
    # Force reloading it by fetching the document from the database.
    document = models.Document.objects.get(id=document.id)
    assert document.path == old_path
    assert document.content == new_content
    assert cache.get(f"docs:no-websocket:{document.id}") == session_key
    assert ws_resp.call_count == 1


@responses.activate
def test_api_documents_patch_authenticated_no_websocket_user_already_editing(settings):
    """
    When a user patches the document while not connected to the websocket and is
    not the first to update it, the document should not be updated.
    """
    user = factories.UserFactory(with_owned_document=True)
    client = APIClient()
    client.force_login(user)
    session_key = client.session.session_key

    document = factories.DocumentFactory(users=[(user, "editor")])
    new_content = YDOC_UPDATED_CONTENT_BASE64

    settings.COLLABORATION_API_URL = "http://example.com/"
    settings.COLLABORATION_SERVER_SECRET = "secret-token"
    settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
    endpoint_url = (
        f"{settings.COLLABORATION_API_URL}get-connections/"
        f"?room={document.id}&sessionKey={session_key}"
    )
    ws_resp = responses.get(endpoint_url, json={"count": 0, "exists": False})

    cache.set(f"docs:no-websocket:{document.id}", "other_session_key")

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content},
        format="json",
    )
    assert response.status_code == 403
    assert response.json() == {"detail": "You are not allowed to edit this document."}

    assert ws_resp.call_count == 1


@responses.activate
def test_api_documents_patch_no_websocket_other_user_connected_to_websocket(settings):
    """
    When a user who is not connected to the websocket patches the document while another
    user is connected to the websocket, the document should not be updated.
    """
    user = factories.UserFactory(with_owned_document=True)
    client = APIClient()
    client.force_login(user)
    session_key = client.session.session_key

    document = factories.DocumentFactory(users=[(user, "editor")])
    new_content = YDOC_UPDATED_CONTENT_BASE64

    settings.COLLABORATION_API_URL = "http://example.com/"
    settings.COLLABORATION_SERVER_SECRET = "secret-token"
    settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
    endpoint_url = (
        f"{settings.COLLABORATION_API_URL}get-connections/"
        f"?room={document.id}&sessionKey={session_key}"
    )
    ws_resp = responses.get(endpoint_url, json={"count": 3, "exists": False})

    assert cache.get(f"docs:no-websocket:{document.id}") is None

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content},
        format="json",
    )
    assert response.status_code == 403
    assert response.json() == {"detail": "You are not allowed to edit this document."}
    assert cache.get(f"docs:no-websocket:{document.id}") is None
    assert ws_resp.call_count == 1


@responses.activate
def test_api_documents_patch_user_connected_to_websocket(settings):
    """
    When a user patches the document while connected to the websocket, the document
    should be updated.
    """
    user = factories.UserFactory(with_owned_document=True)
    client = APIClient()
    client.force_login(user)
    session_key = client.session.session_key

    document = factories.DocumentFactory(users=[(user, "editor")])
    new_content = YDOC_UPDATED_CONTENT_BASE64

    settings.COLLABORATION_API_URL = "http://example.com/"
    settings.COLLABORATION_SERVER_SECRET = "secret-token"
    settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
    endpoint_url = (
        f"{settings.COLLABORATION_API_URL}get-connections/"
        f"?room={document.id}&sessionKey={session_key}"
    )
    ws_resp = responses.get(endpoint_url, json={"count": 3, "exists": True})

    assert cache.get(f"docs:no-websocket:{document.id}") is None
    old_path = document.path

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content},
        format="json",
    )
    assert response.status_code == 200

    # Using document.refresh_from_db does not work because the content is cached.
    # Force reloading it by fetching the document from the database.
    document = models.Document.objects.get(id=document.id)
    assert document.path == old_path
    assert document.content == new_content
    assert cache.get(f"docs:no-websocket:{document.id}") is None
    assert ws_resp.call_count == 1


@responses.activate
def test_api_documents_patch_websocket_server_unreachable_fallback_to_no_websocket(
    settings,
):
    """
    When the websocket server is unreachable, the patch should be applied as if the
    user were not connected to the websocket.
    """
    user = factories.UserFactory(with_owned_document=True)
    client = APIClient()
    client.force_login(user)
    session_key = client.session.session_key

    document = factories.DocumentFactory(users=[(user, "editor")])
    new_content = YDOC_UPDATED_CONTENT_BASE64

    settings.COLLABORATION_API_URL = "http://example.com/"
    settings.COLLABORATION_SERVER_SECRET = "secret-token"
    settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
    endpoint_url = (
        f"{settings.COLLABORATION_API_URL}get-connections/"
        f"?room={document.id}&sessionKey={session_key}"
    )
    ws_resp = responses.get(endpoint_url, status=500)

    assert cache.get(f"docs:no-websocket:{document.id}") is None
    old_path = document.path

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content},
        format="json",
    )
    assert response.status_code == 200

    # Using document.refresh_from_db does not work because the content is cached.
    # Force reloading it by fetching the document from the database.
    document = models.Document.objects.get(id=document.id)
    assert document.path == old_path
    assert document.content == new_content
    assert cache.get(f"docs:no-websocket:{document.id}") == session_key
    assert ws_resp.call_count == 1


@responses.activate
def test_api_documents_patch_websocket_server_unreachable_fallback_to_no_websocket_other_users(
    settings,
):
    """
    When the websocket server is unreachable, the behavior falls back to no-websocket.
    If another user is already editing, the patch must be denied.
    """
    user = factories.UserFactory(with_owned_document=True)
    client = APIClient()
    client.force_login(user)
    session_key = client.session.session_key

    document = factories.DocumentFactory(users=[(user, "editor")])
    new_content = YDOC_UPDATED_CONTENT_BASE64

    settings.COLLABORATION_API_URL = "http://example.com/"
    settings.COLLABORATION_SERVER_SECRET = "secret-token"
    settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
    endpoint_url = (
        f"{settings.COLLABORATION_API_URL}get-connections/"
        f"?room={document.id}&sessionKey={session_key}"
    )
    ws_resp = responses.get(endpoint_url, status=500)

    cache.set(f"docs:no-websocket:{document.id}", "other_session_key")

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content},
        format="json",
    )
    assert response.status_code == 403

    assert cache.get(f"docs:no-websocket:{document.id}") == "other_session_key"
    assert ws_resp.call_count == 1


@responses.activate
def test_api_documents_patch_websocket_server_room_not_found_fallback_to_no_websocket_other_users(
    settings,
):
    """
    When the WebSocket server does not have the room created, the logic should fall
    back to no-WebSocket. If another user is already editing, the patch must be denied.
    """
    user = factories.UserFactory(with_owned_document=True)
    client = APIClient()
    client.force_login(user)
    session_key = client.session.session_key

    document = factories.DocumentFactory(users=[(user, "editor")])
    new_content = YDOC_UPDATED_CONTENT_BASE64

    settings.COLLABORATION_API_URL = "http://example.com/"
    settings.COLLABORATION_SERVER_SECRET = "secret-token"
    settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
    endpoint_url = (
        f"{settings.COLLABORATION_API_URL}get-connections/"
        f"?room={document.id}&sessionKey={session_key}"
    )
    ws_resp = responses.get(endpoint_url, status=404)

    cache.set(f"docs:no-websocket:{document.id}", "other_session_key")

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content},
        format="json",
    )
    assert response.status_code == 403

    assert cache.get(f"docs:no-websocket:{document.id}") == "other_session_key"
    assert ws_resp.call_count == 1


@responses.activate
def test_api_documents_patch_force_websocket_param_to_true(settings):
    """
    When the websocket parameter is set to true, the patch should be applied without
    any check.
    """
    user = factories.UserFactory(with_owned_document=True)
    client = APIClient()
    client.force_login(user)
    session_key = client.session.session_key

    document = factories.DocumentFactory(users=[(user, "editor")])
    new_content = YDOC_UPDATED_CONTENT_BASE64

    settings.COLLABORATION_API_URL = "http://example.com/"
    settings.COLLABORATION_SERVER_SECRET = "secret-token"
    endpoint_url = (
        f"{settings.COLLABORATION_API_URL}get-connections/"
        f"?room={document.id}&sessionKey={session_key}"
    )
    ws_resp = responses.get(endpoint_url, status=500)

    assert cache.get(f"docs:no-websocket:{document.id}") is None
    old_path = document.path

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content, "websocket": True},
        format="json",
    )
    assert response.status_code == 200

    # Using document.refresh_from_db does not work because the content is cached.
    # Force reloading it by fetching the document from the database.
    document = models.Document.objects.get(id=document.id)
    assert document.path == old_path
    assert document.content == new_content
    assert cache.get(f"docs:no-websocket:{document.id}") is None
    assert ws_resp.call_count == 0


@responses.activate
def test_api_documents_patch_feature_flag_disabled(settings):
    """
    When the feature flag is disabled, the patch should be applied without any check.
    """
    user = factories.UserFactory(with_owned_document=True)
    client = APIClient()
    client.force_login(user)
    session_key = client.session.session_key

    document = factories.DocumentFactory(users=[(user, "editor")])
    new_content = YDOC_UPDATED_CONTENT_BASE64

    settings.COLLABORATION_API_URL = "http://example.com/"
    settings.COLLABORATION_SERVER_SECRET = "secret-token"
    settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = False
    endpoint_url = (
        f"{settings.COLLABORATION_API_URL}get-connections/"
        f"?room={document.id}&sessionKey={session_key}"
    )
    ws_resp = responses.get(endpoint_url, status=500)

    assert cache.get(f"docs:no-websocket:{document.id}") is None
    old_path = document.path

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": new_content},
        format="json",
    )
    assert response.status_code == 200

    # Using document.refresh_from_db does not work because the content is cached.
    # Force reloading it by fetching the document from the database.
    document = models.Document.objects.get(id=document.id)
    assert document.path == old_path
    assert document.content == new_content
    assert cache.get(f"docs:no-websocket:{document.id}") is None
    assert ws_resp.call_count == 0


@pytest.mark.parametrize("via", VIA)
def test_api_documents_patch_administrator_or_owner_of_another(via, mock_user_teams):
    """
    Being administrator or owner of a document should not grant authorization to patch
    another document.
    """
    user = factories.UserFactory(with_owned_document=True)

    client = APIClient()
    client.force_login(user)

    document = factories.DocumentFactory()
    if via == USER:
        factories.UserDocumentAccessFactory(
            document=document, user=user, role=random.choice(["administrator", "owner"])
        )
    elif via == TEAM:
        mock_user_teams.return_value = ["lasuite", "unknown"]
        factories.TeamDocumentAccessFactory(
            document=document,
            team="lasuite",
            role=random.choice(["administrator", "owner"]),
        )

    other_document = factories.DocumentFactory(title="Old title", link_role="reader")
    old_document_values = serializers.DocumentSerializer(instance=other_document).data
    new_content = YDOC_UPDATED_CONTENT_BASE64

    response = client.patch(
        f"/api/v1.0/documents/{other_document.id!s}/",
        {"content": new_content},
        format="json",
    )

    assert response.status_code == 403

    other_document.refresh_from_db()
    assert (
        serializers.DocumentSerializer(instance=other_document).data
        == old_document_values
    )


def test_api_documents_patch_invalid_content():
    """
    Patching a document with content that is not base64 encoded should raise a
    validation error.
    """
    user = factories.UserFactory(with_owned_document=True)
    client = APIClient()
    client.force_login(user)

    document = factories.DocumentFactory(users=[[user, "owner"]])

    response = client.patch(
        f"/api/v1.0/documents/{document.id!s}/",
        {"content": "invalid content"},
        format="json",
    )
    assert response.status_code == 400
    assert response.json() == {"content": ["Invalid base64 content."]}


@responses.activate
def test_api_documents_patch_empty_body(settings):
    """
    Test that the document is not updated when the request body is empty.
    The `updated_at` property should not change, asserting that no database update is made.
    """
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)
    session_key = client.session.session_key

    document = factories.DocumentFactory(users=[(user, "owner")], creator=user)
    document_updated_at = document.updated_at

    settings.COLLABORATION_API_URL = "http://example.com/"
    settings.COLLABORATION_SERVER_SECRET = "secret-token"
    settings.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY = True
    endpoint_url = (
        f"{settings.COLLABORATION_API_URL}get-connections/"
        f"?room={document.id}&sessionKey={session_key}"
    )
    ws_resp = responses.get(endpoint_url, json={"count": 3, "exists": True})

    assert cache.get(f"docs:no-websocket:{document.id}") is None

    old_document_values = serializers.DocumentSerializer(instance=document).data

    with patch("core.models.Document.save") as mock_document_save:
        response = client.patch(
            f"/api/v1.0/documents/{document.id!s}/",
            content_type="application/json",
        )
        mock_document_save.assert_not_called()
    assert response.status_code == 200

    document = models.Document.objects.get(id=document.id)
    new_document_values = serializers.DocumentSerializer(instance=document).data
    assert new_document_values == old_document_values
    assert document_updated_at == document.updated_at
    assert cache.get(f"docs:no-websocket:{document.id}") is None
    assert ws_resp.call_count == 1

@@ -1,54 +0,0 @@
"""Tests for run_indexing_view admin endpoint."""

from unittest.mock import patch

from django.http import HttpResponse

import pytest

from core import factories


@pytest.mark.usefixtures("indexer_settings")
@pytest.mark.django_db
@pytest.mark.parametrize(
    "is_authenticated,is_staff,should_call_command",
    [
        (False, False, False),
        (True, False, False),
        (True, True, True),
    ],
)
def test_run_indexing_view_post_authentication(
    client,
    is_authenticated,
    is_staff,
    should_call_command,
):
    """Test that POST to run_indexing_view requires staff authentication."""

    if is_authenticated:
        user = factories.UserFactory(is_staff=is_staff)
        client.force_login(user)

    batch_size = 100
    with patch("core.admin.call_command") as mock_call_command:
        mock_call_command.return_value = HttpResponse("Mocked render")
        response = client.post("/admin/run-indexing/", {"batch_size": batch_size})

    # redirects in all cases
    assert response.status_code == 302

    if should_call_command:
        assert "/admin/run-indexing/" == response.url
        mock_call_command.assert_called_once()
        assert mock_call_command.call_args.kwargs == {
            "batch_size": batch_size,
            "lower_time_bound": None,
            "upper_time_bound": None,
            "async_mode": True,
        }

    else:
        assert "/admin/login/" in response.url
        mock_call_command.assert_not_called()
@@ -1,32 +0,0 @@
"""Module testing the conditional_refresh_oidc_token utils."""

from unittest import mock

from core.api import utils


def test_refresh_oidc_access_token_storing_refresh_token_disabled(settings):
    """The method_decorator must not be called when OIDC_STORE_REFRESH_TOKEN is False."""

    settings.OIDC_STORE_REFRESH_TOKEN = False

    callback = mock.MagicMock()

    with mock.patch.object(utils, "method_decorator") as mock_method_decorator:
        result = utils.conditional_refresh_oidc_token(callback)

    mock_method_decorator.assert_not_called()
    assert result == callback


def test_refresh_oidc_access_token_storing_refresh_token_enabled(settings):
    """The method_decorator must be called when OIDC_STORE_REFRESH_TOKEN is True."""

    settings.OIDC_STORE_REFRESH_TOKEN = True

    callback = mock.MagicMock()

    with mock.patch.object(utils, "method_decorator") as mock_method_decorator:
        utils.conditional_refresh_oidc_token(callback)

    mock_method_decorator.assert_called_with(utils.refresh_oidc_access_token)
@@ -5,8 +5,6 @@ Unit tests for the Document model

import random
import smtplib
from datetime import datetime, timedelta
from datetime import timezone as base_timezone
from logging import Logger
from unittest import mock

@@ -21,7 +19,6 @@ from django.utils import timezone
import pytest

from core import factories, models
from core.factories import DocumentFactory

pytestmark = pytest.mark.django_db

@@ -90,8 +87,7 @@ def test_models_documents_tree_alphabet():

@pytest.mark.parametrize("depth", range(5))
def test_models_documents_soft_delete(depth):
    """Trying to delete a document that is already deleted or is a descendant of
    a deleted document should raise an error.
    """
    documents = []
@@ -103,8 +99,6 @@ def test_models_documents_soft_delete(depth):
    )
    assert models.Document.objects.count() == depth + 1

    document_pk_to_updated_at = {d.pk: d.updated_at for d in documents}

    # Delete any one of the documents...
    deleted_document = random.choice(documents)
    deleted_document.soft_delete()
@@ -112,26 +106,19 @@ def test_models_documents_soft_delete(depth):
    with pytest.raises(RuntimeError):
        documents[-1].soft_delete()

    deleted_document.refresh_from_db()
    assert deleted_document.deleted_at is not None
    assert deleted_document.ancestors_deleted_at == deleted_document.deleted_at
    # updated_at is updated on the deleted document
    assert deleted_document.updated_at > document_pk_to_updated_at[deleted_document.pk]

    descendants = deleted_document.get_descendants()
    for child in descendants:
        assert child.deleted_at is None
        assert child.ancestors_deleted_at is not None
        assert child.ancestors_deleted_at == deleted_document.deleted_at
        # updated_at is updated on descendants
        assert child.updated_at > document_pk_to_updated_at[child.pk]

    ancestors = deleted_document.get_ancestors()
    for parent in ancestors:
        assert parent.deleted_at is None
        assert parent.ancestors_deleted_at is None
        # updated_at is not affected on parents
        assert parent.updated_at == document_pk_to_updated_at[parent.pk]

    assert len(ancestors) + len(descendants) == depth

@@ -1432,20 +1419,16 @@ def test_models_documents_restore_tempering_with_instance():
def test_models_documents_restore(django_assert_num_queries):
    """The restore method should restore a soft-deleted document."""
    document = factories.DocumentFactory()

    document.soft_delete()
    document.refresh_from_db()
    assert document.deleted_at is not None
    assert document.ancestors_deleted_at == document.deleted_at
    original_updated_after_delete = document.updated_at

    with django_assert_num_queries(10):
        document.restore()
    document.refresh_from_db()
    assert document.deleted_at is None
    assert document.ancestors_deleted_at is None
    # updated_at is updated by restore
    assert original_updated_after_delete < document.updated_at
    assert document.ancestors_deleted_at == document.deleted_at


def test_models_documents_restore_complex(django_assert_num_queries):
@@ -1462,7 +1445,6 @@ def test_models_documents_restore_complex(django_assert_num_queries):
    document.refresh_from_db()
    child1.refresh_from_db()
    child2.refresh_from_db()

    assert document.deleted_at is not None
    assert document.ancestors_deleted_at == document.deleted_at
    assert child1.ancestors_deleted_at == document.deleted_at
@@ -1472,18 +1454,13 @@ def test_models_documents_restore_complex(django_assert_num_queries):
    grand_parent.soft_delete()
    grand_parent.refresh_from_db()
    parent.refresh_from_db()
    document.refresh_from_db()
    child1.refresh_from_db()
    child2.refresh_from_db()
    grand_parent_updated_at = grand_parent.updated_at
    document_updated_at = document.updated_at
    child1_updated_at = child1.updated_at
    child2_updated_at = child2.updated_at

    assert grand_parent.deleted_at is not None
    assert grand_parent.ancestors_deleted_at == grand_parent.deleted_at
    assert parent.ancestors_deleted_at == grand_parent.deleted_at
    # document, child1 and child2 should not be affected
    document.refresh_from_db()
    child1.refresh_from_db()
    child2.refresh_from_db()
    assert document.deleted_at is not None
    assert document.ancestors_deleted_at == document.deleted_at
    assert child1.ancestors_deleted_at == document.deleted_at
@@ -1492,23 +1469,15 @@ def test_models_documents_restore_complex(django_assert_num_queries):
    # Restore the document
    with django_assert_num_queries(14):
        document.restore()

    document.refresh_from_db()
    child1.refresh_from_db()
    child2.refresh_from_db()
    grand_parent.refresh_from_db()

    assert document.deleted_at is None
    assert document.ancestors_deleted_at == grand_parent.deleted_at
    # child1 and child2 should now have the same ancestors_deleted_at as the grand parent
    assert child1.ancestors_deleted_at == grand_parent.deleted_at
    assert child2.ancestors_deleted_at == grand_parent.deleted_at
    # updated_at is updated for document and children after restore
    assert document.updated_at > document_updated_at
    assert child1.updated_at > child1_updated_at
    assert child2.updated_at > child2_updated_at
    # grand_parent updated_at is not affected
    assert grand_parent.updated_at == grand_parent_updated_at


def test_models_documents_restore_complex_bis(django_assert_num_queries):
@@ -1516,37 +1485,31 @@ def test_models_documents_restore_complex_bis(django_assert_num_queries):
    grand_parent = factories.DocumentFactory()
    parent = factories.DocumentFactory(parent=grand_parent)
    document = factories.DocumentFactory(parent=parent)

    child1 = factories.DocumentFactory(parent=document)
    child2 = factories.DocumentFactory(parent=document)

    # Soft delete the document first
    document.soft_delete()

    document.refresh_from_db()
    child1.refresh_from_db()
    child2.refresh_from_db()

    assert document.deleted_at is not None
    assert document.ancestors_deleted_at == document.deleted_at
    assert child1.ancestors_deleted_at == document.deleted_at
    assert child2.ancestors_deleted_at == document.deleted_at

    # Soft delete the grand parent
    grand_parent.soft_delete()

    grand_parent.refresh_from_db()
    parent.refresh_from_db()
    document.refresh_from_db()
    child1.refresh_from_db()
    child2.refresh_from_db()
    original_parent_updated_at = parent.updated_at
    original_child1_updated_at = child1.updated_at
    original_child2_updated_at = child2.updated_at

    assert grand_parent.deleted_at is not None
    assert grand_parent.ancestors_deleted_at == grand_parent.deleted_at
    assert parent.ancestors_deleted_at == grand_parent.deleted_at
    # document, child1 and child2 should not be affected
    document.refresh_from_db()
    child1.refresh_from_db()
    child2.refresh_from_db()
    assert document.deleted_at is not None
    assert document.ancestors_deleted_at == document.deleted_at
    assert child1.ancestors_deleted_at == document.deleted_at
@@ -1562,20 +1525,14 @@ def test_models_documents_restore_complex_bis(django_assert_num_queries):
    document.refresh_from_db()
    child1.refresh_from_db()
    child2.refresh_from_db()

    assert grand_parent.deleted_at is None
    assert grand_parent.ancestors_deleted_at is None
    assert parent.deleted_at is None
    assert parent.ancestors_deleted_at is None
    # parent should have updated_at updated (descendant of restored grand_parent)
    assert parent.updated_at > original_parent_updated_at
    assert document.deleted_at is not None
    assert document.ancestors_deleted_at == document.deleted_at
    # children are not restored, so their updated_at should not be affected
    assert child1.ancestors_deleted_at == document.deleted_at
    assert child1.updated_at == original_child1_updated_at
    assert child2.ancestors_deleted_at == document.deleted_at
    assert child2.updated_at == original_child2_updated_at


@pytest.mark.parametrize(
@@ -1734,82 +1691,3 @@ def test_models_documents_compute_ancestors_links_paths_mapping_structure(
            {"link_reach": sibling.link_reach, "link_role": sibling.link_role},
        ],
    }


def test_models_documents_manager_time_filter_no_filters():
    """Test that filter_updated_at with no bounds returns all documents."""
    factories.DocumentFactory.create_batch(3)

    queryset = models.Document.objects.filter_updated_at()

    assert queryset.count() == 3


def test_models_documents_manager_time_filter_oldest_updated_at():
    """
    Test that filtering by lower_time_bound includes documents updated at or after
    the bound.
    """
    lower_time_bound = datetime(2024, 2, 1, tzinfo=base_timezone.utc)

    DocumentFactory(updated_at=lower_time_bound - timedelta(days=30))
    document_at_boundary = DocumentFactory(updated_at=lower_time_bound)
    document_recent = DocumentFactory(updated_at=lower_time_bound + timedelta(days=15))

    queryset = models.Document.objects.filter_updated_at(
        lower_time_bound=lower_time_bound
    )

    assert queryset.count() == 2
    assert sorted(queryset.values_list("pk", flat=True)) == sorted(
        [document_at_boundary.pk, document_recent.pk]
    )


def test_models_documents_manager_time_filter_newest_updated_at():
    """Test that filtering by upper_time_bound includes documents updated at or before the bound."""
    upper_time_bound = datetime(2024, 2, 1, tzinfo=base_timezone.utc)

    document_old = DocumentFactory(updated_at=upper_time_bound - timedelta(days=30))
    document_at_boundary = DocumentFactory(updated_at=upper_time_bound)
    document_too_recent = DocumentFactory(
        updated_at=upper_time_bound + timedelta(days=15)
    )

    queryset = models.Document.objects.filter_updated_at(
        upper_time_bound=upper_time_bound
    )

    assert queryset.count() == 2
    assert document_old in queryset
    assert document_at_boundary in queryset
    assert document_too_recent not in queryset


def test_models_documents_manager_time_filter_both_bounds():
    """Test filtering with both lower and upper bounds."""
    lower_time_bound = datetime(2024, 2, 1, tzinfo=base_timezone.utc)
    upper_time_bound = lower_time_bound + timedelta(days=30)

    document_too_early = DocumentFactory(
        updated_at=lower_time_bound - timedelta(days=10)
    )
    document_in_window = DocumentFactory(
        updated_at=lower_time_bound + timedelta(days=5)
    )
    other_document_in_window = DocumentFactory(
        updated_at=lower_time_bound + timedelta(days=15)
    )
    document_too_late = DocumentFactory(
        updated_at=upper_time_bound + timedelta(days=10)
    )

    queryset = models.Document.objects.filter_updated_at(
        lower_time_bound=lower_time_bound, upper_time_bound=upper_time_bound
    )

    assert queryset.count() == 2
    assert document_too_early not in queryset
    assert document_in_window in queryset
    assert other_document_in_window in queryset
    assert document_too_late not in queryset

@@ -252,7 +252,7 @@ def test_services_search_indexers_index_errors(indexer_settings):
     )

     with pytest.raises(HTTPError):
-        FindDocumentIndexer().index(models.Document.objects.all())
+        FindDocumentIndexer().index()


 @patch.object(FindDocumentIndexer, "push")
@@ -272,7 +272,7 @@ def test_services_search_indexers_batches_pass_only_batch_accesses(
         access = factories.UserDocumentAccessFactory(document=document)
         expected_user_subs[str(document.id)] = str(access.user.sub)

-    assert FindDocumentIndexer().index(models.Document.objects.all()) == 5
+    assert FindDocumentIndexer().index() == 5

     # Should be 3 batches: 2 + 2 + 1
     assert mock_push.call_count == 3
@@ -310,7 +310,7 @@ def test_services_search_indexers_batch_size_argument(mock_push):
         access = factories.UserDocumentAccessFactory(document=document)
         expected_user_subs[str(document.id)] = str(access.user.sub)

-    assert FindDocumentIndexer().index(models.Document.objects.all(), batch_size=2) == 5
+    assert FindDocumentIndexer().index(batch_size=2) == 5

     # Should be 3 batches: 2 + 2 + 1
     assert mock_push.call_count == 3
@@ -345,7 +345,7 @@ def test_services_search_indexers_ignore_empty_documents(mock_push):
     empty_title = factories.DocumentFactory(title="")
     empty_content = factories.DocumentFactory(content="")

-    assert FindDocumentIndexer().index(models.Document.objects.all()) == 3
+    assert FindDocumentIndexer().index() == 3

     assert mock_push.call_count == 1

@@ -373,7 +373,7 @@ def test_services_search_indexers_skip_empty_batches(mock_push, indexer_settings
     # Only empty docs
     factories.DocumentFactory.create_batch(5, content="", title="")

-    assert FindDocumentIndexer().index(models.Document.objects.all()) == 1
+    assert FindDocumentIndexer().index() == 1
     assert mock_push.call_count == 1

     results = [doc["id"] for doc in mock_push.call_args[0][0]]
@@ -391,7 +391,7 @@ def test_services_search_indexers_ancestors_link_reach(mock_push):
     parent = factories.DocumentFactory(parent=grand_parent, link_reach="public")
     document = factories.DocumentFactory(parent=parent, link_reach="restricted")

-    assert FindDocumentIndexer().index(models.Document.objects.all()) == 4
+    assert FindDocumentIndexer().index() == 4

     results = {doc["id"]: doc for doc in mock_push.call_args[0][0]}
     assert len(results) == 4
@@ -411,7 +411,7 @@ def test_services_search_indexers_ancestors_users(mock_push):
     parent = factories.DocumentFactory(parent=grand_parent, users=[user_p])
     document = factories.DocumentFactory(parent=parent, users=[user_d])

-    assert FindDocumentIndexer().index(models.Document.objects.all()) == 3
+    assert FindDocumentIndexer().index() == 3

     results = {doc["id"]: doc for doc in mock_push.call_args[0][0]}
     assert len(results) == 3
@@ -432,7 +432,7 @@ def test_services_search_indexers_ancestors_teams(mock_push):
     parent = factories.DocumentFactory(parent=grand_parent, teams=["team_p"])
     document = factories.DocumentFactory(parent=parent, teams=["team_d"])

-    assert FindDocumentIndexer().index(models.Document.objects.all()) == 3
+    assert FindDocumentIndexer().index() == 3

     results = {doc["id"]: doc for doc in mock_push.call_args[0][0]}
     assert len(results) == 3
@@ -12,10 +12,7 @@ from drf_spectacular.views import (
     SpectacularSwaggerView,
 )

-from core.admin import run_indexing_view
-
 urlpatterns = [
-    path("admin/run-indexing/", run_indexing_view, name="run_indexing"),
     path("admin/", admin.site.urls),
     path("", include("core.urls")),
 ]
@@ -7,7 +7,7 @@ build-backend = "setuptools.build_meta"

 [project]
 name = "impress"
-version = "4.8.3"
+version = "4.8.2"
 authors = [{ "name" = "DINUM", "email" = "dev@mail.numerique.gouv.fr" }]
 classifiers = [
     "Development Status :: 5 - Production/Stable",
@@ -131,13 +131,12 @@ test.describe('Doc Comments', () => {
     await thread.getByRole('paragraph').first().fill('This is a comment');
     await thread.locator('[data-test="save"]').click();
     await expect(thread.getByText('This is a comment').first()).toBeHidden();
-    await expect(editor.getByText('Hello')).toHaveClass('bn-thread-mark');

     // Check background color changed
     await expect(editor.getByText('Hello')).toHaveCSS(
       'background-color',
-      'color(srgb 0.882353 0.831373 0.717647 / 0.4)',
+      'rgba(237, 180, 0, 0.4)',
     );

     await editor.first().click();
     await editor.getByText('Hello').click();
@@ -186,7 +185,6 @@ test.describe('Doc Comments', () => {
     await thread.getByText('This is an edited comment').first().hover();
     await thread.locator('[data-test="resolve"]').click();
-    await expect(thread).toBeHidden();

     await expect(editor.getByText('Hello')).toHaveCSS(
       'background-color',
       'rgba(0, 0, 0, 0)',
@@ -198,13 +196,11 @@ test.describe('Doc Comments', () => {

     await thread.getByRole('paragraph').first().fill('This is a new comment');
     await thread.locator('[data-test="save"]').click();
-    await expect(editor.getByText('Hello')).toHaveClass('bn-thread-mark');
-
     await expect(editor.getByText('Hello')).toHaveCSS(
       'background-color',
-      'color(srgb 0.882353 0.831373 0.717647 / 0.4)',
+      'rgba(237, 180, 0, 0.4)',
     );

     await editor.first().click();
     await editor.getByText('Hello').click();
@@ -212,7 +208,6 @@ test.describe('Doc Comments', () => {
     await thread.locator('[data-test="moreactions"]').first().click();
     await getMenuItem(thread, 'Delete comment').click();

-    await expect(editor.getByText('Hello')).not.toHaveClass('bn-thread-mark');
     await expect(editor.getByText('Hello')).toHaveCSS(
       'background-color',
       'rgba(0, 0, 0, 0)',
@@ -268,7 +263,7 @@ test.describe('Doc Comments', () => {

     await expect(otherEditor.getByText('Hello')).toHaveCSS(
       'background-color',
-      'color(srgb 0.882353 0.831373 0.717647 / 0.4)',
+      'rgba(237, 180, 0, 0.4)',
     );

     // We change the role of the second user to reader
@@ -303,7 +298,7 @@ test.describe('Doc Comments', () => {

     await expect(otherEditor.getByText('Hello')).toHaveCSS(
       'background-color',
-      'color(srgb 0.882353 0.831373 0.717647 / 0.4)',
+      'rgba(237, 180, 0, 0.4)',
     );
     await otherEditor.getByText('Hello').click();
     await expect(
@@ -349,7 +344,7 @@ test.describe('Doc Comments', () => {

     await expect(editor1.getByText('Document One')).toHaveCSS(
       'background-color',
-      'color(srgb 0.882353 0.831373 0.717647 / 0.4)',
+      'rgba(237, 180, 0, 0.4)',
     );

     await editor1.getByText('Document One').click();
@@ -474,9 +474,7 @@ test.describe('Doc Header', () => {
     // Copy content to clipboard
     await page.getByLabel('Open the document options').click();
     await getMenuItem(page, 'Copy as Markdown').click();
-    await expect(
-      page.getByText('Copied as Markdown to clipboard'),
-    ).toBeVisible();
+    await expect(page.getByText('Copied to clipboard')).toBeVisible();

     // Test that clipboard is in Markdown format
     const handle = await page.evaluateHandle(() =>
@@ -29,7 +29,7 @@ test.describe('Document search', () => {
     await page.getByTestId('search-docs-button').click();

     await expect(
-      page.getByLabel('Search modal').locator('img[alt=""]'),
+      page.getByRole('img', { name: 'No active search' }),
     ).toBeVisible();

     await expect(
@@ -107,7 +107,7 @@ test.describe('Document search', () => {

     await searchButton.click();
     await expect(
-      page.getByRole('combobox', { name: 'Search documents' }),
+      page.getByRole('combobox', { name: 'Quick search input' }),
     ).toBeVisible();
     await expect(filters).toBeHidden();
@@ -120,7 +120,7 @@ test.describe('Document search', () => {

     await searchButton.click();
     await expect(
-      page.getByRole('combobox', { name: 'Search documents' }),
+      page.getByRole('combobox', { name: 'Quick search input' }),
     ).toBeVisible();
     await expect(filters).toBeHidden();
@@ -164,9 +164,9 @@ test.describe('Document search', () => {
     const searchButton = page.getByTestId('search-docs-button');

     await searchButton.click();
-    await page.getByRole('combobox', { name: 'Search documents' }).click();
+    await page.getByRole('combobox', { name: 'Quick search input' }).click();
     await page
-      .getByRole('combobox', { name: 'Search documents' })
+      .getByRole('combobox', { name: 'Quick search input' })
       .fill('sub page search');

     // Expect to find the first and second docs in the results list
@@ -188,7 +188,7 @@ test.describe('Document search', () => {
     );
     await searchButton.click();
     await page
-      .getByRole('combobox', { name: 'Search documents' })
+      .getByRole('combobox', { name: 'Quick search input' })
       .fill('second');

     // Now there is a sub page - expect to have the focus on the current doc
@@ -19,9 +19,7 @@ test.describe('Doc Table Content', () => {

     await page.locator('.ProseMirror').click();

-    await expect(
-      page.getByRole('button', { name: 'Show the table of contents' }),
-    ).toBeHidden();
+    await expect(page.getByRole('button', { name: 'Summary' })).toBeHidden();

     await page.keyboard.type('# Level 1\n## Level 2\n### Level 3');

@@ -25,7 +25,7 @@ test.describe('Doc Version', () => {
     await expect(page.getByText('History', { exact: true })).toBeVisible();

     const modal = page.getByRole('dialog', { name: 'Version history' });
-    const panel = modal.getByLabel('Version list');
+    const panel = modal.getByLabel('version list');
     await expect(panel).toBeVisible();
     await expect(modal.getByText('No versions')).toBeVisible();

@@ -155,25 +155,20 @@ test.describe('Doc Version', () => {
     await getMenuItem(page, 'Version history').click();

     const modal = page.getByRole('dialog', { name: 'Version history' });
-    const panel = modal.getByLabel('Version list');
+    const panel = modal.getByLabel('version list');
     await expect(panel).toBeVisible();

     await expect(page.getByText('History', { exact: true })).toBeVisible();
-    await panel.locator('.version-item').first().click();
+    await panel.getByRole('button', { name: 'version item' }).click();

     await expect(modal.getByText('World')).toBeHidden();

-    await page.getByRole('button', { name: 'Restore', exact: true }).click();
-    await expect(
-      page.getByText(
-        "The current document will be replaced, but you'll still find it in the version history.",
-      ),
-    ).toBeVisible();
-    await page.getByRole('button', { name: 'Restore' }).click();
+    await expect(page.getByText('Your current document will')).toBeVisible();
+    await page.getByText('If a member is editing, his').click();
+
+    await page.getByLabel('Restore', { exact: true }).click();

     await page.waitForTimeout(500);

     await expect(page.getByText('Hello')).toBeVisible();
     await expect(page.getByText('World')).toBeHidden();
   });
@@ -191,27 +191,25 @@ test.describe('Header: Override configuration', () => {
 });

 test.describe('Header: Skip to Content', () => {
-  test('it displays skip link on first TAB and focuses page heading on click', async ({
+  test('it displays skip link on first TAB and focuses main content on click', async ({
     page,
   }) => {
     await page.goto('/');

-    // Wait for skip link to be mounted (client-side only component)
-    const skipLink = page.getByRole('link', { name: 'Go to content' });
-    await skipLink.waitFor({ state: 'attached' });
+    // Wait for skip button to be mounted (client-side only component)
+    const skipButton = page.getByRole('button', { name: 'Go to content' });
+    await skipButton.waitFor({ state: 'attached' });

-    // First TAB shows the skip link
+    // First TAB shows the skip button
     await page.keyboard.press('Tab');

-    // The skip link should be visible and focused
-    await expect(skipLink).toBeFocused();
-    await expect(skipLink).toBeVisible();
-    // Clicking moves focus to the page heading
-    await skipLink.click();
-    const pageHeading = page.getByRole('heading', {
-      name: 'All docs',
-      level: 2,
-    });
-    await expect(pageHeading).toBeFocused();
+    // The skip button should be visible and focused
+    await expect(skipButton).toBeFocused();
+    await expect(skipButton).toBeVisible();
+
+    // Clicking moves focus to the main content
+    await skipButton.click();
+    const mainContent = page.locator('main#mainContent');
+    await expect(mainContent).toBeFocused();
   });
 });
@@ -1,7 +1,6 @@
 import { expect, test } from '@playwright/test';

 import { createDoc, goToGridDoc, verifyDocName } from './utils-common';
-import { createRootSubPage } from './utils-sub-pages';

 test.describe('Left panel desktop', () => {
   test.beforeEach(async ({ page }) => {
@@ -19,7 +18,7 @@ test.describe('Left panel desktop', () => {
     await expect(page.getByTestId('home-button')).toBeVisible();
   });

-  test('focuses page heading after switching the docs filter', async ({
+  test('focuses main content after switching the docs filter', async ({
     page,
   }) => {
     await page.goto('/');
@@ -29,11 +28,8 @@ test.describe('Left panel desktop', () => {
     await page.keyboard.press('Enter');
     await expect(page).toHaveURL(/target=my_docs/);

-    const pageHeading = page.getByRole('heading', {
-      name: 'My docs',
-      level: 2,
-    });
-    await expect(pageHeading).toBeFocused();
+    const mainContent = page.locator('main#mainContent');
+    await expect(mainContent).toBeFocused();
   });

   test('checks resize handle is present and functional on document page', async ({
@@ -122,47 +118,6 @@ test.describe('Left panel mobile', () => {
     await expect(logoutButton).toBeInViewport();
   });

-  test('checks panel closes when clicking on a subdoc', async ({
-    page,
-    browserName,
-  }) => {
-    const [docTitle] = await createDoc(
-      page,
-      'mobile-doc-test',
-      browserName,
-      1,
-      true,
-    );
-
-    const { name: docChild } = await createRootSubPage(
-      page,
-      browserName,
-      'mobile-doc-test-child',
-      true,
-    );
-
-    const { name: docChild2 } = await createRootSubPage(
-      page,
-      browserName,
-      'mobile-doc-test-child-2',
-      true,
-    );
-
-    const header = page.locator('header').first();
-    await header.getByLabel('Open the header menu').click();
-
-    await expect(page.getByTestId('left-panel-mobile')).toBeInViewport();
-
-    const docTree = page.getByTestId('doc-tree');
-    await expect(docTree.getByText(docTitle)).toBeVisible();
-    await expect(docTree.getByText(docChild)).toBeVisible();
-    await expect(docTree.getByText(docChild2)).toBeVisible();
-
-    await docTree.getByText(docChild).click();
-    await verifyDocName(page, docChild);
-    await expect(page.getByTestId('left-panel-mobile')).not.toBeInViewport();
-  });
-
   test('checks resize handle is not present on mobile', async ({ page }) => {
     await page.goto('/');

@@ -1,6 +1,6 @@
 {
   "name": "app-e2e",
-  "version": "4.8.3",
+  "version": "4.8.2",
   "repository": "https://github.com/suitenumerique/docs",
   "author": "DINUM",
   "license": "MIT",
@@ -1,6 +1,6 @@
 {
   "name": "app-impress",
-  "version": "4.8.3",
+  "version": "4.8.2",
   "repository": "https://github.com/suitenumerique/docs",
   "author": "DINUM",
   "license": "MIT",
@@ -64,7 +64,7 @@
     "idb": "8.0.3",
     "lodash": "4.17.23",
     "luxon": "3.7.2",
-    "next": "16.1.7",
+    "next": "16.1.6",
     "posthog-js": "1.347.2",
     "react": "*",
     "react-aria-components": "1.15.1",
@@ -1,9 +0,0 @@
-<svg
-  xmlns="http://www.w3.org/2000/svg"
-  viewBox="0 -960 960 960"
-  fill="currentColor"
->
-  <path
-    d="M360-240q-33 0-56.5-23.5T280-320v-480q0-33 23.5-56.5T360-880h360q33 0 56.5 23.5T800-800v480q0 33-23.5 56.5T720-240H360Zm0-80h360v-480H360v480ZM200-80q-33 0-56.5-23.5T120-160v-560h80v560h440v80H200Zm210-360h60v-180h40v120h60v-120h40v180h60v-200q0-17-11.5-28.5T630-680H450q-17 0-28.5 11.5T410-640v200Zm-50 120v-480 480Z"
-  />
-</svg>
@@ -1,33 +1,63 @@
 import { Button } from '@gouvfr-lasuite/cunningham-react';
-import { useState } from 'react';
+import { useRouter } from 'next/router';
+import { useEffect, useState } from 'react';
 import { useTranslation } from 'react-i18next';
 import { css } from 'styled-components';

 import { Box } from '@/components';
 import { useCunninghamTheme } from '@/cunningham';
 import { MAIN_LAYOUT_ID } from '@/layouts/conf';
-import { focusMainContentStart } from '@/layouts/utils';

 export const SkipToContent = () => {
   const { t } = useTranslation();
+  const router = useRouter();
   const { spacingsTokens } = useCunninghamTheme();
   const [isVisible, setIsVisible] = useState(false);

+  // Reset focus after route change so first TAB goes to skip link
+  useEffect(() => {
+    const handleRouteChange = () => {
+      (document.activeElement as HTMLElement)?.blur();
+
+      document.body.setAttribute('tabindex', '-1');
+      document.body.focus({ preventScroll: true });
+
+      setTimeout(() => {
+        document.body.removeAttribute('tabindex');
+      }, 100);
+    };
+
+    router.events.on('routeChangeComplete', handleRouteChange);
+    return () => {
+      router.events.off('routeChangeComplete', handleRouteChange);
+    };
+  }, [router.events]);
+
   const handleClick = (e: React.MouseEvent) => {
     e.preventDefault();
-    const focusTarget = focusMainContentStart();
-
-    if (focusTarget instanceof HTMLElement) {
-      focusTarget.scrollIntoView({ behavior: 'smooth', block: 'start' });
+    const mainContent = document.getElementById(MAIN_LAYOUT_ID);
+    if (mainContent) {
+      mainContent.focus();
+      mainContent.scrollIntoView({ behavior: 'smooth', block: 'start' });
     }
   };

   return (
-    <Box>
+    <Box
+      $css={css`
+        .c__button--brand--primary.--docs--skip-to-content:focus-visible {
+          box-shadow:
+            0 0 0 1px var(--c--globals--colors--white-000),
+            0 0 0 4px var(--c--contextuals--border--semantic--brand--primary);
+          border-radius: var(--c--globals--spacings--st);
+        }
+      `}
+    >
       <Button
-        href={`#${MAIN_LAYOUT_ID}`}
-        onClick={handleClick}
+        type="button"
         color="brand"
         className="--docs--skip-to-content"
+        onClick={handleClick}
         onFocus={() => setIsVisible(true)}
         onBlur={() => setIsVisible(false)}
         style={{
@@ -35,6 +65,7 @@ export const SkipToContent = () => {
           pointerEvents: isVisible ? 'auto' : 'none',
           position: 'fixed',
           top: spacingsTokens['2xs'],
+          // padding header + logo(32px) + gap(3xs≈4px) + text "Docs"(≈70px) + 12px
+          left: `calc(${spacingsTokens['base']} + 32px + ${spacingsTokens['3xs']} + 70px + 12px)`,
           zIndex: 9999,
           whiteSpace: 'nowrap',
@@ -46,7 +46,7 @@ export const FilterDropdown = ({
       $direction="row"
       $align="center"
     >
-      <Text $weight={400} $variation="tertiary" $theme="neutral">
+      <Text $weight={400} $variation="tertiary" $theme="neutral" $size="sm">
         {selectedOption?.label ?? options[0].label}
       </Text>
       <Icon
@@ -32,7 +32,6 @@ export type QuickSearchProps = {
|
||||
label?: string;
|
||||
placeholder?: string;
|
||||
groupKey?: string;
|
||||
beforeList?: ReactNode;
|
||||
};
|
||||
|
||||
export const QuickSearch = ({
|
||||
@@ -42,7 +41,6 @@ export const QuickSearch = ({
|
||||
showInput = true,
|
||||
label,
|
||||
placeholder,
|
||||
beforeList,
|
||||
children,
|
||||
}: PropsWithChildren<QuickSearchProps>) => {
|
||||
const ref = useRef<HTMLDivElement | null>(null);
|
||||
@@ -78,7 +76,6 @@ export const QuickSearch = ({
|
||||
{inputContent}
|
||||
</QuickSearchInput>
|
||||
)}
|
||||
{beforeList}
|
||||
<Command.List id={listId} aria-label={label} role="listbox">
|
||||
<Box>{children}</Box>
|
||||
</Command.List>
|
||||
|
||||
@@ -1,10 +1,9 @@
 import { Command } from 'cmdk';
-import { PropsWithChildren, useEffect, useRef } from 'react';
+import { PropsWithChildren } from 'react';
 import { useTranslation } from 'react-i18next';

 import { HorizontalSeparator } from '@/components';
 import { useCunninghamTheme } from '@/cunningham';
-import { useFocusStore } from '@/stores';

 import { Box } from '../Box';
 import { Icon } from '../Icon';
@@ -15,6 +14,7 @@ type QuickSearchInputProps = {
   placeholder?: string;
   withSeparator?: boolean;
   listId?: string;
+  isExpanded?: boolean;
 };
 export const QuickSearchInput = ({
   inputValue,
@@ -26,12 +26,6 @@ export const QuickSearchInput = ({
 }: PropsWithChildren<QuickSearchInputProps>) => {
   const { t } = useTranslation();
   const { spacingsTokens } = useCunninghamTheme();
-  const inputRef = useRef<HTMLInputElement>(null);
-  const addLastFocus = useFocusStore((state) => state.addLastFocus);
-
-  useEffect(() => {
-    addLastFocus(inputRef.current);
-  }, [addLastFocus]);

   if (children) {
     return (
@@ -49,11 +43,10 @@ export const QuickSearchInput = ({
         $align="center"
         className="quick-search-input"
         $gap={spacingsTokens['2xs']}
-        $padding={{ horizontal: 'base', vertical: 'xxs' }}
+        $padding={{ horizontal: 'base', vertical: 'xs' }}
       >
         <Icon iconName="search" $variation="secondary" aria-hidden="true" />
         <Command.Input
-          ref={inputRef}
           autoFocus={true}
           aria-label={t('Quick search input')}
           aria-controls={listId}
@@ -65,7 +58,7 @@ export const QuickSearchInput = ({
           data-testid="quick-search-input"
         />
       </Box>
-      {separator && <HorizontalSeparator $margin={{ top: 'base' }} />}
+      {separator && <HorizontalSeparator $margin={{ top: '2xs' }} />}
     </>
   );
 };
@@ -16,9 +16,10 @@ export const QuickSearchStyle = createGlobalStyle`
  }

  [cmdk-input] {
+   font-family: var(--c--globals--font--families--base);
    border: none;
    width: 100%;
-   font-size: 17px;
+   font-size: 16px;
    background: white;
    outline: none;
    color: var(--c--contextuals--content--semantic--neutral--primary);
@@ -2,7 +2,7 @@ import clsx from 'clsx';
 import { useEffect, useState } from 'react';

 import { Box, Loading } from '@/components';
-import { DocHeader } from '@/docs/doc-header/';
+import { DocHeader, FloatingBar } from '@/docs/doc-header/';
 import {
   Doc,
   LinkReach,
@@ -35,6 +35,7 @@ export const DocEditorContainer = ({

   return (
     <>
+      {isDesktop && <FloatingBar />}
       <Box
         $maxWidth="868px"
         $width="100%"
@@ -8,37 +8,12 @@ export const cssComments = (
   & .--docs--main-editor .ProseMirror {
     // Comments marks in the editor
     .bn-editor {
-      // Resets blocknote comments styles
-      .bn-thread-mark,
-      .bn-thread-mark-selected {
-        background-color: transparent;
+      .bn-thread-mark:not([data-orphan='true']),
+      .bn-thread-mark-selected:not([data-orphan='true']) {
+        background: ${canSeeComment ? '#EDB40066' : 'transparent'};
         color: var(--c--globals--colors--gray-700);
       }

-      ${canSeeComment &&
-      css`
-        .bn-thread-mark:not([data-orphan='true']) {
-          background-color: color-mix(
-            in srgb,
-            var(--c--contextuals--background--palette--yellow--tertiary) 40%,
-            transparent
-          );
-          border-bottom: 2px solid
-            var(--c--contextuals--background--palette--yellow--secondary);
-
-          mix-blend-mode: multiply;
-
-          transition:
-            background-color var(--c--globals--transitions--duration),
-            border-bottom-color var(--c--globals--transitions--duration);
-
-          &:has(.bn-thread-mark-selected) {
-            background-color: var(
-              --c--contextuals--background--palette--yellow--tertiary
-            );
-          }
-        }
-      `}
-
       [data-show-selection] {
         color: HighlightText;
       }
@@ -1,83 +1,11 @@
-import { announce } from '@react-aria/live-announcer';
-import { useCallback, useEffect } from 'react';
-import { useTranslation } from 'react-i18next';
+import { useEffect } from 'react';

 import { DocsBlockNoteEditor } from '../types';

-const getFormattingShortcutLabel = (
-  event: KeyboardEvent,
-  t: (key: string) => string,
-): string | null => {
-  const isMod = event.ctrlKey || event.metaKey;
-  if (!isMod) {
-    return null;
-  }
-
-  if (event.altKey) {
-    switch (event.code) {
-      case 'Digit1':
-        return t('Heading 1 applied');
-      case 'Digit2':
-        return t('Heading 2 applied');
-      case 'Digit3':
-        return t('Heading 3 applied');
-      default:
-        return null;
-    }
-  }
-
-  if (event.shiftKey) {
-    switch (event.code) {
-      case 'Digit0':
-        return t('Paragraph applied');
-      case 'Digit6':
-        return t('Toggle list applied');
-      case 'Digit7':
-        return t('Numbered list applied');
-      case 'Digit8':
-        return t('Bulleted list applied');
-      case 'Digit9':
-        return t('Checklist applied');
-      case 'KeyC':
-        return t('Code block applied');
-      default:
-        return null;
-    }
-  }
-
-  return null;
-};
-
 export const useShortcuts = (
   editor: DocsBlockNoteEditor,
   el: HTMLDivElement | null,
 ) => {
-  const { t } = useTranslation();
-
-  const handleFormattingShortcut = useCallback(
-    (event: KeyboardEvent) => {
-      if (!editor?.isFocused()) {
-        return;
-      }
-
-      const label = getFormattingShortcutLabel(event, t);
-      if (label) {
-        setTimeout(() => {
-          announce(label, 'assertive');
-        }, 150);
-      }
-    },
-    [editor, t],
-  );
-
-  useEffect(() => {
-    el?.addEventListener('keydown', handleFormattingShortcut, true);
-
-    return () => {
-      el?.removeEventListener('keydown', handleFormattingShortcut, true);
-    };
-  }, [el, handleFormattingShortcut]);
-
   useEffect(() => {
     // Check if editor and its view are mounted
     if (!editor || !el) {
@@ -60,23 +60,6 @@ export const ModalExport = ({ onClose, doc }: ModalExportProps) => {
   const { untitledDocument } = useTrans();
   const mediaUrl = useMediaUrl();

-  const formatOptions = [
-    { label: t('PDF'), value: DocDownloadFormat.PDF },
-    { label: t('Docx'), value: DocDownloadFormat.DOCX },
-    { label: t('ODT'), value: DocDownloadFormat.ODT },
-    { label: t('HTML'), value: DocDownloadFormat.HTML },
-    { label: t('Print'), value: DocDownloadFormat.PRINT },
-  ];
-
-  const formatLabels = Object.fromEntries(
-    formatOptions.map((opt) => [opt.value, opt.label]),
-  );
-
-  const downloadButtonAriaLabel =
-    format === DocDownloadFormat.PRINT
-      ? t('Print')
-      : t('Download {{format}}', { format: formatLabels[format] });
-
   async function onSubmit() {
     if (!editor) {
       toast(t('The export failed'), VariantType.ERROR);
@@ -228,7 +211,9 @@ export const ModalExport = ({ onClose, doc }: ModalExportProps) => {
         </Button>
         <Button
           data-testid="doc-export-download-button"
-          aria-label={downloadButtonAriaLabel}
+          aria-label={
+            format === DocDownloadFormat.PRINT ? t('Print') : t('Download')
+          }
           variant="primary"
           fullWidth
           onClick={() => void onSubmit()}
@@ -275,7 +260,13 @@ export const ModalExport = ({ onClose, doc }: ModalExportProps) => {
           clearable={false}
           fullWidth
           label={t('Format')}
-          options={formatOptions}
+          options={[
+            { label: t('PDF'), value: DocDownloadFormat.PDF },
+            { label: t('Docx'), value: DocDownloadFormat.DOCX },
+            { label: t('ODT'), value: DocDownloadFormat.ODT },
+            { label: t('HTML'), value: DocDownloadFormat.HTML },
+            { label: t('Print'), value: DocDownloadFormat.PRINT },
+          ]}
           value={format}
           onChange={(options) =>
             setFormat(options.target.value as DocDownloadFormat)
@@ -17,7 +17,6 @@ import GroupSVG from '@/assets/icons/ui-kit/group.svg';
 import HistorySVG from '@/assets/icons/ui-kit/history.svg';
 import KeepSVG from '@/assets/icons/ui-kit/keep.svg';
 import KeepOffSVG from '@/assets/icons/ui-kit/keep_off.svg';
-import MarkdownCopySVG from '@/assets/icons/ui-kit/markdown_copy.svg';
 import {
   Box,
   DropdownMenu,
@@ -194,7 +193,7 @@ export const DocToolBox = ({ doc }: DocToolBoxProps) => {
       },
       {
         label: t('Copy as {{format}}', { format: 'Markdown' }),
-        icon: <MarkdownCopySVG width={24} height={24} />,
+        icon: <ContentCopySVG width={24} height={24} />,
         callback: () => {
           void copyCurrentEditorToClipboard('markdown');
         },
@@ -13,8 +13,7 @@ export const useCopyCurrentEditorToClipboard = () => {

   return async (asFormat: 'html' | 'markdown') => {
     if (!editor) {
-      const message = t('Editor unavailable');
-      toast(message, VariantType.ERROR, { duration: 3000 });
+      toast(t('Editor unavailable'), VariantType.ERROR, { duration: 3000 });
       return;
     }

@@ -24,20 +23,10 @@ export const useCopyCurrentEditorToClipboard = () => {
         ? await editor.blocksToHTMLLossy()
         : await editor.blocksToMarkdownLossy();
       await navigator.clipboard.writeText(editorContentFormatted);
-      const successMessage =
-        asFormat === 'markdown'
-          ? t('Copied as Markdown to clipboard')
-          : t('Copied to clipboard');
-
-      toast(successMessage, VariantType.SUCCESS, { duration: 3000 });
+      toast(t('Copied to clipboard'), VariantType.SUCCESS, { duration: 3000 });
     } catch (error) {
       console.error(error);
-      const errorMessage =
-        asFormat === 'markdown'
-          ? t('Failed to copy as Markdown to clipboard')
-          : t('Failed to copy to clipboard');
-
-      toast(errorMessage, VariantType.ERROR, {
+      toast(t('Failed to copy to clipboard'), VariantType.ERROR, {
         duration: 3000,
       });
     }
@@ -1,6 +1,4 @@
-import { announce } from '@react-aria/live-announcer';
 import { useMutation, useQueryClient } from '@tanstack/react-query';
-import { useTranslation } from 'react-i18next';

 import { APIError, errorCauses, fetchAPI } from '@/api';

@@ -31,8 +29,6 @@ export function useCreateFavoriteDoc({
   listInvalidQueries,
 }: CreateFavoriteDocProps) {
   const queryClient = useQueryClient();
-  const { t } = useTranslation();
-
   return useMutation<void, APIError, CreateFavoriteDocParams>({
     mutationFn: createFavoriteDoc,
     onSuccess: () => {
@@ -41,15 +37,7 @@ export function useCreateFavoriteDoc({
           queryKey: [queryKey],
         });
       });
-
-      const message = t('Document pinned successfully!');
-      announce(message, 'polite');
-
       onSuccess?.();
     },
-    onError: () => {
-      const message = t('Failed to pin the document.');
-      announce(message, 'assertive');
-    },
   });
 }
@@ -1,6 +1,4 @@
|
||||
import { announce } from '@react-aria/live-announcer';
|
||||
import { useMutation, useQueryClient } from '@tanstack/react-query';
|
||||
import { useTranslation } from 'react-i18next';
|
||||
|
||||
import { APIError, errorCauses, fetchAPI } from '@/api';
|
||||
|
||||
@@ -31,8 +29,6 @@ export function useDeleteFavoriteDoc({
|
||||
listInvalidQueries,
|
||||
}: DeleteFavoriteDocProps) {
|
||||
const queryClient = useQueryClient();
|
||||
const { t } = useTranslation();
|
||||
|
||||
return useMutation<void, APIError, DeleteFavoriteDocParams>({
|
||||
mutationFn: deleteFavoriteDoc,
|
||||
onSuccess: () => {
|
||||
@@ -41,15 +37,7 @@ export function useDeleteFavoriteDoc({
|
||||
queryKey: [queryKey],
|
||||
});
|
||||
});
|
||||
|
||||
const message = t('Document unpinned successfully!');
|
||||
announce(message, 'polite');
|
||||
|
||||
onSuccess?.();
|
||||
},
|
||||
onError: () => {
|
||||
const message = t('Failed to unpin the document.');
|
||||
announce(message, 'assertive');
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
@@ -88,16 +88,14 @@ export function useDuplicateDoc(options?: DuplicateDocOptions) {
|
||||
queryKey: [KEY_LIST_DOC],
|
||||
});
|
||||
|
||||
const message = t('Document duplicated successfully!');
|
||||
toast(message, VariantType.SUCCESS, {
|
||||
toast(t('Document duplicated successfully!'), VariantType.SUCCESS, {
|
||||
duration: 3000,
|
||||
});
|
||||
|
||||
void options?.onSuccess?.(data, variables, onMutateResult, context);
|
||||
},
|
||||
onError: (error, variables, onMutateResult, context) => {
|
||||
const message = t('Failed to duplicate the document...');
|
||||
toast(message, VariantType.ERROR, {
|
||||
toast(t('Failed to duplicate the document...'), VariantType.ERROR, {
|
||||
duration: 3000,
|
||||
});
|
||||
|
||||
|
||||
@@ -30,8 +30,6 @@ const defaultValues = {
|
||||
|
||||
type ExtendedCloseEvent = CloseEvent & { wasClean: boolean };
|
||||
|
||||
let reconnectTimeout: ReturnType<typeof setTimeout> | undefined;
|
||||
|
||||
export const useProviderStore = create<UseCollaborationStore>((set, get) => ({
|
||||
...defaultValues,
|
||||
createProvider: (wsUrl, storeId, initialDoc) => {
|
||||
@@ -50,20 +48,7 @@ export const useProviderStore = create<UseCollaborationStore>((set, get) => ({
|
||||
onDisconnect(data) {
|
||||
// Attempt to reconnect if the disconnection was clean (initiated by the client or server)
|
||||
if ((data.event as ExtendedCloseEvent).wasClean) {
|
||||
if (data.event.reason === 'No cookies' && data.event.code === 4001) {
|
||||
console.error(
|
||||
'Disconnection due to missing cookies. Not attempting to reconnect.',
|
||||
);
|
||||
void provider.disconnect();
|
||||
set({
|
||||
isReady: true,
|
||||
isConnected: false,
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
clearTimeout(reconnectTimeout);
|
||||
reconnectTimeout = setTimeout(() => void provider.connect(), 1000);
|
||||
void provider.connect();
|
||||
}
|
||||
},
|
||||
onAuthenticationFailed() {
|
||||
@@ -122,7 +107,6 @@ export const useProviderStore = create<UseCollaborationStore>((set, get) => ({
|
||||
return provider;
|
||||
},
|
||||
destroyProvider: () => {
|
||||
clearTimeout(reconnectTimeout);
|
||||
const provider = get().provider;
|
||||
if (provider) {
|
||||
provider.destroy();
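For reference, the reconnect handling in the hunk above boils down to a debounce: rapid-fire disconnect events should produce a single delayed `connect()` call, and teardown should drop any pending attempt. A minimal sketch of that pattern, assuming illustrative names (`makeReconnector` is not part of the codebase):

```typescript
type Timer = ReturnType<typeof setTimeout>;

// Illustrative sketch of a debounced reconnect, mirroring the
// clearTimeout/setTimeout pair used in the provider store above.
function makeReconnector(connect: () => void, delayMs = 1000) {
  let timer: Timer | undefined;
  return {
    // Collapse a burst of disconnect events into one reconnect attempt.
    schedule(): void {
      clearTimeout(timer);
      timer = setTimeout(connect, delayMs);
    },
    // Mirror of destroyProvider(): cancel any pending attempt on teardown.
    cancel(): void {
      clearTimeout(timer);
    },
  };
}
```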

@@ -1,4 +1,3 @@
import { announce } from '@react-aria/live-announcer';
import { t } from 'i18next';
import { useEffect, useState } from 'react';
import { InView } from 'react-intersection-observer';
@@ -74,12 +73,10 @@ export const DocSearchContent = ({
docs = docs.filter(filterResults);
}

const elements = search || isSearchNotMandatory ? docs : [];

setDocsData({
groupName: docs.length > 0 ? groupName : '',
groupKey: 'docs',
elements,
elements: search || isSearchNotMandatory ? docs : [],
emptyString: t('No document found'),
endActions: hasNextPage
? [
@@ -93,13 +90,6 @@ export const DocSearchContent = ({
]
: [],
});

if (search) {
announce(
t('{{count}} result(s) available', { count: elements.length }),
'polite',
);
}
}, [
search,
data?.pages,

@@ -38,19 +38,19 @@ export const DocSearchFilters = ({
$justify="space-between"
$gap="10px"
data-testid="doc-search-filters"
$margin={{ vertical: 'base' }}
$margin={{ vertical: 'sm' }}
>
<Box $direction="row" $align="center" $gap="10px">
<FilterDropdown
selectedValue={values?.target}
options={[
{
label: t('All docs'),
label: t('All documents'),
value: DocSearchTarget.ALL,
callback: () => handleTargetChange(DocSearchTarget.ALL),
},
{
label: t('Current doc'),
label: t('Current document only'),
value: DocSearchTarget.CURRENT,
callback: () => handleTargetChange(DocSearchTarget.CURRENT),
},
@@ -58,13 +58,7 @@ export const DocSearchFilters = ({
/>
</Box>
{hasFilters && (
<Button
color="brand"
variant="tertiary"
size="small"
onClick={onReset}
aria-label={t('Reset search filters')}
>
<Button color="brand" variant="tertiary" size="small" onClick={onReset}>
{t('Reset')}
</Button>
)}

@@ -14,7 +14,7 @@ import {
DocSearchFiltersValues,
DocSearchTarget,
} from '@/docs/doc-search';
import { useFocusStore, useResponsiveStore } from '@/stores';
import { useResponsiveStore } from '@/stores';

import EmptySearchIcon from '../assets/illustration-docs-empty.png';

@@ -36,7 +36,6 @@ const DocSearchModalGlobal = ({
}: DocSearchModalGlobalProps) => {
const { t } = useTranslation();
const [loading, setLoading] = useState(false);
const restoreFocus = useFocusStore((state) => state.restoreFocus);
const router = useRouter();
const [search, setSearch] = useState('');
const [filters, setFilters] = useState<DocSearchFiltersValues>(
@@ -52,7 +51,6 @@ const DocSearchModalGlobal = ({

const handleResetFilters = () => {
setFilters({});
restoreFocus();
};

return (
@@ -62,52 +60,45 @@ const DocSearchModalGlobal = ({
size={isDesktop ? ModalSize.LARGE : ModalSize.FULL}
hideCloseButton
aria-describedby="doc-search-modal-title"
title={
<>
<Text as="h2" $margin="0" $size="s" $align="flex-start">
{t('Search for a document')}
</Text>
<Box $position="absolute" $css="top: 4px; right: 4px;">
<ButtonCloseModal
aria-label={t('Close the search modal')}
onClick={modalProps.onClose}
size="small"
color="brand"
variant="tertiary"
/>
</Box>
</>
}
>
<Box
aria-label={t('Search modal')}
$direction="column"
$justify="space-between"
className="--docs--doc-search-modal"
$padding={{ vertical: 'base' }}
$padding={{ bottom: 'base' }}
>
<Text
as="h1"
$margin="0"
id="doc-search-modal-title"
className="sr-only"
>
{t('Search docs')}
</Text>
<Box $position="absolute" $css="top: 4px; right: 4px;">
<ButtonCloseModal
aria-label={t('Close the search modal')}
onClick={modalProps.onClose}
size="small"
color="brand"
variant="tertiary"
/>
</Box>
<QuickSearch
label={t('Search documents')}
placeholder={t('Type the name of a document')}
loading={loading}
onFilter={handleInputSearch}
beforeList={
showFilters ? (
<Box $padding={{ horizontal: '10px' }}>
<DocSearchFilters
values={filters}
onValuesChange={setFilters}
onReset={handleResetFilters}
/>
</Box>
) : undefined
}
>
<Box
$padding={{ horizontal: '10px', vertical: 'base' }}
$padding={{ horizontal: 'sm', vertical: 'base' }}
$height={isDesktop ? '500px' : 'calc(100vh - 68px - 1rem)'}
>
{showFilters && (
<DocSearchFilters
values={filters}
onValuesChange={setFilters}
onReset={handleResetFilters}
/>
)}
{search.length === 0 && (
<Box
$direction="column"
@@ -115,7 +106,11 @@ const DocSearchModalGlobal = ({
$align="center"
$justify="center"
>
<Image width={320} src={EmptySearchIcon} alt="" />
<Image
width={320}
src={EmptySearchIcon}
alt={t('No active search')}
/>
</Box>
)}
{search && (

@@ -51,13 +51,11 @@ export const Heading = ({

editor.setTextCursorPosition(headingId, 'end');

document
.querySelector<HTMLElement>(`[data-id="${headingId}"]`)
?.scrollIntoView({
behavior: 'smooth',
inline: 'start',
block: 'start',
});
document.querySelector(`[data-id="${headingId}"]`)?.scrollIntoView({
behavior: 'smooth',
inline: 'start',
block: 'start',
});
}}
$radius="var(--c--globals--spacings--st)"
$background={

@@ -91,7 +91,7 @@ export const TableContent = () => {
$height="100%"
$justify="center"
$align="center"
aria-label={t('Show the table of contents')}
aria-label={t('Summary')}
aria-expanded={isOpen}
aria-controls="toc-list"
$css={css`
@@ -218,8 +218,8 @@ const TableContentOpened = ({
onClick={onClose}
$justify="center"
$align="center"
aria-label={t('Hide the table of contents')}
aria-expanded={true}
aria-label={t('Summary')}
aria-expanded="true"
aria-controls="toc-list"
$css={css`
transition: none !important;

@@ -5,8 +5,8 @@ import {
VariantType,
useToastProvider,
} from '@gouvfr-lasuite/cunningham-react';
import { useRouter } from 'next/router';
import { useTranslation } from 'react-i18next';
import { createGlobalStyle } from 'styled-components';

import { Box, Text } from '@/components';
import {
@@ -21,22 +21,15 @@ import { KEY_LIST_DOC_VERSIONS } from '../api/useDocVersions';
import { Versions } from '../types';
import { revertUpdate } from '../utils';

const ModalStyle = createGlobalStyle`
.c__modal__title {
margin-bottom: var(--c--globals--spacings--sm);
}
`;

interface ModalConfirmationVersionProps {
docId: Doc['id'];
onClose: () => void;
onSuccess: () => void;
docId: Doc['id'];

versionId: Versions['version_id'];
}

export const ModalConfirmationVersion = ({
onClose,
onSuccess,
docId,
versionId,
}: ModalConfirmationVersionProps) => {
@@ -46,13 +39,14 @@ export const ModalConfirmationVersion = ({
});
const { t } = useTranslation();
const { toast } = useToastProvider();
const { push } = useRouter();
const { provider } = useProviderStore();
const { mutate: updateDoc } = useUpdateDoc({
listInvalidQueries: [KEY_LIST_DOC_VERSIONS],
onSuccess: () => {
const onDisplaySuccess = () => {
toast(t('Version restored successfully'), VariantType.SUCCESS);
onSuccess();
void push(`/docs/${docId}`);
};

if (!provider || !version?.content) {
@@ -70,10 +64,6 @@ export const ModalConfirmationVersion = ({
},
});

if (!version) {
return null;
}

return (
<Modal
isOpen
@@ -112,7 +102,7 @@ export const ModalConfirmationVersion = ({
</Button>
</>
}
size={ModalSize.MEDIUM}
size={ModalSize.SMALL}
title={
<Text
as="h1"
@@ -121,17 +111,17 @@ export const ModalConfirmationVersion = ({
$size="h6"
$align="flex-start"
>
{t('Restoring an older version')}
{t('Warning')}
</Text>
}
>
<ModalStyle />
<Box className="--docs--modal-confirmation-version">
<Box>
<Text $variation="secondary" as="p" $margin="none">
{t(
"The current document will be replaced, but you'll still find it in the version history.",
)}
<Text $variation="secondary" as="p">
{t('Your current document will revert to this version.')}
</Text>
<Text $variation="secondary" as="p">
{t('If a member is editing, his works can be lost.')}
</Text>
</Box>
</Box>

@@ -114,12 +114,11 @@ export const ModalSelectVersion = ({
$height="calc(100vh - 2em - 12px)"
$css={css`
overflow-y: hidden;
border-left: 1px solid
var(--c--contextuals--border--surface--primary);
border-left: 1px solid var(--c--globals--colors--gray-200);
`}
>
<Box
aria-label={t('Version list')}
aria-label="version list"
$css={css`
overflow-y: auto;
flex: 1;
@@ -131,8 +130,7 @@ export const ModalSelectVersion = ({
$direction="row"
$align="center"
$css={css`
border-bottom: 1px solid
var(--c--contextuals--border--surface--primary);
border-bottom: 1px solid var(--c--globals--colors--gray-200);
`}
$padding="sm"
>
@@ -157,8 +155,7 @@ export const ModalSelectVersion = ({
<Box
$padding="xs"
$css={css`
border-top: 1px solid
var(--c--contextuals--border--surface--primary);
border-top: 1px solid var(--c--globals--colors--gray-200);
`}
>
<Button
@@ -177,9 +174,6 @@ export const ModalSelectVersion = ({
<ModalConfirmationVersion
onClose={() => {
restoreModal.close();
}}
onSuccess={() => {
restoreModal.close();
onClose();
setSelectedVersionId(undefined);
}}

@@ -1,38 +1,78 @@
import { useTranslation } from 'react-i18next';
import dynamic from 'next/dynamic';
import { useState } from 'react';

import { BoxButton, Text } from '@/components';
import { Box, Text } from '@/components';
import { useCunninghamTheme } from '@/cunningham';
import { Doc } from '@/docs/doc-management';

import { Versions } from '../types';

const ModalConfirmationVersion = dynamic(
() =>
import('./ModalConfirmationVersion').then((mod) => ({
default: mod.ModalConfirmationVersion,
})),
{ ssr: false },
);

interface VersionItemProps {
docId: Doc['id'];
text: string;

versionId?: Versions['version_id'];
isActive: boolean;
onSelect?: () => void;
}

export const VersionItem = ({ text, isActive, onSelect }: VersionItemProps) => {
const { t } = useTranslation();
const { spacingsTokens } = useCunninghamTheme();
export const VersionItem = ({
docId,
versionId,
text,

isActive,
}: VersionItemProps) => {
const { colorsTokens, spacingsTokens } = useCunninghamTheme();

const [isModalVersionOpen, setIsModalVersionOpen] = useState(false);

return (
<BoxButton
aria-label={t('Restore version of {{date}}', { date: text })}
aria-pressed={isActive}
$width="100%"
$css={`
background: ${isActive ? 'var(--c--contextuals--background--semantic--overlay--primary)' : 'transparent'};
&:focus-visible, &:hover {
background: var(--c--contextuals--background--semantic--overlay--primary);
}
`}
className="version-item --docs--version-item"
onClick={onSelect}
$radius={spacingsTokens['3xs']}
$padding={{ vertical: 'm', horizontal: 'xs' }}
$hasTransition
>
<Text $weight="bold" $size="sm" $textAlign="left">
{text}
</Text>
</BoxButton>
<>
<Box
$width="100%"
as="li"
$background={isActive ? colorsTokens['gray-100'] : 'transparent'}
$radius={spacingsTokens['3xs']}
$css={`
cursor: pointer;

&:hover {
background: ${colorsTokens['gray-100']};
}
`}
$hasTransition
$minWidth="13rem"
className="--docs--version-item"
>
<Box
$padding={{ vertical: '0.7rem', horizontal: 'small' }}
$align="center"
$direction="row"
$justify="space-between"
$width="100%"
>
<Box $direction="row" $gap="0.5rem" $align="center">
<Text $weight="bold" $size="sm">
{text}
</Text>
</Box>
</Box>
</Box>
{isModalVersionOpen && versionId && (
<ModalConfirmationVersion
onClose={() => setIsModalVersionOpen(false)}
docId={docId}
versionId={versionId}
/>
)}
</>
);
};

@@ -3,7 +3,14 @@ import { DateTime } from 'luxon';
import { useTranslation } from 'react-i18next';

import { APIError } from '@/api';
import { Box, Icon, InfiniteScroll, Text, TextErrors } from '@/components';
import {
Box,
BoxButton,
Icon,
InfiniteScroll,
Text,
TextErrors,
} from '@/components';
import { Doc } from '@/docs/doc-management';
import { useDate } from '@/hooks';

@@ -16,6 +23,7 @@ interface VersionListStateProps {
isLoading: boolean;
error: APIError<unknown> | null;
versions?: Versions[];
doc: Doc;
selectedVersionId?: Versions['version_id'];
onSelectVersion?: (versionId: Versions['version_id']) => void;
}
@@ -23,11 +31,13 @@ interface VersionListStateProps {
const VersionListState = ({
onSelectVersion,
selectedVersionId,

isLoading,
error,
versions,
doc,
}: VersionListStateProps) => {
const { formatDateSpecial } = useDate();
const { formatDate } = useDate();

if (isLoading) {
return (
@@ -38,23 +48,24 @@ const VersionListState = ({
}

return (
<Box $gap="xxs" $padding="xs">
{versions?.map((version) => {
const formattedDate = formatDateSpecial(
version.last_modified,
'dd MMMM · HH:mm',
);
const isSelected = version.version_id === selectedVersionId;
return (
<Box as="li" key={version.version_id} $css="list-style: none;">
<VersionItem
text={formattedDate}
isActive={isSelected}
onSelect={() => onSelectVersion?.(version.version_id)}
/>
</Box>
);
})}
<Box $gap="10px" $padding="xs">
{versions?.map((version) => (
<BoxButton
aria-label="version item"
className="version-item"
key={version.version_id}
onClick={() => {
onSelectVersion?.(version.version_id);
}}
>
<VersionItem
versionId={version.version_id}
text={formatDate(version.last_modified, DateTime.DATETIME_MED)}
docId={doc.id}
isActive={version.version_id === selectedVersionId}
/>
</BoxButton>
))}
{error && (
<Box
$justify="center"
@@ -86,7 +97,6 @@ export const VersionList = ({
selectedVersionId,
}: VersionListProps) => {
const { t } = useTranslation();
const { formatDate } = useDate();

const {
data,
@@ -102,12 +112,6 @@ export const VersionList = ({
const versions = data?.pages.reduce((acc, page) => {
return acc.concat(page.versions);
}, [] as Versions[]);
const selectedVersion = versions?.find(
(version) => version.version_id === selectedVersionId,
);
const selectedVersionDate = selectedVersion
? formatDate(selectedVersion.last_modified, DateTime.DATETIME_MED)
: null;

return (
<Box
@@ -123,7 +127,7 @@ export const VersionList = ({
as="ul"
$padding="none"
$margin={{ top: 'none' }}
role="list"
role="listbox"
>
{versions?.length === 0 && (
<Box $align="center" $margin="large">
@@ -137,14 +141,10 @@ export const VersionList = ({
isLoading={isLoading}
error={error}
versions={versions}
doc={doc}
selectedVersionId={selectedVersionId}
/>
</InfiniteScroll>
<Text className="sr-only" aria-live="polite">
{selectedVersionDate
? t('Selected version {{date}}', { date: selectedVersionDate })
: ''}
</Text>
</Box>
);
};
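The version list above flattens react-query's paginated result with a `reduce` over `data?.pages`. That step can be sketched generically as follows (`Page` and `flattenPages` are illustrative names, not exports of the codebase):

```typescript
// Each page of the paginated API response carries an array of versions.
interface Page<T> {
  versions: T[];
}

// Concatenate every page's `versions` into one flat list, tolerating the
// `undefined` that react-query exposes before the first fetch completes.
function flattenPages<T>(pages: Page<T>[] | undefined): T[] {
  return (pages ?? []).reduce((acc, page) => acc.concat(page.versions), [] as T[]);
}
```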

@@ -116,9 +116,7 @@ export const DocsGrid = ({
$padding={{
bottom: 'md',
}}
{...(withUpload
? getRootProps({ className: 'dropzone', tabIndex: -1 })
: {})}
{...(withUpload ? getRootProps({ className: 'dropzone' }) : {})}
>
{withUpload && <input {...getInputProps()} />}
<DocGridTitleBar

@@ -19,7 +19,7 @@ import {
useDuplicateDoc,
useTrans,
} from '@/docs/doc-management';
import { focusMainContentStart } from '@/layouts/utils';
import { MAIN_LAYOUT_ID } from '@/layouts/conf';
import { useFocusStore } from '@/stores';

import { DocMoveModal } from './DocMoveModal';
@@ -55,9 +55,10 @@ export const DocsGridActions = ({ doc }: DocsGridActionsProps) => {

const { mutate: duplicateDoc } = useDuplicateDoc({
onSuccess: () => {
requestAnimationFrame(() => {
focusMainContentStart({ preventScroll: true });
});
const mainContent = document.getElementById(MAIN_LAYOUT_ID);
if (mainContent) {
requestAnimationFrame(() => mainContent.focus());
}
},
});


@@ -1,5 +1,3 @@
import { usePathname } from 'next/navigation';
import { useEffect } from 'react';
import { useTranslation } from 'react-i18next';
import { createGlobalStyle, css } from 'styled-components';

@@ -25,15 +23,11 @@ const MobileLeftPanelStyle = createGlobalStyle`

export const LeftPanel = () => {
const { isDesktop } = useResponsiveStore();
if (isDesktop) {
return <LeftPanelDesktop />;
}

return <LeftPanelMobile />;
};

export const LeftPanelDesktop = () => {
const { t } = useTranslation();

const { spacingsTokens } = useCunninghamTheme();
const { isPanelOpen, isPanelOpenMobile } = useLeftPanelStore();
const isPanelOpenState = isDesktop ? isPanelOpen : isPanelOpenMobile;
const { data: config } = useConfig();
/**
* The onboarding can be disable, so we need to check if it's enabled before displaying the help menu.
@@ -42,51 +36,42 @@ export const LeftPanelDesktop = () => {
*/
const showHelpMenu = config?.theme_customization?.onboarding?.enabled;

return (
<Box
data-testid="left-panel-desktop"
$css={css`
height: calc(100vh - ${HEADER_HEIGHT}px);
width: 100%;
overflow: hidden;
background-color: var(--c--contextuals--background--surface--primary);
`}
className="--docs--left-panel-desktop"
as="nav"
aria-label={t('Document sections')}
>
if (isDesktop) {
return (
<Box
data-testid="left-panel-desktop"
$css={css`
flex: 0 0 auto;
height: calc(100vh - ${HEADER_HEIGHT}px);
width: 100%;
overflow: hidden;
background-color: var(--c--contextuals--background--surface--primary);
`}
className="--docs--left-panel-desktop"
as="nav"
aria-label={t('Document sections')}
>
<LeftPanelHeader />
<Box
$css={css`
flex: 0 0 auto;
`}
>
<LeftPanelHeader />
</Box>
<LeftPanelContent />
{showHelpMenu && (
<SeparatedSection showSeparator={false}>
<Box $padding={{ horizontal: 'sm' }} $justify="flex-start">
<HelpMenu />
</Box>
</SeparatedSection>
)}
</Box>
<LeftPanelContent />
{showHelpMenu && (
<SeparatedSection showSeparator={false}>
<Box $padding={{ horizontal: 'sm' }} $justify="flex-start">
<HelpMenu />
</Box>
</SeparatedSection>
)}
</Box>
);
};

const LeftPanelMobile = () => {
const { t } = useTranslation();
const { spacingsTokens } = useCunninghamTheme();
const { closePanel, isPanelOpenMobile } = useLeftPanelStore();
const pathname = usePathname();

useEffect(() => {
closePanel({ type: 'mobile' });
}, [pathname, closePanel]);
);
}

return (
<>
{isPanelOpenMobile && <MobileLeftPanelStyle />}
{isPanelOpenState && <MobileLeftPanelStyle />}
<Box
$hasTransition
$height="100vh"
@@ -96,7 +81,7 @@ const LeftPanelMobile = () => {
height: calc(100dvh - 52px);
border-right: 1px solid var(--c--globals--colors--gray-200);
position: fixed;
transform: translateX(${isPanelOpenMobile ? '0' : '-100dvw'});
transform: translateX(${isPanelOpenState ? '0' : '-100dvw'});
background-color: var(--c--contextuals--background--surface--primary);
overflow-y: auto;
overflow-x: hidden;

@@ -18,46 +18,8 @@ export const LeftPanelCollapseButton = () => {
const { isPanelOpen, togglePanel } = useLeftPanelStore();
const { currentDoc } = useDocStore();
const [isDocTitleVisible, setIsDocTitleVisible] = useState(true);
const [isDocTitleInDom, setIsDocTitleInDom] = useState(true);

/**
* CLASS_DOC_TITLE is not every time in the DOM when
* this component is rendered, we need to observe the DOM
* to know when it is added, then we can observe
* its visibility.
*/
useEffect(() => {
setIsDocTitleInDom(false);

const docTitleEl = document.querySelector(`.${CLASS_DOC_TITLE}`);
if (docTitleEl) {
setIsDocTitleInDom(true);
return;
}

const mutationObserver = new MutationObserver(() => {
if (document.querySelector(`.${CLASS_DOC_TITLE}`)) {
mutationObserver.disconnect();
setIsDocTitleInDom(true);
}
});

mutationObserver.observe(document.body, { childList: true, subtree: true });

return () => {
mutationObserver.disconnect();
};
}, [currentDoc?.id]);

/**
* When the doc title is in the DOM, we observe its
* visibility to show or hide the collapse button accordingly
*/
useEffect(() => {
if (!isDocTitleInDom) {
return;
}

const mainContent = document.getElementById(MAIN_LAYOUT_ID);
const docTitleEl = document.querySelector(`.${CLASS_DOC_TITLE}`);

@@ -81,7 +43,7 @@ export const LeftPanelCollapseButton = () => {
observer.disconnect();
setIsDocTitleVisible(true);
};
}, [isDocTitleInDom]);
}, [currentDoc?.id]);

const { untitledDocument } = useTrans();


@@ -2,6 +2,7 @@ import {
VariantType,
useToastProvider,
} from '@gouvfr-lasuite/cunningham-react';
import { announce } from '@react-aria/live-announcer';
import { useCallback } from 'react';
import { useTranslation } from 'react-i18next';

@@ -18,12 +19,14 @@ export const useClipboard = () => {
toast(message, VariantType.SUCCESS, {
duration: 3000,
});
announce(message, 'polite');
})
.catch(() => {
const message = errorMessage ?? t('Failed to copy to clipboard');
toast(message, VariantType.ERROR, {
duration: 3000,
});
announce(message, 'assertive');
});
},
[t, toast],

@@ -21,10 +21,6 @@ export const useDate = () => {
.toLocaleString(format);
};

const formatDateSpecial = (date: string, format: string): string => {
return DateTime.fromISO(date).setLocale(i18n.language).toFormat(format);
};

const relativeDate = (date: string): string => {
const dateToCompare = DateTime.fromISO(date);

@@ -49,5 +45,5 @@ export const useDate = () => {
),
);

return { formatDate, formatDateSpecial, relativeDate, calculateDaysLeft };
return { formatDate, relativeDate, calculateDaysLeft };
};
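The `useDate` hunk above formats dates locale-aware via Luxon, whose `toLocaleString` presets (such as `DATETIME_MED`) delegate to the platform's `Intl.DateTimeFormat`. A rough standard-library sketch of that kind of formatting, with an illustrative helper name and option set not taken from the codebase:

```typescript
// Locale-aware date-time formatting using only Intl; the options roughly
// approximate a medium date-time preset (e.g. "Mar 19, 2026, 12:00 PM").
function formatDateMed(iso: string, locale = 'en-US'): string {
  return new Intl.DateTimeFormat(locale, {
    year: 'numeric',
    month: 'short',
    day: 'numeric',
    hour: 'numeric',
    minute: 'numeric',
  }).format(new Date(iso));
}
```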
|
||||
|
||||
@@ -1,10 +1,7 @@
|
||||
import { useRouter } from 'next/router';
|
||||
import { useEffect, useRef } from 'react';
|
||||
|
||||
import {
|
||||
focusMainContentStart,
|
||||
getMainContentFocusTarget,
|
||||
} from '@/layouts/utils';
|
||||
import { MAIN_LAYOUT_ID } from '@/layouts/conf';
|
||||
|
||||
export const useRouteChangeCompleteFocus = () => {
|
||||
const router = useRouter();
|
||||
@@ -28,24 +25,27 @@ export const useRouteChangeCompleteFocus = () => {
|
||||
lastCompletedPathRef.current = normalizedUrl;
|
||||
|
||||
requestAnimationFrame(() => {
|
||||
const focusTarget = getMainContentFocusTarget();
|
||||
const mainContent =
|
||||
document.getElementById(MAIN_LAYOUT_ID) ??
|
||||
document.getElementsByTagName('main')[0];
|
||||
|
||||
if (!focusTarget) {
|
||||
if (!mainContent) {
|
||||
return;
|
||||
}
|
||||
|
||||
const firstHeading = mainContent.querySelector('h1, h2, h3');
|
||||
const prefersReducedMotion = window.matchMedia(
|
||||
'(prefers-reduced-motion: reduce)',
|
||||
).matches;
|
||||
|
||||
if (isKeyboardNavigationRef.current) {
|
||||
focusMainContentStart({ preventScroll: true });
|
||||
(mainContent as HTMLElement | null)?.focus({ preventScroll: true });
|
||||
isKeyboardNavigationRef.current = false;
|
||||
}
|
||||
if (router.pathname === '/docs/[id]') {
|
||||
return;
|
||||
}
|
||||
focusTarget.scrollIntoView({
|
||||
(firstHeading ?? mainContent)?.scrollIntoView({
|
||||
behavior: prefersReducedMotion ? 'auto' : 'smooth',
|
||||
block: 'start',
|
||||
});
|
||||
|
||||
@@ -3,6 +3,7 @@ import { useTranslation } from 'react-i18next';
import { css } from 'styled-components';

import { Box } from '@/components';
import { useCunninghamTheme } from '@/cunningham';
import { Header } from '@/features/header';
import { HEADER_HEIGHT } from '@/features/header/conf';
import { LeftPanel, ResizableLeftPanel } from '@/features/left-panel';
@@ -93,6 +94,7 @@ const MainContent = ({
  const { isDesktop } = useResponsiveStore();

  const { t } = useTranslation();
  const { colorsTokens } = useCunninghamTheme();
  const currentBackgroundColor = !isDesktop ? 'white' : backgroundColor;

  return (
@@ -101,6 +103,7 @@ const MainContent = ({
      role="main"
      aria-label={t('Main content')}
      id={MAIN_LAYOUT_ID}
      tabIndex={-1}
      $align="center"
      $flex={1}
      $width="100%"
@@ -117,6 +120,14 @@ const MainContent = ({
      $css={css`
        overflow-y: auto;
        overflow-x: clip;
        &:focus-visible::after {
          content: '';
          position: absolute;
          inset: 0;
          border: 3px solid ${colorsTokens['brand-400']};
          pointer-events: none;
          z-index: 2001;
        }
      `}
    >
      <Skeleton>
@@ -1,37 +0,0 @@
import { MAIN_LAYOUT_ID } from './conf';

export const getMainContentElement = (): HTMLElement | null =>
  document.getElementById(MAIN_LAYOUT_ID) ??
  document.getElementsByTagName('main')[0] ??
  null;

export const getMainContentFocusTarget = (): HTMLElement | null => {
  const mainContent = getMainContentElement();

  if (!mainContent) {
    return null;
  }

  const firstHeading =
    mainContent.querySelector('h1') ?? mainContent.querySelector('h2');

  return firstHeading instanceof HTMLElement ? firstHeading : mainContent;
};

export const focusMainContentStart = (
  options?: FocusOptions,
): HTMLElement | null => {
  const focusTarget = getMainContentFocusTarget();

  if (!focusTarget) {
    return null;
  }

  if (!focusTarget.hasAttribute('tabindex')) {
    focusTarget.setAttribute('tabindex', '-1');
  }

  focusTarget.focus(options);

  return focusTarget;
};
@@ -19,7 +19,6 @@ import {
  useTrans,
} from '@/docs/doc-management/';
import { KEY_AUTH, setAuthUrl, useAuth } from '@/features/auth';
import { FloatingBar } from '@/features/docs/doc-header/components/FloatingBar';
import { getDocChildren, subPageToTree } from '@/features/docs/doc-tree/';
import { DocEditorSkeleton, useSkeletonStore } from '@/features/skeletons';
import { MainLayout } from '@/layouts';
@@ -61,7 +60,6 @@ export function DocLayout() {
      }}
    >
      <MainLayout enableResizablePanel={true}>
        <FloatingBar />
        <DocPage id={id} />
      </MainLayout>
    </TreeProvider>
@@ -1,6 +1,6 @@
{
  "name": "impress",
  "version": "4.8.3",
  "version": "4.8.2",
  "private": true,
  "repository": "https://github.com/suitenumerique/docs",
  "author": "DINUM",

@@ -1,6 +1,6 @@
{
  "name": "eslint-plugin-docs",
  "version": "4.8.3",
  "version": "4.8.2",
  "repository": "https://github.com/suitenumerique/docs",
  "author": "DINUM",
  "license": "MIT",

@@ -1,6 +1,6 @@
{
  "name": "packages-i18n",
  "version": "4.8.3",
  "version": "4.8.2",
  "repository": "https://github.com/suitenumerique/docs",
  "author": "DINUM",
  "license": "MIT",
@@ -2868,10 +2868,10 @@
    "@emnapi/runtime" "^1.4.3"
    "@tybys/wasm-util" "^0.10.0"

"@next/env@16.1.7":
  version "16.1.7"
  resolved "https://registry.yarnpkg.com/@next/env/-/env-16.1.7.tgz#169482a9a76aab0d9360813df898a88667d79ffc"
  integrity sha512-rJJbIdJB/RQr2F1nylZr/PJzamvNNhfr3brdKP6s/GW850jbtR70QlSfFselvIBbcPUOlQwBakexjFzqLzF6pg==
"@next/env@16.1.6":
  version "16.1.6"
  resolved "https://registry.yarnpkg.com/@next/env/-/env-16.1.6.tgz#0f85979498249a94ef606ef535042a831f905e89"
  integrity sha512-N1ySLuZjnAtN3kFnwhAwPvZah8RJxKasD7x1f8shFqhncnWZn4JMfg37diLNuoHsLAlrDfM3g4mawVdtAG8XLQ==

"@next/eslint-plugin-next@16.1.6":
  version "16.1.6"
@@ -2880,45 +2880,45 @@
  dependencies:
    fast-glob "3.3.1"

"@next/swc-darwin-arm64@16.1.7":
  version "16.1.7"
  resolved "https://registry.yarnpkg.com/@next/swc-darwin-arm64/-/swc-darwin-arm64-16.1.7.tgz#6022bec143c23837ca4744fee4ab48b0f74b0faa"
  integrity sha512-b2wWIE8sABdyafc4IM8r5Y/dS6kD80JRtOGrUiKTsACFQfWWgUQ2NwoUX1yjFMXVsAwcQeNpnucF2ZrujsBBPg==
"@next/swc-darwin-arm64@16.1.6":
  version "16.1.6"
  resolved "https://registry.yarnpkg.com/@next/swc-darwin-arm64/-/swc-darwin-arm64-16.1.6.tgz#fbe1e360efdcc9ebd0a10301518275bc59e12a91"
  integrity sha512-wTzYulosJr/6nFnqGW7FrG3jfUUlEf8UjGA0/pyypJl42ExdVgC6xJgcXQ+V8QFn6niSG2Pb8+MIG1mZr2vczw==

"@next/swc-darwin-x64@16.1.7":
  version "16.1.7"
  resolved "https://registry.yarnpkg.com/@next/swc-darwin-x64/-/swc-darwin-x64-16.1.7.tgz#3f1604e5a59645a0394b74e50791ff2c477167b8"
  integrity sha512-zcnVaaZulS1WL0Ss38R5Q6D2gz7MtBu8GZLPfK+73D/hp4GFMrC2sudLky1QibfV7h6RJBJs/gOFvYP0X7UVlQ==
"@next/swc-darwin-x64@16.1.6":
  version "16.1.6"
  resolved "https://registry.yarnpkg.com/@next/swc-darwin-x64/-/swc-darwin-x64-16.1.6.tgz#0e3781ef3abc8251c2a21addc733d9a87f44829b"
  integrity sha512-BLFPYPDO+MNJsiDWbeVzqvYd4NyuRrEYVB5k2N3JfWncuHAy2IVwMAOlVQDFjj+krkWzhY2apvmekMkfQR0CUQ==

"@next/swc-linux-arm64-gnu@16.1.7":
  version "16.1.7"
  resolved "https://registry.yarnpkg.com/@next/swc-linux-arm64-gnu/-/swc-linux-arm64-gnu-16.1.7.tgz#458c98c8b790efa30bc2866f715352ba3cb47cb5"
  integrity sha512-2ant89Lux/Q3VyC8vNVg7uBaFVP9SwoK2jJOOR0L8TQnX8CAYnh4uctAScy2Hwj2dgjVHqHLORQZJ2wH6VxhSQ==
"@next/swc-linux-arm64-gnu@16.1.6":
  version "16.1.6"
  resolved "https://registry.yarnpkg.com/@next/swc-linux-arm64-gnu/-/swc-linux-arm64-gnu-16.1.6.tgz#b24511af2c6129f2deaf5c8c04d297fe09cd40d7"
  integrity sha512-OJYkCd5pj/QloBvoEcJ2XiMnlJkRv9idWA/j0ugSuA34gMT6f5b7vOiCQHVRpvStoZUknhl6/UxOXL4OwtdaBw==

"@next/swc-linux-arm64-musl@16.1.7":
  version "16.1.7"
  resolved "https://registry.yarnpkg.com/@next/swc-linux-arm64-musl/-/swc-linux-arm64-musl-16.1.7.tgz#1a6ae4d0894f8751f8951ca0fed6e103e27bfc7a"
  integrity sha512-uufcze7LYv0FQg9GnNeZ3/whYfo+1Q3HnQpm16o6Uyi0OVzLlk2ZWoY7j07KADZFY8qwDbsmFnMQP3p3+Ftprw==
"@next/swc-linux-arm64-musl@16.1.6":
  version "16.1.6"
  resolved "https://registry.yarnpkg.com/@next/swc-linux-arm64-musl/-/swc-linux-arm64-musl-16.1.6.tgz#9d4ed0565689fc6a867250f994736a5b8c542ccb"
  integrity sha512-S4J2v+8tT3NIO9u2q+S0G5KdvNDjXfAv06OhfOzNDaBn5rw84DGXWndOEB7d5/x852A20sW1M56vhC/tRVbccQ==

"@next/swc-linux-x64-gnu@16.1.7":
  version "16.1.7"
  resolved "https://registry.yarnpkg.com/@next/swc-linux-x64-gnu/-/swc-linux-x64-gnu-16.1.7.tgz#183126cfbf45a96e3b3b37e35b5429e4c48795bd"
  integrity sha512-KWVf2gxYvHtvuT+c4MBOGxuse5TD7DsMFYSxVxRBnOzok/xryNeQSjXgxSv9QpIVlaGzEn/pIuI6Koosx8CGWA==
"@next/swc-linux-x64-gnu@16.1.6":
  version "16.1.6"
  resolved "https://registry.yarnpkg.com/@next/swc-linux-x64-gnu/-/swc-linux-x64-gnu-16.1.6.tgz#cc757f4384e7eab7d3dba704a97f737518bae0d2"
  integrity sha512-2eEBDkFlMMNQnkTyPBhQOAyn2qMxyG2eE7GPH2WIDGEpEILcBPI/jdSv4t6xupSP+ot/jkfrCShLAa7+ZUPcJQ==

"@next/swc-linux-x64-musl@16.1.7":
  version "16.1.7"
  resolved "https://registry.yarnpkg.com/@next/swc-linux-x64-musl/-/swc-linux-x64-musl-16.1.7.tgz#59906d387aa934fc2d066ff6c0ba695ebc904381"
  integrity sha512-HguhaGwsGr1YAGs68uRKc4aGWxLET+NevJskOcCAwXbwj0fYX0RgZW2gsOCzr9S11CSQPIkxmoSbuVaBp4Z3dA==
"@next/swc-linux-x64-musl@16.1.6":
  version "16.1.6"
  resolved "https://registry.yarnpkg.com/@next/swc-linux-x64-musl/-/swc-linux-x64-musl-16.1.6.tgz#ef1341740f29717deea7c6ec27ae6269386e20d1"
  integrity sha512-oicJwRlyOoZXVlxmIMaTq7f8pN9QNbdes0q2FXfRsPhfCi8n8JmOZJm5oo1pwDaFbnnD421rVU409M3evFbIqg==

"@next/swc-win32-arm64-msvc@16.1.7":
  version "16.1.7"
  resolved "https://registry.yarnpkg.com/@next/swc-win32-arm64-msvc/-/swc-win32-arm64-msvc-16.1.7.tgz#e97a31605ca10e5ca493555f840e4972496ce350"
  integrity sha512-S0n3KrDJokKTeFyM/vGGGR8+pCmXYrjNTk2ZozOL1C/JFdfUIL9O1ATaJOl5r2POe56iRChbsszrjMAdWSv7kQ==
"@next/swc-win32-arm64-msvc@16.1.6":
  version "16.1.6"
  resolved "https://registry.yarnpkg.com/@next/swc-win32-arm64-msvc/-/swc-win32-arm64-msvc-16.1.6.tgz#fee8719242aecf9c39c3a66f1f73821f7884dd16"
  integrity sha512-gQmm8izDTPgs+DCWH22kcDmuUp7NyiJgEl18bcr8irXA5N2m2O+JQIr6f3ct42GOs9c0h8QF3L5SzIxcYAAXXw==

"@next/swc-win32-x64-msvc@16.1.7":
  version "16.1.7"
  resolved "https://registry.yarnpkg.com/@next/swc-win32-x64-msvc/-/swc-win32-x64-msvc-16.1.7.tgz#eeaf9fb75de437232e1c8b46d16d1ba8b0c635c2"
  integrity sha512-mwgtg8CNZGYm06LeEd+bNnOUfwOyNem/rOiP14Lsz+AnUY92Zq/LXwtebtUiaeVkhbroRCQ0c8GlR4UT1U+0yg==
"@next/swc-win32-x64-msvc@16.1.6":
  version "16.1.6"
  resolved "https://registry.yarnpkg.com/@next/swc-win32-x64-msvc/-/swc-win32-x64-msvc-16.1.6.tgz#60c27323c30f35722b20fd6d62449fbb768e46d9"
  integrity sha512-NRfO39AIrzBnixKbjuo2YiYhB6o9d8v/ymU9m/Xk8cyVk+k7XylniXkHwjs4s70wedVffc6bQNbufk5v0xEm0A==

"@noble/hashes@^2.0.1":
  version "2.0.1"
@@ -4241,7 +4241,7 @@
    "@react-types/shared" "^3.33.0"
    "@swc/helpers" "^0.5.0"

"@react-aria/live-announcer@3.4.4", "@react-aria/live-announcer@^3.4.2", "@react-aria/live-announcer@^3.4.4":
"@react-aria/live-announcer@^3.4.2", "@react-aria/live-announcer@^3.4.4":
  version "3.4.4"
  resolved "https://registry.yarnpkg.com/@react-aria/live-announcer/-/live-announcer-3.4.4.tgz#0e6533940222208b323b71d56ac8e115b2121e6a"
  integrity sha512-PTTBIjNRnrdJOIRTDGNifY2d//kA7GUAwRFJNOEwSNG4FW+Bq9awqLiflw0JkpyB0VNIwou6lqKPHZVLsGWOXA==
@@ -8585,6 +8585,11 @@ base64-js@^1.1.2, base64-js@^1.3.0, base64-js@^1.3.1:
  resolved "https://registry.yarnpkg.com/base64-js/-/base64-js-1.5.1.tgz#1b1b440160a5bf7ad40b650f095963481903930a"
  integrity sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA==

baseline-browser-mapping@^2.8.3:
  version "2.10.0"
  resolved "https://registry.yarnpkg.com/baseline-browser-mapping/-/baseline-browser-mapping-2.10.0.tgz#5b09935025bf8a80e29130251e337c6a7fc8cbb9"
  integrity sha512-lIyg0szRfYbiy67j9KN8IyeD7q7hcmqnJ1ddWmNt19ItGpNN64mnllmxUNFIOdOm6by97jlL6wfpTTJrmnjWAA==

baseline-browser-mapping@^2.8.9:
  version "2.8.19"
  resolved "https://registry.yarnpkg.com/baseline-browser-mapping/-/baseline-browser-mapping-2.8.19.tgz#8d99bb7f06bc6ea5c9c1b961e631a1713069bbe0"
@@ -8595,11 +8600,6 @@ baseline-browser-mapping@^2.9.0:
  resolved "https://registry.yarnpkg.com/baseline-browser-mapping/-/baseline-browser-mapping-2.9.12.tgz#fbed5f37edf24b708e6e0b1fb26c70982a577dfc"
  integrity sha512-Mij6Lij93pTAIsSYy5cyBQ975Qh9uLEc5rwGTpomiZeXZL9yIS6uORJakb3ScHgfs0serMMfIbXzokPMuEiRyw==

baseline-browser-mapping@^2.9.19:
  version "2.10.9"
  resolved "https://registry.yarnpkg.com/baseline-browser-mapping/-/baseline-browser-mapping-2.10.9.tgz#8614229add633061c001a0b7c7c85d4b7c44e6ca"
  integrity sha512-OZd0e2mU11ClX8+IdXe3r0dbqMEznRiT4TfbhYIbcRPZkqJ7Qwer8ij3GZAmLsRKa+II9V1v5czCkvmHH3XZBg==

bidi-js@^1.0.2, bidi-js@^1.0.3:
  version "1.0.3"
  resolved "https://registry.yarnpkg.com/bidi-js/-/bidi-js-1.0.3.tgz#6f8bcf3c877c4d9220ddf49b9bb6930c88f877d2"
@@ -13350,26 +13350,26 @@ neo-async@^2.6.2:
  resolved "https://registry.yarnpkg.com/neo-async/-/neo-async-2.6.2.tgz#b4aafb93e3aeb2d8174ca53cf163ab7d7308305f"
  integrity sha512-Yd3UES5mWCSqR+qNT93S3UoYUkqAZ9lLg8a7g9rimsWmYGK8cVToA4/sF3RrshdyV3sAGMXVUmpMYOw+dLpOuw==

next@16.1.7:
  version "16.1.7"
  resolved "https://registry.yarnpkg.com/next/-/next-16.1.7.tgz#fccdda75050ffc11ace27526b8a9ac7c308c8c48"
  integrity sha512-WM0L7WrSvKwoLegLYr6V+mz+RIofqQgVAfHhMp9a88ms0cFX8iX9ew+snpWlSBwpkURJOUdvCEt3uLl3NNzvWg==
next@16.1.6:
  version "16.1.6"
  resolved "https://registry.yarnpkg.com/next/-/next-16.1.6.tgz#24a861371cbe211be7760d9a89ddf2415e3824de"
  integrity sha512-hkyRkcu5x/41KoqnROkfTm2pZVbKxvbZRuNvKXLRXxs3VfyO0WhY50TQS40EuKO9SW3rBj/sF3WbVwDACeMZyw==
  dependencies:
    "@next/env" "16.1.7"
    "@next/env" "16.1.6"
    "@swc/helpers" "0.5.15"
    baseline-browser-mapping "^2.9.19"
    baseline-browser-mapping "^2.8.3"
    caniuse-lite "^1.0.30001579"
    postcss "8.4.31"
    styled-jsx "5.1.6"
  optionalDependencies:
    "@next/swc-darwin-arm64" "16.1.7"
    "@next/swc-darwin-x64" "16.1.7"
    "@next/swc-linux-arm64-gnu" "16.1.7"
    "@next/swc-linux-arm64-musl" "16.1.7"
    "@next/swc-linux-x64-gnu" "16.1.7"
    "@next/swc-linux-x64-musl" "16.1.7"
    "@next/swc-win32-arm64-msvc" "16.1.7"
    "@next/swc-win32-x64-msvc" "16.1.7"
    "@next/swc-darwin-arm64" "16.1.6"
    "@next/swc-darwin-x64" "16.1.6"
    "@next/swc-linux-arm64-gnu" "16.1.6"
    "@next/swc-linux-arm64-musl" "16.1.6"
    "@next/swc-linux-x64-gnu" "16.1.6"
    "@next/swc-linux-x64-musl" "16.1.6"
    "@next/swc-win32-arm64-msvc" "16.1.6"
    "@next/swc-win32-x64-msvc" "16.1.6"
    sharp "^0.34.4"

no-case@^3.0.4:
@@ -1,10 +1,10 @@
environments:
  dev:
    values:
      - version: 4.8.3
      - version: 4.8.2
  feature:
    values:
      - version: 4.8.3
      - version: 4.8.2
        feature: ci
        domain: example.com
        imageTag: demo

@@ -1,5 +1,5 @@
apiVersion: v2
type: application
name: docs
version: 4.8.3
version: 4.8.2
appVersion: latest
@@ -1,6 +1,6 @@
{
  "name": "mail_mjml",
  "version": "4.8.3",
  "version": "4.8.2",
  "description": "An util to generate html and text django's templates from mjml templates",
  "type": "module",
  "dependencies": {