Compare commits

...

6 Commits

Author SHA1 Message Date
Stephan Meijer
5cf508b35b ♻️(backend) stylistic and consistency changes
Refactored converter services based on PR #1609 review comments:
- Renamed parameter to `data` across all convert methods for consistency
- Replaced recursive call with explicit sequential calls for readability
- Hardcoded CONVERSION_API_SECURE=True in Production class for security
- Removed unused YdocConverter import from viewsets.py
- Updated tests to match new error message wording

Signed-off-by: Stephan Meijer <me@stephanmeijer.com>
2026-01-08 16:03:51 +01:00
Stephan Meijer
13ce791c6a (backend) add tests for document import feature
Added comprehensive tests covering DocSpec converter service,
converter orchestration, and document creation with file uploads.

Tests validate DOCX and Markdown conversion workflows, error
handling, service availability, and edge cases including empty
files and Unicode filenames.

Signed-off-by: Stephan Meijer <me@stephanmeijer.com>
2026-01-08 16:03:38 +01:00
Stephan Meijer
be90c621b1 ⬆️(docker) upgrade docspec api to version 2.4.4
Updated docspec service image from 2.0.0 to 2.4.4 to
include latest features and bug fixes.

Signed-off-by: Stephan Meijer <me@stephanmeijer.com>
2026-01-08 16:02:35 +01:00
Anthony LC
d3088b82d7 (frontend) add import document area in docs grid
Add import document area with drag and drop
support in the docs grid component.
We can now import docx and md files just
by dropping them into the designated area.

We are using the `react-dropzone` library to
handle the drag and drop functionality.
2026-01-08 16:02:14 +01:00
Anthony LC
0fdc42fa5f 💄(frontend) adapt the docs grid title bar
Adapt the docs grid title bar to align with the
new design. We will add an upload button in a
future iteration.
2026-01-08 16:02:14 +01:00
Stephan Meijer
7e17cf1c47 (backend) Import of documents
We can now import documents in .docx and .md formats.
To do so, we added a new container, "docspec", which
uses the docspec service to convert these formats
to Blocknote format.

More here: #1567 #1569.

Signed-off-by: Stephan Meijer <me@stephanmeijer.com>
2026-01-08 16:01:56 +01:00
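The commits above describe a two-hop pipeline: DOCX is first converted to BlockNote JSON by the docspec container, then to Yjs by the y-provider service, while Markdown goes to the y-provider directly. A minimal standalone sketch of that routing logic follows; the function `conversion_route` is hypothetical and only illustrates the dispatch, it is not the project's actual code (the real logic lives in the `Converter` class shown in the diff below).

```python
# MIME type constants, mirroring the values added in core/services/mime_types.py.
DOCX = "application/vnd.openxmlformats-officedocument.wordprocessingml.document"
BLOCKNOTE = "application/vnd.blocknote+json"
YJS = "application/vnd.yjs.doc"
MARKDOWN = "text/markdown"


def conversion_route(content_type: str, accept: str) -> list[tuple[str, str, str]]:
    """Return the (service, source, target) hops needed for a conversion.

    DOCX has no direct path to Yjs: it goes through the docspec service
    (DOCX -> BlockNote JSON) and then the y-provider service
    (BlockNote -> Yjs). Every other conversion is a single hop through
    the y-provider service.
    """
    if content_type == DOCX and accept == YJS:
        return [
            ("docspec", DOCX, BLOCKNOTE),
            ("y-provider", BLOCKNOTE, YJS),
        ]
    return [("y-provider", content_type, accept)]
```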
29 changed files with 1482 additions and 172 deletions

View File

@@ -8,6 +8,7 @@ and this project adheres to
 ### Added

+- ✨(frontend) add import document area in docs grid #1567
 - ✨(backend) add documents/all endpoint with descendants #1553
 - ✅(export) add PDF regression tests #1762
 - 📝(docs) Add language configuration documentation #1757

View File

@@ -213,6 +213,7 @@ logs: ## display app-dev logs (follow mode)
 .PHONY: logs

 run-backend: ## Start only the backend application and all needed services
+	@$(COMPOSE) up --force-recreate -d docspec
 	@$(COMPOSE) up --force-recreate -d celery-dev
 	@$(COMPOSE) up --force-recreate -d y-provider-development
 	@$(COMPOSE) up --force-recreate -d nginx

View File

@@ -231,6 +231,11 @@ services:
         condition: service_healthy
         restart: true

+  docspec:
+    image: ghcr.io/docspecio/api:2.4.4
+    ports:
+      - "4000:4000"
+
 networks:
   lasuite:
     name: lasuite-network

View File

@@ -113,6 +113,7 @@ These are the environment variables you can set for the `impress-backend` contai
 | USER_OIDC_ESSENTIAL_CLAIMS | Essential claims in OIDC token | [] |
 | Y_PROVIDER_API_BASE_URL | Y Provider url | |
 | Y_PROVIDER_API_KEY | Y provider API key | |
+| DOCSPEC_API_URL | URL to endpoint of DocSpec conversion API | |

 ## impress-frontend image

View File

@@ -76,6 +76,8 @@ DJANGO_SERVER_TO_SERVER_API_TOKENS=server-api-token
 Y_PROVIDER_API_BASE_URL=http://y-provider-development:4444/api/
 Y_PROVIDER_API_KEY=yprovider-api-key
+
+DOCSPEC_API_URL=http://docspec:4000/conversion

 # Theme customization
 THEME_CUSTOMIZATION_CACHE_TIMEOUT=15
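The new `DOCSPEC_API_URL` environment variable is later referenced as `settings.DOCSPEC_API_URL` in the converter service. As a hedged sketch only: one plausible way the setting could be resolved is a plain environment lookup with the development default above (the actual project may resolve it through its settings framework instead).

```python
import os

# Hypothetical settings fragment: fall back to the development
# compose address when DOCSPEC_API_URL is not set in the environment.
DOCSPEC_API_URL = os.environ.get(
    "DOCSPEC_API_URL", "http://docspec:4000/conversion"
)
```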

View File

@@ -6,4 +6,4 @@ Y_PROVIDER_API_BASE_URL=http://y-provider:4444/api/
 # Throttle
 API_DOCUMENT_THROTTLE_RATE=1000/min
 API_CONFIG_THROTTLE_RATE=1000/min

View File

@@ -15,10 +15,11 @@ import magic
 from rest_framework import serializers

 from core import choices, enums, models, utils, validators
+from core.services import mime_types
 from core.services.ai_services import AI_ACTIONS
 from core.services.converter_services import (
     ConversionError,
-    YdocConverter,
+    Converter,
 )
@@ -188,6 +189,7 @@ class DocumentSerializer(ListDocumentSerializer):
     content = serializers.CharField(required=False)
     websocket = serializers.BooleanField(required=False, write_only=True)
+    file = serializers.FileField(required=False, write_only=True, allow_null=True)

     class Meta:
         model = models.Document
@@ -204,6 +206,7 @@ class DocumentSerializer(ListDocumentSerializer):
             "deleted_at",
             "depth",
             "excerpt",
+            "file",
             "is_favorite",
             "link_role",
             "link_reach",
@@ -461,7 +464,9 @@ class ServerCreateDocumentSerializer(serializers.Serializer):
         language = user.language or language

         try:
-            document_content = YdocConverter().convert(validated_data["content"])
+            document_content = Converter().convert(
+                validated_data["content"], mime_types.MARKDOWN, mime_types.YJS
+            )
         except ConversionError as err:
             raise serializers.ValidationError(
                 {"content": ["Could not convert content"]}

View File

@@ -41,17 +41,15 @@ from rest_framework.permissions import AllowAny
 from core import authentication, choices, enums, models
 from core.api.filters import remove_accents
+from core.services import mime_types
 from core.services.ai_services import AIService
 from core.services.collaboration_services import CollaborationService
 from core.services.converter_services import (
+    ConversionError,
+    Converter,
     ServiceUnavailableError as YProviderServiceUnavailableError,
 )
 from core.services.converter_services import (
     ValidationError as YProviderValidationError,
 )
-from core.services.converter_services import (
-    YdocConverter,
-)
 from core.services.search_indexers import (
     get_document_indexer,
     get_visited_document_ids_of,
@@ -525,6 +523,28 @@ class DocumentViewSet(
                 "IN SHARE ROW EXCLUSIVE MODE;"
             )

+        # Remove file from validated_data as it's not a model field
+        # Process it if present
+        uploaded_file = serializer.validated_data.pop("file", None)
+
+        # If a file is uploaded, convert it to Yjs format and set as content
+        if uploaded_file:
+            try:
+                file_content = uploaded_file.read()
+                converter = Converter()
+                converted_content = converter.convert(
+                    file_content,
+                    content_type=uploaded_file.content_type,
+                    accept=mime_types.YJS,
+                )
+                serializer.validated_data["content"] = converted_content
+                serializer.validated_data["title"] = uploaded_file.name
+            except ConversionError as err:
+                raise drf.exceptions.ValidationError(
+                    {"file": ["Could not convert file content"]}
+                ) from err
+
         obj = models.Document.add_root(
             creator=self.request.user,
             **serializer.validated_data,
@@ -1755,14 +1775,14 @@ class DocumentViewSet(
         if base64_content is not None:
             # Convert using the y-provider service
             try:
-                yprovider = YdocConverter()
+                yprovider = Converter()
                 result = yprovider.convert(
                     base64.b64decode(base64_content),
-                    "application/vnd.yjs.doc",
+                    mime_types.YJS,
                     {
-                        "markdown": "text/markdown",
-                        "html": "text/html",
-                        "json": "application/json",
+                        "markdown": mime_types.MARKDOWN,
+                        "html": mime_types.HTML,
+                        "json": mime_types.JSON,
                     }[content_format],
                 )
                 content = result

View File

@@ -1,11 +1,14 @@
 """Y-Provider API services."""

+import typing
 from base64 import b64encode

 from django.conf import settings

 import requests

+from core.services import mime_types


 class ConversionError(Exception):
     """Base exception for conversion-related errors."""
@@ -19,8 +22,73 @@ class ServiceUnavailableError(ConversionError):
     """Raised when the conversion service is unavailable."""


+class ConverterProtocol(typing.Protocol):
+    """Protocol for converter classes."""
+
+    def convert(self, data, content_type, accept):
+        """Convert content from one format to another."""
+
+
+class Converter:
+    """Orchestrates conversion between different formats using specialized converters."""
+
+    docspec: ConverterProtocol
+    ydoc: ConverterProtocol
+
+    def __init__(self):
+        self.docspec = DocSpecConverter()
+        self.ydoc = YdocConverter()
+
+    def convert(self, data, content_type, accept):
+        """Convert input into other formats using external microservices."""
+        if content_type == mime_types.DOCX and accept == mime_types.YJS:
+            blocknote_data = self.docspec.convert(
+                data, mime_types.DOCX, mime_types.BLOCKNOTE
+            )
+            return self.ydoc.convert(
+                blocknote_data, mime_types.BLOCKNOTE, mime_types.YJS
+            )
+        return self.ydoc.convert(data, content_type, accept)
+
+
+class DocSpecConverter:
+    """Service class for DocSpec conversion-related operations."""
+
+    def _request(self, url, data, content_type):
+        """Make a request to the DocSpec API."""
+        response = requests.post(
+            url,
+            headers={"Accept": mime_types.BLOCKNOTE},
+            files={"file": ("document.docx", data, content_type)},
+            timeout=settings.CONVERSION_API_TIMEOUT,
+            verify=settings.CONVERSION_API_SECURE,
+        )
+        response.raise_for_status()
+        return response
+
+    def convert(self, data, content_type, accept):
+        """Convert a Document to BlockNote."""
+        if not data:
+            raise ValidationError("Input data cannot be empty")
+
+        if content_type != mime_types.DOCX or accept != mime_types.BLOCKNOTE:
+            raise ValidationError(
+                f"Conversion from {content_type} to {accept} is not supported."
+            )
+
+        try:
+            return self._request(settings.DOCSPEC_API_URL, data, content_type).content
+        except requests.RequestException as err:
+            raise ServiceUnavailableError(
+                "Failed to connect to DocSpec conversion service",
+            ) from err
+
+
 class YdocConverter:
-    """Service class for conversion-related operations."""
+    """Service class for YDoc conversion-related operations."""

     @property
     def auth_header(self):
@@ -44,29 +112,27 @@ class YdocConverter:
         response.raise_for_status()
         return response

-    def convert(
-        self, text, content_type="text/markdown", accept="application/vnd.yjs.doc"
-    ):
+    def convert(self, data, content_type=mime_types.MARKDOWN, accept=mime_types.YJS):
         """Convert a Markdown text into our internal format using an external microservice."""
-        if not text:
-            raise ValidationError("Input text cannot be empty")
+        if not data:
+            raise ValidationError("Input data cannot be empty")

         try:
             response = self._request(
                 f"{settings.Y_PROVIDER_API_BASE_URL}{settings.CONVERSION_API_ENDPOINT}/",
-                text,
+                data,
                 content_type,
                 accept,
             )
-            if accept == "application/vnd.yjs.doc":
+            if accept == mime_types.YJS:
                 return b64encode(response.content).decode("utf-8")
-            if accept in {"text/markdown", "text/html"}:
+            if accept in {mime_types.MARKDOWN, "text/html"}:
                 return response.text
-            if accept == "application/json":
+            if accept == mime_types.JSON:
                 return response.json()
             raise ValidationError("Unsupported format")
         except requests.RequestException as err:
             raise ServiceUnavailableError(
-                "Failed to connect to conversion service",
+                f"Failed to connect to YDoc conversion service {content_type}, {accept}",
             ) from err

View File

@@ -0,0 +1,8 @@
"""MIME type constants for document conversion."""
BLOCKNOTE = "application/vnd.blocknote+json"
YJS = "application/vnd.yjs.doc"
MARKDOWN = "text/markdown"
JSON = "application/json"
DOCX = "application/vnd.openxmlformats-officedocument.wordprocessingml.document"
HTML = "text/html"

View File

@@ -16,6 +16,7 @@ from rest_framework.test import APIClient
 from core import factories
 from core.api.serializers import ServerCreateDocumentSerializer
 from core.models import Document, Invitation, User
+from core.services import mime_types
 from core.services.converter_services import ConversionError, YdocConverter

 pytestmark = pytest.mark.django_db
@@ -191,7 +192,9 @@ def test_api_documents_create_for_owner_existing(mock_convert_md):
     assert response.status_code == 201

-    mock_convert_md.assert_called_once_with("Document content")
+    mock_convert_md.assert_called_once_with(
+        "Document content", mime_types.MARKDOWN, mime_types.YJS
+    )

     document = Document.objects.get()
     assert response.json() == {"id": str(document.id)}
@@ -236,7 +239,9 @@ def test_api_documents_create_for_owner_new_user(mock_convert_md):
     assert response.status_code == 201

-    mock_convert_md.assert_called_once_with("Document content")
+    mock_convert_md.assert_called_once_with(
+        "Document content", mime_types.MARKDOWN, mime_types.YJS
+    )

     document = Document.objects.get()
     assert response.json() == {"id": str(document.id)}
@@ -297,7 +302,9 @@ def test_api_documents_create_for_owner_existing_user_email_no_sub_with_fallback
     assert response.status_code == 201

-    mock_convert_md.assert_called_once_with("Document content")
+    mock_convert_md.assert_called_once_with(
+        "Document content", mime_types.MARKDOWN, mime_types.YJS
+    )

     document = Document.objects.get()
     assert response.json() == {"id": str(document.id)}
@@ -393,7 +400,9 @@ def test_api_documents_create_for_owner_new_user_no_sub_no_fallback_allow_duplic
         HTTP_AUTHORIZATION="Bearer DummyToken",
     )
     assert response.status_code == 201

-    mock_convert_md.assert_called_once_with("Document content")
+    mock_convert_md.assert_called_once_with(
+        "Document content", mime_types.MARKDOWN, mime_types.YJS
+    )

     document = Document.objects.get()
     assert response.json() == {"id": str(document.id)}
@@ -474,7 +483,9 @@ def test_api_documents_create_for_owner_with_default_language(
     )
     assert response.status_code == 201

-    mock_convert_md.assert_called_once_with("Document content")
+    mock_convert_md.assert_called_once_with(
+        "Document content", mime_types.MARKDOWN, mime_types.YJS
+    )

     assert mock_send.call_args[0][3] == "de-de"
@@ -501,7 +512,9 @@ def test_api_documents_create_for_owner_with_custom_language(mock_convert_md):
     assert response.status_code == 201

-    mock_convert_md.assert_called_once_with("Document content")
+    mock_convert_md.assert_called_once_with(
+        "Document content", mime_types.MARKDOWN, mime_types.YJS
+    )

     assert len(mail.outbox) == 1
     email = mail.outbox[0]
@@ -537,7 +550,9 @@ def test_api_documents_create_for_owner_with_custom_subject_and_message(
     assert response.status_code == 201

-    mock_convert_md.assert_called_once_with("Document content")
+    mock_convert_md.assert_called_once_with(
+        "Document content", mime_types.MARKDOWN, mime_types.YJS
+    )

     assert len(mail.outbox) == 1
     email = mail.outbox[0]
@@ -571,7 +586,9 @@ def test_api_documents_create_for_owner_with_converter_exception(
         format="json",
         HTTP_AUTHORIZATION="Bearer DummyToken",
     )

-    mock_convert_md.assert_called_once_with("Document content")
+    mock_convert_md.assert_called_once_with(
+        "Document content", mime_types.MARKDOWN, mime_types.YJS
+    )

     assert response.status_code == 400
     assert response.json() == {"content": ["Could not convert content"]}

View File

@@ -0,0 +1,358 @@
"""
Tests for Documents API endpoint in impress's core app: create with file upload
"""

from base64 import b64decode, binascii
from io import BytesIO
from unittest.mock import patch

import pytest
from rest_framework.test import APIClient

from core import factories
from core.models import Document
from core.services import mime_types
from core.services.converter_services import (
    ConversionError,
    ServiceUnavailableError,
)

pytestmark = pytest.mark.django_db


def test_api_documents_create_with_file_anonymous():
    """Anonymous users should not be allowed to create documents with file upload."""
    # Create a fake DOCX file
    file_content = b"fake docx content"
    file = BytesIO(file_content)
    file.name = "test_document.docx"

    response = APIClient().post(
        "/api/v1.0/documents/",
        {
            "file": file,
        },
        format="multipart",
    )

    assert response.status_code == 401
    assert not Document.objects.exists()


@patch("core.services.converter_services.Converter.convert")
def test_api_documents_create_with_docx_file_success(mock_convert):
    """
    Authenticated users should be able to create documents by uploading a DOCX file.
    The file should be converted to YJS format and the title should be set from filename.
    """
    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    # Mock the conversion
    converted_yjs = "base64encodedyjscontent"
    mock_convert.return_value = converted_yjs

    # Create a fake DOCX file
    file_content = b"fake docx content"
    file = BytesIO(file_content)
    file.name = "My Important Document.docx"

    response = client.post(
        "/api/v1.0/documents/",
        {
            "file": file,
        },
        format="multipart",
    )

    assert response.status_code == 201
    document = Document.objects.get()
    assert document.title == "My Important Document.docx"
    assert document.content == converted_yjs
    assert document.accesses.filter(role="owner", user=user).exists()

    # Verify the converter was called correctly
    mock_convert.assert_called_once_with(
        file_content,
        content_type=mime_types.DOCX,
        accept=mime_types.YJS,
    )


@patch("core.services.converter_services.Converter.convert")
def test_api_documents_create_with_markdown_file_success(mock_convert):
    """
    Authenticated users should be able to create documents by uploading a Markdown file.
    """
    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    # Mock the conversion
    converted_yjs = "base64encodedyjscontent"
    mock_convert.return_value = converted_yjs

    # Create a fake Markdown file
    file_content = b"# Test Document\n\nThis is a test."
    file = BytesIO(file_content)
    file.name = "readme.md"

    response = client.post(
        "/api/v1.0/documents/",
        {
            "file": file,
        },
        format="multipart",
    )

    assert response.status_code == 201
    document = Document.objects.get()
    assert document.title == "readme.md"
    assert document.content == converted_yjs
    assert document.accesses.filter(role="owner", user=user).exists()

    # Verify the converter was called correctly
    mock_convert.assert_called_once_with(
        file_content,
        content_type=mime_types.MARKDOWN,
        accept=mime_types.YJS,
    )


@patch("core.services.converter_services.Converter.convert")
def test_api_documents_create_with_file_and_explicit_title(mock_convert):
    """
    When both file and title are provided, the filename should override the title.
    """
    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    # Mock the conversion
    converted_yjs = "base64encodedyjscontent"
    mock_convert.return_value = converted_yjs

    # Create a fake DOCX file
    file_content = b"fake docx content"
    file = BytesIO(file_content)
    file.name = "Uploaded Document.docx"

    response = client.post(
        "/api/v1.0/documents/",
        {
            "file": file,
            "title": "This should be overridden",
        },
        format="multipart",
    )

    assert response.status_code == 201
    document = Document.objects.get()
    # The filename should take precedence
    assert document.title == "Uploaded Document.docx"


def test_api_documents_create_with_empty_file():
    """
    Creating a document with an empty file should fail with a validation error.
    """
    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    # Create an empty file
    file = BytesIO(b"")
    file.name = "empty.docx"

    response = client.post(
        "/api/v1.0/documents/",
        {
            "file": file,
        },
        format="multipart",
    )

    assert response.status_code == 400
    assert response.json() == {"file": ["The submitted file is empty."]}
    assert not Document.objects.exists()


@patch("core.services.converter_services.Converter.convert")
def test_api_documents_create_with_file_conversion_error(mock_convert):
    """
    When conversion fails, the API should return a 400 error with appropriate message.
    """
    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    # Mock the conversion to raise an error
    mock_convert.side_effect = ConversionError("Failed to convert document")

    # Create a fake DOCX file
    file_content = b"fake invalid docx content"
    file = BytesIO(file_content)
    file.name = "corrupted.docx"

    response = client.post(
        "/api/v1.0/documents/",
        {
            "file": file,
        },
        format="multipart",
    )

    assert response.status_code == 400
    assert response.json() == {"file": ["Could not convert file content"]}
    assert not Document.objects.exists()


@patch("core.services.converter_services.Converter.convert")
def test_api_documents_create_with_file_service_unavailable(mock_convert):
    """
    When the conversion service is unavailable, appropriate error should be returned.
    """
    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    # Mock the conversion to raise ServiceUnavailableError
    mock_convert.side_effect = ServiceUnavailableError(
        "Failed to connect to conversion service"
    )

    # Create a fake DOCX file
    file_content = b"fake docx content"
    file = BytesIO(file_content)
    file.name = "document.docx"

    response = client.post(
        "/api/v1.0/documents/",
        {
            "file": file,
        },
        format="multipart",
    )

    assert response.status_code == 400
    assert response.json() == {"file": ["Could not convert file content"]}
    assert not Document.objects.exists()


def test_api_documents_create_without_file_still_works():
    """
    Creating a document without a file should still work as before (backward compatibility).
    """
    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    response = client.post(
        "/api/v1.0/documents/",
        {
            "title": "Regular document without file",
        },
        format="json",
    )

    assert response.status_code == 201
    document = Document.objects.get()
    assert document.title == "Regular document without file"
    assert document.content is None
    assert document.accesses.filter(role="owner", user=user).exists()


@patch("core.services.converter_services.Converter.convert")
def test_api_documents_create_with_file_null_value(mock_convert):
    """
    Passing file=null should be treated as no file upload.
    """
    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    response = client.post(
        "/api/v1.0/documents/",
        {
            "title": "Document with null file",
            "file": None,
        },
        format="json",
    )

    assert response.status_code == 201
    document = Document.objects.get()
    assert document.title == "Document with null file"

    # Converter should not have been called
    mock_convert.assert_not_called()


@patch("core.services.converter_services.Converter.convert")
def test_api_documents_create_with_file_preserves_content_format(mock_convert):
    """
    Verify that the converted content is stored correctly in the document.
    """
    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    # Mock the conversion with realistic base64-encoded YJS data
    converted_yjs = "AQMEBQYHCAkKCwwNDg8QERITFBUWFxgZGhscHR4fICA="
    mock_convert.return_value = converted_yjs

    # Create a fake DOCX file
    file_content = b"fake docx with complex formatting"
    file = BytesIO(file_content)
    file.name = "complex_document.docx"

    response = client.post(
        "/api/v1.0/documents/",
        {
            "file": file,
        },
        format="multipart",
    )

    assert response.status_code == 201
    document = Document.objects.get()
    # Verify the content is stored as returned by the converter
    assert document.content == converted_yjs
    # Verify it's valid base64 (can be decoded)
    try:
        b64decode(converted_yjs)
    except binascii.Error:
        pytest.fail("Content should be valid base64-encoded data")


@patch("core.services.converter_services.Converter.convert")
def test_api_documents_create_with_file_unicode_filename(mock_convert):
    """
    Test that Unicode characters in filenames are handled correctly.
    """
    user = factories.UserFactory()
    client = APIClient()
    client.force_login(user)

    # Mock the conversion
    converted_yjs = "base64encodedyjscontent"
    mock_convert.return_value = converted_yjs

    # Create a file with Unicode characters in the name
    file_content = b"fake docx content"
    file = BytesIO(file_content)
    file.name = "文档-télécharger-документ.docx"

    response = client.post(
        "/api/v1.0/documents/",
        {
            "file": file,
        },
        format="multipart",
    )

    assert response.status_code == 201
    document = Document.objects.get()
    assert document.title == "文档-télécharger-документ.docx"

View File

@@ -0,0 +1,93 @@
"""Test Converter orchestration services."""

from unittest.mock import MagicMock, patch

from core.services import mime_types
from core.services.converter_services import Converter


@patch("core.services.converter_services.DocSpecConverter")
@patch("core.services.converter_services.YdocConverter")
def test_converter_docx_to_yjs_orchestration(mock_ydoc_class, mock_docspec_class):
    """Test that DOCX to YJS conversion uses both DocSpec and Ydoc converters."""
    # Setup mocks
    mock_docspec = MagicMock()
    mock_ydoc = MagicMock()
    mock_docspec_class.return_value = mock_docspec
    mock_ydoc_class.return_value = mock_ydoc

    # Mock the conversion chain: DOCX -> BlockNote -> YJS
    blocknote_data = b'[{"type": "paragraph", "content": "test"}]'
    yjs_data = "base64encodedyjs"
    mock_docspec.convert.return_value = blocknote_data
    mock_ydoc.convert.return_value = yjs_data

    # Execute conversion
    converter = Converter()
    docx_data = b"fake docx data"
    result = converter.convert(docx_data, mime_types.DOCX, mime_types.YJS)

    # Verify the orchestration
    mock_docspec.convert.assert_called_once_with(
        docx_data, mime_types.DOCX, mime_types.BLOCKNOTE
    )
    mock_ydoc.convert.assert_called_once_with(
        blocknote_data, mime_types.BLOCKNOTE, mime_types.YJS
    )
    assert result == yjs_data


@patch("core.services.converter_services.YdocConverter")
def test_converter_markdown_to_yjs_delegation(mock_ydoc_class):
    """Test that Markdown to YJS conversion is delegated to YdocConverter."""
    mock_ydoc = MagicMock()
    mock_ydoc_class.return_value = mock_ydoc

    yjs_data = "base64encodedyjs"
    mock_ydoc.convert.return_value = yjs_data

    converter = Converter()
    markdown_data = "# Test Document"
    result = converter.convert(markdown_data, mime_types.MARKDOWN, mime_types.YJS)

    mock_ydoc.convert.assert_called_once_with(
        markdown_data, mime_types.MARKDOWN, mime_types.YJS
    )
    assert result == yjs_data


@patch("core.services.converter_services.YdocConverter")
def test_converter_yjs_to_html_delegation(mock_ydoc_class):
    """Test that YJS to HTML conversion is delegated to YdocConverter."""
    mock_ydoc = MagicMock()
    mock_ydoc_class.return_value = mock_ydoc

    html_data = "<p>Test Document</p>"
    mock_ydoc.convert.return_value = html_data

    converter = Converter()
    yjs_data = b"yjs binary data"
    result = converter.convert(yjs_data, mime_types.YJS, mime_types.HTML)

    mock_ydoc.convert.assert_called_once_with(yjs_data, mime_types.YJS, mime_types.HTML)
    assert result == html_data


@patch("core.services.converter_services.YdocConverter")
def test_converter_blocknote_to_yjs_delegation(mock_ydoc_class):
    """Test that BlockNote to YJS conversion is delegated to YdocConverter."""
    mock_ydoc = MagicMock()
    mock_ydoc_class.return_value = mock_ydoc

    yjs_data = "base64encodedyjs"
    mock_ydoc.convert.return_value = yjs_data

    converter = Converter()
    blocknote_data = b'[{"type": "paragraph"}]'
    result = converter.convert(blocknote_data, mime_types.BLOCKNOTE, mime_types.YJS)

    mock_ydoc.convert.assert_called_once_with(
        blocknote_data, mime_types.BLOCKNOTE, mime_types.YJS
    )
    assert result == yjs_data

View File

@@ -6,6 +6,7 @@ from unittest.mock import MagicMock, patch
 import pytest
 import requests

+from core.services import mime_types
 from core.services.converter_services import (
     ServiceUnavailableError,
     ValidationError,
@@ -21,9 +22,9 @@ def test_auth_header(settings):


 def test_convert_empty_text():
-    """Should raise ValidationError when text is empty."""
+    """Should raise ValidationError when data is empty."""
     converter = YdocConverter()

-    with pytest.raises(ValidationError, match="Input text cannot be empty"):
+    with pytest.raises(ValidationError, match="Input data cannot be empty"):
         converter.convert("")
@@ -36,7 +37,7 @@ def test_convert_service_unavailable(mock_post):
     with pytest.raises(
         ServiceUnavailableError,
-        match="Failed to connect to conversion service",
+        match="Failed to connect to YDoc conversion service",
     ):
         converter.convert("test text")
@@ -52,7 +53,7 @@ def test_convert_http_error(mock_post):
     with pytest.raises(
         ServiceUnavailableError,
-        match="Failed to connect to conversion service",
+        match="Failed to connect to YDoc conversion service",
    ):
         converter.convert("test text")
@@ -83,8 +84,8 @@ def test_convert_full_integration(mock_post, settings):
         data="test markdown",
         headers={
             "Authorization": "Bearer test-key",
-            "Content-Type": "text/markdown",
-            "Accept": "application/vnd.yjs.doc",
+            "Content-Type": mime_types.MARKDOWN,
+            "Accept": mime_types.YJS,
         },
         timeout=5,
         verify=False,
@@ -108,9 +109,7 @@ def test_convert_full_integration_with_specific_headers(mock_post, settings):
     mock_response.raise_for_status.return_value = None
     mock_post.return_value = mock_response

-    result = converter.convert(
-        b"test_content", "application/vnd.yjs.doc", "text/markdown"
-    )
+    result = converter.convert(b"test_content", mime_types.YJS, mime_types.MARKDOWN)

     assert result == expected_response
     mock_post.assert_called_once_with(
@@ -118,8 +117,8 @@ def test_convert_full_integration_with_specific_headers(mock_post, settings):
         data=b"test_content",
         headers={
             "Authorization": "Bearer test-key",
-            "Content-Type": "application/vnd.yjs.doc",
-            "Accept": "text/markdown",
+            "Content-Type": mime_types.YJS,
+            "Accept": mime_types.MARKDOWN,
         },
         timeout=5,
         verify=False,
@@ -135,7 +134,7 @@ def test_convert_timeout(mock_post):
     with pytest.raises(
         ServiceUnavailableError,
-        match="Failed to connect to conversion service",
+        match="Failed to connect to YDoc conversion service",
     ):
         converter.convert("test text")
@@ -144,5 +143,5 @@ def test_convert_none_input():
     """Should raise ValidationError when input is None."""
     converter = YdocConverter()

-    with pytest.raises(ValidationError, match="Input text cannot be empty"):
+    with pytest.raises(ValidationError, match="Input data cannot be empty"):
         converter.convert(None)

View File

@@ -0,0 +1,117 @@
"""Test DocSpec converter services."""

from unittest.mock import MagicMock, patch

import pytest
import requests

from core.services import mime_types
from core.services.converter_services import (
    DocSpecConverter,
    ServiceUnavailableError,
    ValidationError,
)


def test_docspec_convert_empty_data():
    """Should raise ValidationError when data is empty."""
    converter = DocSpecConverter()
    with pytest.raises(ValidationError, match="Input data cannot be empty"):
        converter.convert("", mime_types.DOCX, mime_types.BLOCKNOTE)


def test_docspec_convert_none_input():
    """Should raise ValidationError when input is None."""
    converter = DocSpecConverter()
    with pytest.raises(ValidationError, match="Input data cannot be empty"):
        converter.convert(None, mime_types.DOCX, mime_types.BLOCKNOTE)


def test_docspec_convert_unsupported_content_type():
    """Should raise ValidationError when content type is not DOCX."""
    converter = DocSpecConverter()
    with pytest.raises(
        ValidationError, match="Conversion from text/plain to .* is not supported"
    ):
        converter.convert(b"test data", "text/plain", mime_types.BLOCKNOTE)


def test_docspec_convert_unsupported_accept():
    """Should raise ValidationError when accept type is not BLOCKNOTE."""
    converter = DocSpecConverter()
    with pytest.raises(
        ValidationError,
        match=f"Conversion from {mime_types.DOCX} to {mime_types.YJS} is not supported",
    ):
        converter.convert(b"test data", mime_types.DOCX, mime_types.YJS)


@patch("requests.post")
def test_docspec_convert_service_unavailable(mock_post):
    """Should raise ServiceUnavailableError when service is unavailable."""
    converter = DocSpecConverter()
    mock_post.side_effect = requests.RequestException("Connection error")
    with pytest.raises(
        ServiceUnavailableError,
        match="Failed to connect to DocSpec conversion service",
    ):
        converter.convert(b"test data", mime_types.DOCX, mime_types.BLOCKNOTE)


@patch("requests.post")
def test_docspec_convert_http_error(mock_post):
    """Should raise ServiceUnavailableError when HTTP error occurs."""
    converter = DocSpecConverter()
    mock_response = MagicMock()
    mock_response.raise_for_status.side_effect = requests.HTTPError("HTTP Error")
    mock_post.return_value = mock_response
    with pytest.raises(
        ServiceUnavailableError,
        match="Failed to connect to DocSpec conversion service",
    ):
        converter.convert(b"test data", mime_types.DOCX, mime_types.BLOCKNOTE)


@patch("requests.post")
def test_docspec_convert_timeout(mock_post):
    """Should raise ServiceUnavailableError when request times out."""
    converter = DocSpecConverter()
    mock_post.side_effect = requests.Timeout("Request timed out")
    with pytest.raises(
        ServiceUnavailableError,
        match="Failed to connect to DocSpec conversion service",
    ):
        converter.convert(b"test data", mime_types.DOCX, mime_types.BLOCKNOTE)


@patch("requests.post")
def test_docspec_convert_success(mock_post, settings):
    """Test successful DOCX to BlockNote conversion."""
    settings.DOCSPEC_API_URL = "http://docspec.test/convert"
    settings.CONVERSION_API_TIMEOUT = 5
    settings.CONVERSION_API_SECURE = False

    converter = DocSpecConverter()
    expected_content = b'[{"type": "paragraph", "content": "test"}]'

    mock_response = MagicMock()
    mock_response.content = expected_content
    mock_response.raise_for_status.return_value = None
    mock_post.return_value = mock_response

    docx_data = b"fake docx binary data"
    result = converter.convert(docx_data, mime_types.DOCX, mime_types.BLOCKNOTE)

    assert result == expected_content

    # Verify the request was made correctly
    mock_post.assert_called_once_with(
        "http://docspec.test/convert",
        headers={"Accept": mime_types.BLOCKNOTE},
        files={"file": ("document.docx", docx_data, mime_types.DOCX)},
        timeout=5,
        verify=False,
    )
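The success test fixes the exact request shape: a multipart upload under the field name `file`, an `Accept` header, the configured timeout, and TLS verification toggled off. Below is a minimal sketch of a converter consistent with these tests; the local MIME constants and the injected `post` callable (standing in for `requests.post` so the logic runs without a live docspec service) are assumptions for illustration, and the real class reads `DOCSPEC_API_URL`, `CONVERSION_API_TIMEOUT`, and `CONVERSION_API_SECURE` from Django settings:

```python
DOCX = "application/vnd.openxmlformats-officedocument.wordprocessingml.document"
BLOCKNOTE = "application/vnd.blocknote+json"


class ValidationError(Exception):
    pass


class ServiceUnavailableError(Exception):
    pass


class DocSpecConverter:
    """Sketch: convert DOCX bytes to BlockNote JSON via the docspec service."""

    def __init__(self, api_url, timeout=5, verify=False, post=None):
        self.api_url = api_url
        self.timeout = timeout
        self.verify = verify
        self.post = post  # injected stand-in for requests.post

    def convert(self, data, content_type, accept):
        # Validate before touching the network, as the tests require
        if not data:
            raise ValidationError("Input data cannot be empty")
        if content_type != DOCX or accept != BLOCKNOTE:
            raise ValidationError(
                f"Conversion from {content_type} to {accept} is not supported"
            )
        try:
            response = self.post(
                self.api_url,
                headers={"Accept": accept},
                files={"file": ("document.docx", data, content_type)},
                timeout=self.timeout,
                verify=self.verify,
            )
            response.raise_for_status()
        except Exception as exc:  # the real code would catch requests.RequestException
            raise ServiceUnavailableError(
                "Failed to connect to DocSpec conversion service"
            ) from exc
        return response.content
```

Injecting the HTTP callable keeps validation and request construction testable in isolation, which mirrors how the tests above patch `requests.post`.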

View File

@@ -709,6 +709,9 @@ class Base(Configuration):
         environ_prefix=None,
     )
+    # DocSpec API microservice
+    DOCSPEC_API_URL = values.Value(environ_name="DOCSPEC_API_URL", environ_prefix=None)
+
     # Conversion endpoint
     CONVERSION_API_ENDPOINT = values.Value(
         default="convert",
@@ -1054,6 +1057,9 @@ class Production(Base):
     # Privacy
     SECURE_REFERRER_POLICY = "same-origin"
+    # Conversion API: Always verify SSL in production
+    CONVERSION_API_SECURE = True
+
     CACHES = {
         "default": {
             "BACKEND": "django_redis.cache.RedisCache",

View File

@@ -0,0 +1,60 @@
![473389927-e4ff1794-69f3-460a-85f8-fec993cd74d6.png](http://localhost:3000/assets/logo-suite-numerique.png)![497094770-53e5f8e2-c93e-4a0b-a82f-cd184fd03f51.svg](http://localhost:3000/assets/icon-docs.svg)
# Lorem Ipsum import Document
## Introduction
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nullam auctor, nisl eget ultricies tincidunt, nisl nisl aliquam nisl, eget ultricies nisl nisl eget nisl.
### Subsection 1.1
* **Bold text**: Lorem ipsum dolor sit amet.
* *Italic text*: Consectetur adipiscing elit.
* ~~Strikethrough text~~: Nullam auctor, nisl eget ultricies tincidunt.
1. First item in an ordered list.
2. Second item in an ordered list.
   * Indented bullet point.
   * Another indented bullet point.
3. Third item in an ordered list.
### Subsection 1.2
**Code block:**
```js
const hello_world = () => {
  console.log("Hello, world!");
}
```
**Blockquote:**
> Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nullam auctor, nisl eget ultricies tincidunt.
**Horizontal rule:**
***
**Table:**
| Syntax | Description |
| --------- | ----------- |
| Header | Title |
| Paragraph | Text |
**Inline code:**
Use the `printf()` function.
**Link:** [Example](http://localhost:3000/)
## Conclusion
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nullam auctor, nisl eget ultricies tincidunt, nisl nisl aliquam nisl, eget ultricies nisl nisl eget nisl.

View File

@@ -0,0 +1,172 @@
import { readFileSync } from 'fs';
import path from 'path';

import { Page, expect, test } from '@playwright/test';

import { getEditor } from './utils-editor';

test.beforeEach(async ({ page }) => {
  await page.goto('/');
});

test.describe('Doc Import', () => {
  test('it imports 2 docs with the import icon', async ({ page }) => {
    const fileChooserPromise = page.waitForEvent('filechooser');
    await page.getByLabel('Open the upload dialog').click();
    const fileChooser = await fileChooserPromise;
    await fileChooser.setFiles(path.join(__dirname, 'assets/test_import.docx'));
    await fileChooser.setFiles(path.join(__dirname, 'assets/test_import.md'));

    await expect(
      page.getByText(
        'The document "test_import.docx" has been successfully imported',
      ),
    ).toBeVisible();
    await expect(
      page.getByText(
        'The document "test_import.md" has been successfully imported',
      ),
    ).toBeVisible();

    const docsGrid = page.getByTestId('docs-grid');
    await expect(docsGrid.getByText('test_import.docx').first()).toBeVisible();
    await expect(docsGrid.getByText('test_import.md').first()).toBeVisible();

    // Check content of imported md
    await docsGrid.getByText('test_import.md').first().click();
    const editor = await getEditor({ page });

    const contentCheck = async (isMDCheck = false) => {
      await expect(
        editor.getByRole('heading', {
          name: 'Lorem Ipsum import Document',
          level: 1,
        }),
      ).toBeVisible();
      await expect(
        editor.getByRole('heading', {
          name: 'Introduction',
          level: 2,
        }),
      ).toBeVisible();
      await expect(
        editor.getByRole('heading', {
          name: 'Subsection 1.1',
          level: 3,
        }),
      ).toBeVisible();
      await expect(
        editor
          .locator('div[data-content-type="bulletListItem"] strong')
          .getByText('Bold text'),
      ).toBeVisible();
      await expect(
        editor
          .locator('div[data-content-type="codeBlock"]')
          .getByText('hello_world'),
      ).toBeVisible();
      await expect(
        editor
          .locator('div[data-content-type="table"] td')
          .getByText('Paragraph'),
      ).toBeVisible();
      await expect(
        editor.locator('a[href="http://localhost:3000/"]').getByText('Example'),
      ).toBeVisible();

      /* eslint-disable playwright/no-conditional-expect */
      if (isMDCheck) {
        await expect(
          editor.locator(
            'img[src="http://localhost:3000/assets/logo-suite-numerique.png"]',
          ),
        ).toBeVisible();
        await expect(
          editor.locator(
            'img[src="http://localhost:3000/assets/icon-docs.svg"]',
          ),
        ).toBeVisible();
      } else {
        await expect(editor.locator('img')).toHaveCount(2);
      }
      /* eslint-enable playwright/no-conditional-expect */

      await expect(
        editor.locator('div[data-content-type="divider"] hr'),
      ).toBeVisible();
    };

    await contentCheck();

    // Check content of imported docx
    await page.getByLabel('Back to homepage').first().click();
    await docsGrid.getByText('test_import.docx').first().click();
    await contentCheck();
  });

  test('it imports 2 docs with the drag and drop area', async ({ page }) => {
    const docsGrid = page.getByTestId('docs-grid');
    await expect(docsGrid).toBeVisible();

    await dragAndDropFiles(page, "[data-testid='docs-grid']", [
      {
        filePath: path.join(__dirname, 'assets/test_import.docx'),
        fileName: 'test_import.docx',
        fileType:
          'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
      },
      {
        filePath: path.join(__dirname, 'assets/test_import.md'),
        fileName: 'test_import.md',
        fileType: 'text/markdown',
      },
    ]);

    // Wait for success messages
    await expect(
      page.getByText(
        'The document "test_import.docx" has been successfully imported',
      ),
    ).toBeVisible();
    await expect(
      page.getByText(
        'The document "test_import.md" has been successfully imported',
      ),
    ).toBeVisible();

    await expect(docsGrid.getByText('test_import.docx').first()).toBeVisible();
    await expect(docsGrid.getByText('test_import.md').first()).toBeVisible();
  });
});

const dragAndDropFiles = async (
  page: Page,
  selector: string,
  files: Array<{ filePath: string; fileName: string; fileType?: string }>,
) => {
  const filesData = files.map((file) => ({
    bufferData: `data:application/octet-stream;base64,${readFileSync(file.filePath).toString('base64')}`,
    fileName: file.fileName,
    fileType: file.fileType || '',
  }));

  const dataTransfer = await page.evaluateHandle(async (filesInfo) => {
    const dt = new DataTransfer();
    for (const fileInfo of filesInfo) {
      const blobData = await fetch(fileInfo.bufferData).then((res) =>
        res.blob(),
      );
      const file = new File([blobData], fileInfo.fileName, {
        type: fileInfo.fileType,
      });
      dt.items.add(file);
    }
    return dt;
  }, filesData);

  await page.dispatchEvent(selector, 'drop', { dataTransfer });
};

View File

@@ -62,6 +62,7 @@
     "react": "*",
     "react-aria-components": "1.13.0",
     "react-dom": "*",
+    "react-dropzone": "14.3.8",
     "react-i18next": "16.5.0",
     "react-intersection-observer": "10.0.0",
     "react-resizable-panels": "3.0.6",

View File

@@ -20,7 +20,7 @@ export type DefinedInitialDataInfiniteOptionsAPI<
   QueryKey,
   TPageParam
 >;
+export type UseInfiniteQueryResultAPI<Q> = InfiniteData<Q>;

 export type InfiniteQueryConfig<Q> = Omit<
   DefinedInitialDataInfiniteOptionsAPI<Q>,
   'queryKey' | 'initialData' | 'getNextPageParam' | 'initialPageParam'

View File

@@ -0,0 +1,20 @@
<svg viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path
d="M6.12757 9.8486C5.98657 9.6993 5.91709 9.5143 5.91709 9.30858C5.91709 9.10284 5.98679 8.91775 6.13233 8.77221C6.28262 8.62192 6.47291 8.54842 6.68579 8.54842H13.1697C13.3775 8.54842 13.5623 8.62245 13.7061 8.77215C13.8559 8.91601 13.9299 9.10081 13.9299 9.30858C13.9299 9.51737 13.8553 9.70306 13.7085 9.8511C13.5643 10.0024 13.3787 10.0773 13.1697 10.0773H6.68579C6.47291 10.0773 6.28262 10.0038 6.13233 9.85349L6.13076 9.85192L6.12757 9.8486Z"
fill="currentColor"
/>
<path
d="M6.12757 12.83C5.98657 12.6807 5.91709 12.4957 5.91709 12.29C5.91709 12.0843 5.98679 11.8992 6.13233 11.7536C6.28262 11.6033 6.47291 11.5298 6.68579 11.5298H13.1697C13.3775 11.5298 13.5623 11.6039 13.7061 11.7536C13.8559 11.8974 13.9299 12.0822 13.9299 12.29C13.9299 12.4988 13.8553 12.6845 13.7085 12.8325C13.5643 12.9838 13.3787 13.0587 13.1697 13.0587H6.68579C6.47291 13.0587 6.28262 12.9852 6.13233 12.8349L6.13076 12.8333L6.12757 12.83Z"
fill="currentColor"
/>
<path
d="M5.91709 15.2885C5.91709 15.4912 5.98839 15.6726 6.12757 15.82L6.134 15.8266L6.13723 15.8296C6.28833 15.9723 6.47704 16.0401 6.68579 16.0401H9.75263C9.96123 16.0401 10.1502 15.9722 10.2975 15.8249C10.444 15.6784 10.5213 15.4956 10.5213 15.2885C10.5213 15.0768 10.4486 14.8874 10.2999 14.7374C10.1539 14.5842 9.96433 14.5113 9.75263 14.5113H6.68579C6.47293 14.5113 6.28257 14.5847 6.13226 14.735L6.12757 14.7399C5.98486 14.891 5.91709 15.0797 5.91709 15.2885Z"
fill="currentColor"
/>
<path
fill-rule="evenodd"
clip-rule="evenodd"
d="M7.37975 1.24597C7.88425 0.735004 8.61944 0.5 9.54031 0.5H18.6127C19.533 0.5 20.2661 0.734736 20.7653 1.24652C21.2686 1.75666 21.5 2.49628 21.5 3.42147V16.3808C21.5 17.3112 21.2688 18.0521 20.7638 18.5572C20.2645 19.0624 19.532 19.2937 18.6127 19.2937H17.347V20.5338C17.347 21.4641 17.1158 22.2051 16.6108 22.7102C16.1115 23.2153 15.3789 23.4467 14.4597 23.4467H5.3873C4.46721 23.4467 3.73242 23.2149 3.22781 22.7103C2.72908 22.2051 2.5 21.4635 2.5 20.5338V7.57442C2.5 6.64962 2.72942 5.90915 3.22673 5.39893C3.73123 4.88796 4.46643 4.65295 5.3873 4.65295H6.65302V3.42147C6.65302 2.49666 6.88244 1.7562 7.37975 1.24597ZM8.42319 4.65295H14.4597C15.38 4.65295 16.1131 4.88769 16.6122 5.39947C17.1156 5.90962 17.347 6.64923 17.347 7.57442V17.5236H18.5444C18.9636 17.5236 19.2496 17.4163 19.4324 17.2289L19.4337 17.2275C19.6238 17.0374 19.7298 16.7549 19.7298 16.3552V3.4471C19.7298 3.04734 19.6238 2.76485 19.4337 2.57481L19.431 2.57206C19.248 2.37972 18.9625 2.27017 18.5444 2.27017H9.60866C9.19081 2.27017 8.90126 2.37956 8.71212 2.57341C8.52701 2.76329 8.42319 3.04633 8.42319 3.4471V4.65295ZM5.45564 21.6765C5.03728 21.6765 4.74743 21.5697 4.55844 21.3811C4.37372 21.1913 4.27017 20.9084 4.27017 20.5081V7.60005C4.27017 7.19928 4.37399 6.91625 4.55911 6.72636C4.74825 6.53252 5.03779 6.42313 5.45564 6.42313H14.3913C14.8095 6.42313 15.095 6.53268 15.278 6.72501L15.2807 6.72776C15.4708 6.9178 15.5768 7.20029 15.5768 7.60005V20.5081C15.5768 20.9079 15.4708 21.1904 15.2807 21.3804L15.2793 21.3818C15.0966 21.5693 14.8105 21.6765 14.3913 21.6765H5.45564Z"
fill="currentColor"
/>
</svg>


View File

@@ -1,21 +1,34 @@
 import clsx from 'clsx';
+import React from 'react';
 import { css } from 'styled-components';

 import { Text, TextType } from '@/components';

-type IconProps = TextType & {
+type IconBase = TextType & {
   disabled?: boolean;
+};
+
+type IconMaterialProps = IconBase & {
   iconName: string;
   variant?: 'filled' | 'outlined' | 'symbols-outlined';
+  icon?: never;
 };
+
+type IconSVGProps = IconBase & {
+  icon: React.ReactNode;
+  iconName?: never;
+  variant?: never;
+};

 export const Icon = ({
   className,
-  iconName,
   disabled,
+  iconName,
+  icon,
   variant = 'outlined',
   $theme = 'neutral',
   ...textProps
-}: IconProps) => {
+}: IconMaterialProps | IconSVGProps) => {
   const hasLabel = 'aria-label' in textProps || 'aria-labelledby' in textProps;
   const ariaHidden =
     'aria-hidden' in textProps ? textProps['aria-hidden'] : !hasLabel;
@@ -24,15 +37,15 @@ export const Icon = ({
     <Text
       aria-hidden={ariaHidden}
       className={clsx('--docs--icon-bg', className, {
-        'material-icons-filled': variant === 'filled',
-        'material-icons': variant === 'outlined',
-        'material-symbols-outlined': variant === 'symbols-outlined',
+        'material-icons-filled': variant === 'filled' && iconName,
+        'material-icons': variant === 'outlined' && iconName,
+        'material-symbols-outlined': variant === 'symbols-outlined' && iconName,
       })}
       $theme={disabled ? 'disabled' : $theme}
       aria-disabled={disabled}
       {...textProps}
     >
-      {iconName}
+      {iconName ?? icon}
     </Text>
   );
 };

View File

@@ -0,0 +1,125 @@
import { VariantType, useToastProvider } from '@openfun/cunningham-react';
import {
  UseMutationOptions,
  useMutation,
  useQueryClient,
} from '@tanstack/react-query';
import { useTranslation } from 'react-i18next';

import {
  APIError,
  UseInfiniteQueryResultAPI,
  errorCauses,
  fetchAPI,
} from '@/api';
import { Doc, DocsResponse, KEY_LIST_DOC } from '@/docs/doc-management';

enum ContentTypes {
  Docx = 'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
  Markdown = 'text/markdown',
  OctetStream = 'application/octet-stream',
}

export enum ContentTypesAllowed {
  Docx = ContentTypes.Docx,
  Markdown = ContentTypes.Markdown,
}

const getMimeType = (file: File): string => {
  if (file.type) {
    return file.type;
  }

  const extension = file.name.split('.').pop()?.toLowerCase();
  switch (extension) {
    case 'md':
      return ContentTypes.Markdown;
    case 'markdown':
      return ContentTypes.Markdown;
    case 'docx':
      return ContentTypes.Docx;
    default:
      return ContentTypes.OctetStream;
  }
};

export const importDoc = async (file: File): Promise<Doc> => {
  const form = new FormData();
  form.append(
    'file',
    new File([file], file.name, {
      type: getMimeType(file),
      lastModified: file.lastModified,
    }),
  );

  const response = await fetchAPI(`documents/`, {
    method: 'POST',
    body: form,
    withoutContentType: true,
  });

  if (!response.ok) {
    throw new APIError('Failed to import the doc', await errorCauses(response));
  }

  return response.json() as Promise<Doc>;
};

type UseImportDocOptions = UseMutationOptions<Doc, APIError, File>;

export function useImportDoc(props?: UseImportDocOptions) {
  const { toast } = useToastProvider();
  const queryClient = useQueryClient();
  const { t } = useTranslation();

  return useMutation<Doc, APIError, File>({
    mutationFn: importDoc,
    ...props,
    onSuccess: (...successProps) => {
      queryClient.setQueriesData<UseInfiniteQueryResultAPI<DocsResponse>>(
        { queryKey: [KEY_LIST_DOC] },
        (oldData) => {
          if (!oldData || oldData?.pages.length === 0) {
            return oldData;
          }

          return {
            ...oldData,
            pages: oldData.pages.map((page, index) => {
              // Add the new doc to the first page only
              if (index === 0) {
                return {
                  ...page,
                  results: [successProps[0], ...page.results],
                };
              }

              return page;
            }),
          };
        },
      );

      toast(
        t('The document "{{documentName}}" has been successfully imported', {
          documentName: successProps?.[0].title || '',
        }),
        VariantType.SUCCESS,
      );
      props?.onSuccess?.(...successProps);
    },
    onError: (...errorProps) => {
      toast(
        t(`The document "{{documentName}}" import has failed`, {
          documentName: errorProps?.[1].name || '',
        }),
        VariantType.ERROR,
      );
      props?.onError?.(...errorProps);
    },
  });
}

View File

@@ -1,14 +1,21 @@
-import { Button } from '@openfun/cunningham-react';
-import { useMemo } from 'react';
+import {
+  Button,
+  VariantType,
+  useToastProvider,
+} from '@openfun/cunningham-react';
+import { useMemo, useState } from 'react';
+import { useDropzone } from 'react-dropzone';
 import { useTranslation } from 'react-i18next';
 import { InView } from 'react-intersection-observer';
 import { css } from 'styled-components';

-import { Box, Card, Text } from '@/components';
+import AllDocs from '@/assets/icons/doc-all.svg';
+import { Box, Card, Icon, Text } from '@/components';
 import { DocDefaultFilter, useInfiniteDocs } from '@/docs/doc-management';
 import { useResponsiveStore } from '@/stores';

 import { useInfiniteDocsTrashbin } from '../api';
+import { ContentTypesAllowed, useImportDoc } from '../api/useImportDoc';
 import { useResponsiveDocGrid } from '../hooks/useResponsiveDocGrid';
 import {
@@ -24,6 +31,44 @@ export const DocsGrid = ({
   target = DocDefaultFilter.ALL_DOCS,
 }: DocsGridProps) => {
   const { t } = useTranslation();
+  const [isDragOver, setIsDragOver] = useState(false);
+  const { toast } = useToastProvider();
+  const { getRootProps, getInputProps, open } = useDropzone({
+    accept: {
+      [ContentTypesAllowed.Docx]: ['.docx'],
+      [ContentTypesAllowed.Markdown]: ['.md'],
+    },
+    onDrop(acceptedFiles) {
+      setIsDragOver(false);
+      for (const file of acceptedFiles) {
+        importDoc(file);
+      }
+    },
+    onDragEnter: () => {
+      setIsDragOver(true);
+    },
+    onDragLeave: () => {
+      setIsDragOver(false);
+    },
+    onDropRejected(fileRejections) {
+      toast(
+        t(
+          `The document "{{documentName}}" import has failed (only .docx and .md files are allowed)`,
+          {
+            documentName: fileRejections?.[0].file.name || '',
+          },
+        ),
+        VariantType.ERROR,
+      );
+    },
+    noClick: true,
+  });
+  const { mutate: importDoc } = useImportDoc();
+  const withUpload =
+    !target ||
+    target === DocDefaultFilter.ALL_DOCS ||
+    target === DocDefaultFilter.MY_DOCS;
   const { isDesktop } = useResponsiveStore();
   const { flexLeft, flexRight } = useResponsiveDocGrid();
@@ -60,21 +105,6 @@ export const DocsGrid = ({
     void fetchNextPage();
   };

-  let title = t('All docs');
-  switch (target) {
-    case DocDefaultFilter.MY_DOCS:
-      title = t('My docs');
-      break;
-    case DocDefaultFilter.SHARED_WITH_ME:
-      title = t('Shared with me');
-      break;
-    case DocDefaultFilter.TRASHBIN:
-      title = t('Trashbin');
-      break;
-    default:
-      title = t('All docs');
-  }
-
   return (
     <Box
       $position="relative"
@@ -91,16 +121,24 @@ export const DocsGrid = ({
         $width="100%"
         $css={css`
           ${!isDesktop ? 'border: none;' : ''}
+          ${isDragOver
+            ? `
+          border: 2px dashed var(--c--contextuals--border--semantic--brand--primary);
+          background-color: var(--c--contextuals--background--semantic--brand--tertiary);
+        `
+            : ''}
         `}
         $padding={{
-          top: 'base',
-          horizontal: isDesktop ? 'md' : 'xs',
           bottom: 'md',
         }}
+        {...(withUpload ? getRootProps({ className: 'dropzone' }) : {})}
       >
-        <Text as="h2" $size="h4" $margin={{ top: '0px', bottom: '10px' }}>
-          {title}
-        </Text>
+        {withUpload && <input {...getInputProps()} />}
+        <DocGridTitleBar
+          target={target}
+          onUploadClick={open}
+          withUpload={withUpload}
+        />
         {!hasDocs && !loading && (
           <Box $padding={{ vertical: 'sm' }} $align="center" $justify="center">
@@ -110,7 +148,11 @@ export const DocsGrid = ({
           </Box>
         )}
         {hasDocs && (
-          <Box $gap="6px" $overflow="auto">
+          <Box
+            $gap="6px"
+            $overflow="auto"
+            $padding={{ vertical: 'sm', horizontal: isDesktop ? 'md' : 'xs' }}
+          >
             <Box role="grid" aria-label={t('Documents grid')}>
               <Box role="rowgroup">
                 <Box
@@ -172,6 +214,73 @@ export const DocsGrid = ({
   );
 };

+const DocGridTitleBar = ({
+  target,
+  onUploadClick,
+  withUpload,
+}: {
+  target: DocDefaultFilter;
+  onUploadClick: () => void;
+  withUpload: boolean;
+}) => {
+  const { t } = useTranslation();
+  const { isDesktop } = useResponsiveStore();
+
+  let title = t('All docs');
+  let icon = <Icon icon={<AllDocs width={24} height={24} />} />;
+  switch (target) {
+    case DocDefaultFilter.MY_DOCS:
+      icon = <Icon iconName="lock" />;
+      title = t('My docs');
+      break;
+    case DocDefaultFilter.SHARED_WITH_ME:
+      icon = <Icon iconName="group" />;
+      title = t('Shared with me');
+      break;
+    case DocDefaultFilter.TRASHBIN:
+      icon = <Icon iconName="delete" />;
+      title = t('Trashbin');
+      break;
+    default:
+      title = t('All docs');
+  }
+
+  return (
+    <Box
+      $direction="row"
+      $padding={{
+        vertical: 'md',
+        horizontal: isDesktop ? 'md' : 'xs',
+      }}
+      $css={css`
+        border-bottom: 1px solid var(--c--contextuals--border--surface--primary);
+      `}
+      $align="center"
+      $justify="space-between"
+    >
+      <Box $direction="row" $gap="xs" $align="center">
+        {icon}
+        <Text as="h2" $size="h4" $margin="none">
+          {title}
+        </Text>
+      </Box>
+      {withUpload && (
+        <Button
+          color="brand"
+          variant="tertiary"
+          onClick={(e) => {
+            e.stopPropagation();
+            onUploadClick();
+          }}
+          aria-label={t('Open the upload dialog')}
+        >
+          <Icon iconName="upload_file" $withThemeInherited />
+        </Button>
+      )}
+    </Box>
+  );
+};
+
 const useDocsQuery = (target: DocDefaultFilter) => {
   const trashbinQuery = useInfiniteDocsTrashbin(
     {

View File

@@ -2,6 +2,7 @@ import { usePathname, useSearchParams } from 'next/navigation';
 import { useTranslation } from 'react-i18next';
 import { css } from 'styled-components';

+import AllDocs from '@/assets/icons/doc-all.svg';
 import { Box, Icon, StyledLink, Text } from '@/components';
 import { useCunninghamTheme } from '@/cunningham';
 import { DocDefaultFilter } from '@/docs/doc-management';
@@ -21,22 +22,22 @@ export const LeftPanelTargetFilters = () => {

   const defaultQueries = [
     {
-      icon: 'apps',
+      icon: <Icon icon={<AllDocs width={24} height={24} />} />,
       label: t('All docs'),
       targetQuery: DocDefaultFilter.ALL_DOCS,
     },
     {
-      icon: 'lock',
+      icon: <Icon iconName="lock" />,
       label: t('My docs'),
       targetQuery: DocDefaultFilter.MY_DOCS,
     },
     {
-      icon: 'group',
+      icon: <Icon iconName="group" />,
      label: t('Shared with me'),
       targetQuery: DocDefaultFilter.SHARED_WITH_ME,
     },
     {
-      icon: 'delete',
+      icon: <Icon iconName="delete" />,
       label: t('Trashbin'),
       targetQuery: DocDefaultFilter.TRASHBIN,
     },
@@ -96,7 +97,7 @@ export const LeftPanelTargetFilters = () => {
           }
         `}
       >
-        <Icon iconName={query.icon} />
+        {query.icon}
         <Text $size="sm">{query.label}</Text>
       </StyledLink>
     );

View File

@@ -69,7 +69,7 @@ describe('Server Tests', () => {
const response = await request(app) const response = await request(app)
.post('/api/convert') .post('/api/convert')
.set('origin', origin) .set('origin', origin)
.set('authorization', 'wrong-api-key') .set('authorization', `Bearer wrong-api-key`)
.set('content-type', 'application/json'); .set('content-type', 'application/json');
expect(response.status).toBe(401); expect(response.status).toBe(401);
@@ -99,7 +99,7 @@ describe('Server Tests', () => {
const response = await request(app) const response = await request(app)
.post('/api/convert') .post('/api/convert')
.set('origin', origin) .set('origin', origin)
.set('authorization', apiKey) .set('authorization', `Bearer ${apiKey}`)
.set('content-type', 'application/json'); .set('content-type', 'application/json');
expect(response.status).toBe(400); expect(response.status).toBe(400);
@@ -114,7 +114,7 @@ describe('Server Tests', () => {
const response = await request(app) const response = await request(app)
.post('/api/convert') .post('/api/convert')
.set('origin', origin) .set('origin', origin)
.set('authorization', apiKey) .set('authorization', `Bearer ${apiKey}`)
.set('content-type', 'application/json') .set('content-type', 'application/json')
.send(''); .send('');
@@ -129,9 +129,10 @@ describe('Server Tests', () => {
const response = await request(app) const response = await request(app)
.post('/api/convert') .post('/api/convert')
.set('origin', origin) .set('origin', origin)
.set('authorization', apiKey) .set('authorization', `Bearer ${apiKey}`)
.set('content-type', 'image/png') .set('content-type', 'image/png')
.send('randomdata'); .send('randomdata');
expect(response.status).toBe(415); expect(response.status).toBe(415);
expect(response.body).toStrictEqual({ error: 'Unsupported Content-Type' }); expect(response.body).toStrictEqual({ error: 'Unsupported Content-Type' });
}); });
@@ -141,38 +142,73 @@ describe('Server Tests', () => {
const response = await request(app) const response = await request(app)
.post('/api/convert') .post('/api/convert')
.set('origin', origin) .set('origin', origin)
.set('authorization', apiKey) .set('authorization', `Bearer ${apiKey}`)
.set('content-type', 'text/markdown') .set('content-type', 'text/markdown')
.set('accept', 'image/png') .set('accept', 'image/png')
.send('# Header'); .send('# Header');
expect(response.status).toBe(406); expect(response.status).toBe(406);
expect(response.body).toStrictEqual({ error: 'Unsupported format' }); expect(response.body).toStrictEqual({ error: 'Unsupported format' });
}); });
-  test.each([[apiKey], [`Bearer ${apiKey}`]])(
-    'POST /api/convert with correct content with Authorization: %s',
-    async (authHeader) => {
-      const app = initApp();
-
-      const response = await request(app)
-        .post('/api/convert')
-        .set('Origin', origin)
-        .set('Authorization', authHeader)
-        .set('content-type', 'text/markdown')
-        .set('accept', 'application/vnd.yjs.doc')
-        .send(expectedMarkdown);
-      expect(response.status).toBe(200);
-      expect(response.body).toBeInstanceOf(Buffer);
-
-      const editor = ServerBlockNoteEditor.create();
-      const doc = new Y.Doc();
-      Y.applyUpdate(doc, response.body);
-      const blocks = editor.yDocToBlocks(doc, 'document-store');
-      expect(blocks).toStrictEqual(expectedBlocks);
-    },
-  );
+  test('POST /api/convert BlockNote to Markdown', async () => {
+    const app = initApp();
+    const response = await request(app)
+      .post('/api/convert')
+      .set('origin', origin)
+      .set('authorization', `Bearer ${apiKey}`)
+      .set('content-type', 'application/vnd.blocknote+json')
+      .set('accept', 'text/markdown')
+      .send(expectedBlocks);
+    expect(response.status).toBe(200);
+    expect(response.header['content-type']).toBe(
+      'text/markdown; charset=utf-8',
+    );
+    expect(typeof response.text).toBe('string');
+    expect(response.text.trim()).toBe(expectedMarkdown);
+  });
+
+  test('POST /api/convert BlockNote to Yjs', async () => {
+    const app = initApp();
+    const editor = ServerBlockNoteEditor.create();
+    const blocks = await editor.tryParseMarkdownToBlocks(expectedMarkdown);
+    const response = await request(app)
+      .post('/api/convert')
+      .set('origin', origin)
+      .set('authorization', `Bearer ${apiKey}`)
+      .set('content-type', 'application/vnd.blocknote+json')
+      .set('accept', 'application/vnd.yjs.doc')
+      .send(blocks)
+      .responseType('blob');
+    expect(response.status).toBe(200);
+    expect(response.header['content-type']).toBe('application/vnd.yjs.doc');
+
+    // Decode the Yjs response and verify it contains the correct blocks
+    const responseBuffer = Buffer.from(response.body as Buffer);
+    const ydoc = new Y.Doc();
+    Y.applyUpdate(ydoc, responseBuffer);
+    const decodedBlocks = editor.yDocToBlocks(ydoc, 'document-store');
+    expect(decodedBlocks).toStrictEqual(expectedBlocks);
+  });
+
+  test('POST /api/convert BlockNote to HTML', async () => {
+    const app = initApp();
+    const response = await request(app)
+      .post('/api/convert')
+      .set('origin', origin)
+      .set('authorization', `Bearer ${apiKey}`)
+      .set('content-type', 'application/vnd.blocknote+json')
+      .set('accept', 'text/html')
+      .send(expectedBlocks);
+    expect(response.status).toBe(200);
+    expect(response.header['content-type']).toBe('text/html; charset=utf-8');
+    expect(typeof response.text).toBe('string');
+    expect(response.text).toBe(expectedHTML);
+  });
   test('POST /api/convert Yjs to HTML', async () => {
     const app = initApp();
@@ -183,10 +219,11 @@ describe('Server Tests', () => {
     const response = await request(app)
       .post('/api/convert')
       .set('origin', origin)
-      .set('authorization', apiKey)
+      .set('authorization', `Bearer ${apiKey}`)
       .set('content-type', 'application/vnd.yjs.doc')
       .set('accept', 'text/html')
       .send(Buffer.from(yjsUpdate));
     expect(response.status).toBe(200);
     expect(response.header['content-type']).toBe('text/html; charset=utf-8');
     expect(typeof response.text).toBe('string');
@@ -202,10 +239,11 @@ describe('Server Tests', () => {
     const response = await request(app)
       .post('/api/convert')
       .set('origin', origin)
-      .set('authorization', apiKey)
+      .set('authorization', `Bearer ${apiKey}`)
       .set('content-type', 'application/vnd.yjs.doc')
       .set('accept', 'text/markdown')
       .send(Buffer.from(yjsUpdate));
     expect(response.status).toBe(200);
     expect(response.header['content-type']).toBe(
       'text/markdown; charset=utf-8',
@@ -223,15 +261,16 @@ describe('Server Tests', () => {
     const response = await request(app)
       .post('/api/convert')
       .set('origin', origin)
-      .set('authorization', apiKey)
+      .set('authorization', `Bearer ${apiKey}`)
       .set('content-type', 'application/vnd.yjs.doc')
       .set('accept', 'application/json')
       .send(Buffer.from(yjsUpdate));
     expect(response.status).toBe(200);
     expect(response.header['content-type']).toBe(
       'application/json; charset=utf-8',
     );
-    expect(Array.isArray(response.body)).toBe(true);
+    expect(response.body).toBeInstanceOf(Array);
     expect(response.body).toStrictEqual(expectedBlocks);
   });
@@ -240,15 +279,16 @@ describe('Server Tests', () => {
     const response = await request(app)
       .post('/api/convert')
       .set('origin', origin)
-      .set('authorization', apiKey)
+      .set('authorization', `Bearer ${apiKey}`)
       .set('content-type', 'text/markdown')
       .set('accept', 'application/json')
       .send(expectedMarkdown);
     expect(response.status).toBe(200);
     expect(response.header['content-type']).toBe(
       'application/json; charset=utf-8',
     );
-    expect(Array.isArray(response.body)).toBe(true);
+    expect(response.body).toBeInstanceOf(Array);
     expect(response.body).toStrictEqual(expectedBlocks);
   });
@@ -257,11 +297,12 @@ describe('Server Tests', () => {
     const response = await request(app)
       .post('/api/convert')
       .set('origin', origin)
-      .set('authorization', apiKey)
+      .set('authorization', `Bearer ${apiKey}`)
       .set('content-type', 'application/vnd.yjs.doc')
       .set('accept', 'application/json')
       .send(Buffer.from('notvalidyjs'));
     expect(response.status).toBe(400);
-    expect(response.body).toStrictEqual({ error: 'Invalid Yjs content' });
+    expect(response.body).toStrictEqual({ error: 'Invalid content' });
   });
 });


@@ -14,27 +14,115 @@ interface ErrorResponse {
   error: string;
 }
 
+type ConversionResponseBody = Uint8Array | string | object | ErrorResponse;
+
+interface InputReader {
+  supportedContentTypes: string[];
+  read(data: Buffer): Promise<PartialBlock[]>;
+}
+
+interface OutputWriter {
+  supportedContentTypes: string[];
+  write(blocks: PartialBlock[]): Promise<ConversionResponseBody>;
+}
+
 const editor = ServerBlockNoteEditor.create<
   DefaultBlockSchema,
   DefaultInlineContentSchema,
   DefaultStyleSchema
 >();
 
+const ContentTypes = {
+  XMarkdown: 'text/x-markdown',
+  Markdown: 'text/markdown',
+  YJS: 'application/vnd.yjs.doc',
+  FormUrlEncoded: 'application/x-www-form-urlencoded',
+  OctetStream: 'application/octet-stream',
+  HTML: 'text/html',
+  BlockNote: 'application/vnd.blocknote+json',
+  JSON: 'application/json',
+} as const;
+
+const createYDocument = (blocks: PartialBlock[]) =>
+  editor.blocksToYDoc(blocks, 'document-store');
+
+const readers: InputReader[] = [
+  {
+    // application/x-www-form-urlencoded is interpreted as Markdown for backward compatibility
+    supportedContentTypes: [
+      ContentTypes.Markdown,
+      ContentTypes.XMarkdown,
+      ContentTypes.FormUrlEncoded,
+    ],
+    read: (data) => editor.tryParseMarkdownToBlocks(data.toString()),
+  },
+  {
+    supportedContentTypes: [ContentTypes.YJS, ContentTypes.OctetStream],
+    read: async (data) => {
+      const ydoc = new Y.Doc();
+      Y.applyUpdate(ydoc, data);
+      return editor.yDocToBlocks(ydoc, 'document-store') as PartialBlock[];
+    },
+  },
+  {
+    supportedContentTypes: [ContentTypes.BlockNote],
+    read: async (data) => JSON.parse(data.toString()),
+  },
+];
+
+const writers: OutputWriter[] = [
+  {
+    supportedContentTypes: [ContentTypes.BlockNote, ContentTypes.JSON],
+    write: async (blocks) => blocks,
+  },
+  {
+    supportedContentTypes: [ContentTypes.YJS, ContentTypes.OctetStream],
+    write: async (blocks) => Y.encodeStateAsUpdate(createYDocument(blocks)),
+  },
+  {
+    supportedContentTypes: [ContentTypes.Markdown, ContentTypes.XMarkdown],
+    write: (blocks) => editor.blocksToMarkdownLossy(blocks),
+  },
+  {
+    supportedContentTypes: [ContentTypes.HTML],
+    write: (blocks) => editor.blocksToHTMLLossy(blocks),
+  },
+];
+
+const normalizeContentType = (value: string) => value.split(';')[0];
+
 export const convertHandler = async (
   req: Request<object, Uint8Array | ErrorResponse, Buffer, object>,
-  res: Response<Uint8Array | string | object | ErrorResponse>,
+  res: Response<ConversionResponseBody>,
 ) => {
   if (!req.body || req.body.length === 0) {
     res.status(400).json({ error: 'Invalid request: missing content' });
     return;
   }
 
-  const contentType = (req.header('content-type') || 'text/markdown').split(
-    ';',
-  )[0];
-  const accept = (req.header('accept') || 'application/vnd.yjs.doc').split(
-    ';',
-  )[0];
+  const contentType = normalizeContentType(
+    req.header('content-type') || ContentTypes.Markdown,
+  );
+
+  const reader = readers.find((reader) =>
+    reader.supportedContentTypes.includes(contentType),
+  );
+  if (!reader) {
+    res.status(415).json({ error: 'Unsupported Content-Type' });
+    return;
+  }
+
+  const accept = normalizeContentType(req.header('accept') || ContentTypes.YJS);
+  const writer = writers.find((writer) =>
+    writer.supportedContentTypes.includes(accept),
+  );
+  if (!writer) {
+    res.status(406).json({ error: 'Unsupported format' });
+    return;
+  }
 
   let blocks:
     | PartialBlock<
@@ -44,63 +132,23 @@ export const convertHandler = async (
     >[]
     | null = null;
 
   try {
-    // First, convert from the input format to blocks
-    // application/x-www-form-urlencoded is interpreted as Markdown for backward compatibility
-    if (
-      contentType === 'text/markdown' ||
-      contentType === 'application/x-www-form-urlencoded'
-    ) {
-      blocks = await editor.tryParseMarkdownToBlocks(req.body.toString());
-    } else if (
-      contentType === 'application/vnd.yjs.doc' ||
-      contentType === 'application/octet-stream'
-    ) {
-      try {
-        const ydoc = new Y.Doc();
-        Y.applyUpdate(ydoc, req.body);
-        blocks = editor.yDocToBlocks(ydoc, 'document-store') as PartialBlock[];
-      } catch (e) {
-        logger('Invalid Yjs content:', e);
-        res.status(400).json({ error: 'Invalid Yjs content' });
-        return;
-      }
-    } else {
-      res.status(415).json({ error: 'Unsupported Content-Type' });
+    try {
+      blocks = await reader.read(req.body);
+    } catch (e) {
+      logger('Invalid content:', e);
+      res.status(400).json({ error: 'Invalid content' });
       return;
     }
 
     if (!blocks || blocks.length === 0) {
       res.status(500).json({ error: 'No valid blocks were generated' });
       return;
     }
 
-    // Then, convert from blocks to the output format
-    if (accept === 'application/json') {
-      res.status(200).json(blocks);
-    } else {
-      const yDocument = editor.blocksToYDoc(blocks, 'document-store');
-      if (
-        accept === 'application/vnd.yjs.doc' ||
-        accept === 'application/octet-stream'
-      ) {
-        res
-          .status(200)
-          .setHeader('content-type', 'application/octet-stream')
-          .send(Y.encodeStateAsUpdate(yDocument));
-      } else if (accept === 'text/markdown') {
-        res
-          .status(200)
-          .setHeader('content-type', 'text/markdown')
-          .send(await editor.blocksToMarkdownLossy(blocks));
-      } else if (accept === 'text/html') {
-        res
-          .status(200)
-          .setHeader('content-type', 'text/html')
-          .send(await editor.blocksToHTMLLossy(blocks));
-      } else {
-        res.status(406).json({ error: 'Unsupported format' });
-      }
-    }
+    res
+      .status(200)
+      .setHeader('content-type', accept)
+      .send(await writer.write(blocks));
   } catch (e) {
     logger('conversion failed:', e);
     res.status(500).json({ error: 'An error occurred' });
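The refactored handler above replaces the nested if/else chains with two lookup tables keyed by media type. A minimal standalone sketch of that dispatch pattern is shown below; the `Reader`/`Writer` entries and the `dispatch` helper are illustrative stand-ins (the real service parses Markdown, BlockNote blocks, and Yjs updates), not the service's actual API.

```typescript
// Sketch of the reader/writer registry pattern used by convertHandler.
// Readers parse an input body into intermediate "blocks"; writers
// serialize blocks into the requested output format.
interface Reader {
  supportedContentTypes: string[];
  read(data: string): string[];
}

interface Writer {
  supportedContentTypes: string[];
  write(blocks: string[]): string;
}

const readers: Reader[] = [
  { supportedContentTypes: ['text/markdown'], read: (d) => d.split('\n') },
];

const writers: Writer[] = [
  {
    supportedContentTypes: ['text/html'],
    write: (b) => b.map((line) => `<p>${line}</p>`).join(''),
  },
];

// Strip any "; charset=..." parameter before matching, as the handler does.
const normalize = (value: string) => value.split(';')[0];

function dispatch(contentType: string, accept: string, body: string): string {
  const reader = readers.find((r) =>
    r.supportedContentTypes.includes(normalize(contentType)),
  );
  if (!reader) throw new Error('415 Unsupported Content-Type');

  const writer = writers.find((w) =>
    w.supportedContentTypes.includes(normalize(accept)),
  );
  if (!writer) throw new Error('406 Unsupported format');

  return writer.write(reader.read(body));
}

console.log(dispatch('text/markdown; charset=utf-8', 'text/html', 'hello'));
// → <p>hello</p>
```

Adding a new format pair in this style is a data change (push one entry onto `readers` or `writers`) rather than another branch in the handler, which is what keeps the rewritten `convertHandler` short.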


@@ -6815,6 +6815,11 @@ at-least-node@^1.0.0:
   resolved "https://registry.yarnpkg.com/at-least-node/-/at-least-node-1.0.0.tgz#602cd4b46e844ad4effc92a8011a3c46e0238dc2"
   integrity sha512-+q/t7Ekv1EDY2l6Gda6LLiX14rU9TV20Wa3ofeQmwPFZbOMo9DXrLbOjFaaclkXKWidIaopwAObQDqwWtGUjqg==
 
+attr-accept@^2.2.4:
+  version "2.2.5"
+  resolved "https://registry.yarnpkg.com/attr-accept/-/attr-accept-2.2.5.tgz#d7061d958e6d4f97bf8665c68b75851a0713ab5e"
+  integrity sha512-0bDNnY/u6pPwHDMoF0FieU354oBi0a8rD9FcsLwzcGWbc8KS8KPIi7y+s13OlVY+gMWc/9xEMUgNE6Qm8ZllYQ==
+
 available-typed-arrays@^1.0.7:
   version "1.0.7"
   resolved "https://registry.yarnpkg.com/available-typed-arrays/-/available-typed-arrays-1.0.7.tgz#a5cc375d6a03c2efc87a553f3e0b1522def14846"
@@ -8817,6 +8822,13 @@ figlet@1.8.1:
   resolved "https://registry.yarnpkg.com/figlet/-/figlet-1.8.1.tgz#e8e8a07e8c16be24c31086d7d5de8a9b9cf7f0fd"
   integrity sha512-kEC3Sme+YvA8Hkibv0NR1oClGcWia0VB2fC1SlMy027cwe795Xx40Xiv/nw/iFAwQLupymWh+uhAAErn/7hwPg==
 
+file-selector@^2.1.0:
+  version "2.1.2"
+  resolved "https://registry.yarnpkg.com/file-selector/-/file-selector-2.1.2.tgz#fe7c7ee9e550952dfbc863d73b14dc740d7de8b4"
+  integrity sha512-QgXo+mXTe8ljeqUFaX3QVHc5osSItJ/Km+xpocx0aSqWGMSCf6qYs/VnzZgS864Pjn5iceMRFigeAV7AfTlaig==
+  dependencies:
+    tslib "^2.7.0"
+
 file-entry-cache@^11.1.1:
   version "11.1.1"
   resolved "https://registry.yarnpkg.com/file-entry-cache/-/file-entry-cache-11.1.1.tgz#728918c624dbeb09372276837ea0c413ec78806b"
@@ -12786,6 +12798,15 @@ react-dom@*, react-dom@19.2.3:
   dependencies:
     scheduler "^0.27.0"
 
+react-dropzone@14.3.8:
+  version "14.3.8"
+  resolved "https://registry.yarnpkg.com/react-dropzone/-/react-dropzone-14.3.8.tgz#a7eab118f8a452fe3f8b162d64454e81ba830582"
+  integrity sha512-sBgODnq+lcA4P296DY4wacOZz3JFpD99fp+hb//iBO2HHnyeZU3FwWyXJ6salNpqQdsZrgMrotuko/BdJMV8Ug==
+  dependencies:
+    attr-accept "^2.2.4"
+    file-selector "^2.1.0"
+    prop-types "^15.8.1"
+
 react-i18next@16.5.0:
   version "16.5.0"
   resolved "https://registry.yarnpkg.com/react-i18next/-/react-i18next-16.5.0.tgz#107e4323742344a2f8792feb905cea551da6fd2c"