mirror of
https://github.com/suitenumerique/messages.git
synced 2026-04-25 17:15:21 +02:00
✨(import) add support for PST imports & stream data for mbox (#544)
Fixes #183. Replaces #517.
213
docs/pst.md
Normal file
@@ -0,0 +1,213 @@
# PST Import

This document describes how PST (Personal Storage Table) files are imported
into the application.

## Overview

PST files are the native file format for Microsoft Outlook mailboxes. The
import pipeline reads messages from a PST file stored in S3, converts them to
RFC 5322 (EML) format, and delivers them through the standard inbound message
pipeline with `is_import=True`.

**Entry point:** `process_pst_file_task` in `core/services/importer/pst_tasks.py`

## Architecture

```
S3 (message-imports bucket)
        │
        ▼
S3SeekableReader (block-aligned LRU cache)
        │
        ▼
pypff (libpff) ── reads PST B-tree structures
        │
        ▼
pst.py ── folder detection, message extraction, EML reconstruction
        │
        ▼
pst_tasks.py ── Celery task, flag/label mapping, progress reporting
        │
        ▼
deliver_inbound_message(is_import=True)
```

## S3 Seekable Reader

PST files are read directly from S3 without downloading to disk. The
`S3SeekableReader` class (`core/services/importer/s3_seekable.py`) implements
a seekable file-like object backed by S3 range requests.

For PST files, the reader uses the `BUFFER_NONE` strategy with a **block-aligned
LRU cache** — 64 KB blocks with up to 2048 cache slots (128 MB max). This
matches pypff's random-access pattern when traversing PST B-tree structures.

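The caching strategy above can be sketched in a few lines. This is an illustrative simplification, not the actual `S3SeekableReader` code; `fetch_range(start, end)` is a hypothetical stand-in for an S3 `GetObject` call with a `Range: bytes=start-end` header (end inclusive):

```python
from collections import OrderedDict

BLOCK_SIZE = 64 * 1024  # 64 KB blocks, matching the documented cache geometry
MAX_BLOCKS = 2048       # 2048 slots x 64 KB = 128 MB ceiling


class BlockCache:
    """Block-aligned LRU cache over a range-request callable (sketch)."""

    def __init__(self, fetch_range, size):
        self.fetch_range = fetch_range
        self.size = size
        self.blocks = OrderedDict()  # block index -> bytes, LRU order

    def _block(self, idx):
        if idx in self.blocks:
            self.blocks.move_to_end(idx)  # mark most recently used
            return self.blocks[idx]
        start = idx * BLOCK_SIZE
        end = min(start + BLOCK_SIZE, self.size) - 1
        data = self.fetch_range(start, end)  # one aligned range request
        self.blocks[idx] = data
        if len(self.blocks) > MAX_BLOCKS:
            self.blocks.popitem(last=False)  # evict least recently used
        return data

    def read(self, offset, length):
        """Serve an arbitrary range by stitching together cached blocks."""
        out = bytearray()
        while length > 0 and offset < self.size:
            idx, within = divmod(offset, BLOCK_SIZE)
            chunk = self._block(idx)[within : within + length]
            out += chunk
            offset += len(chunk)
            length -= len(chunk)
        return bytes(out)
```

Because pypff seeks back and forth across the B-tree, aligning requests to fixed blocks means repeated nearby reads hit the cache instead of issuing new range requests.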
## Folder Detection

PST files contain a hierarchy of folders. Some are "special" folders (Inbox,
Sent Items, Drafts, etc.) that need specific handling during import. Detection
uses a **3-tier fallback strategy**:

### Tier 1: Message Store Entry IDs

The PST message store (`pst.get_message_store()`) may contain entry ID
properties that directly identify special folders:

| Property                     | Tag      | Folder        |
|------------------------------|----------|---------------|
| `PR_IPM_SUBTREE_ENTRYID`     | `0x35E0` | IPM Subtree   |
| `PR_IPM_OUTBOX_ENTRYID`      | `0x35E2` | Outbox        |
| `PR_IPM_WASTEBASKET_ENTRYID` | `0x35E3` | Deleted Items |
| `PR_IPM_SENTMAIL_ENTRYID`    | `0x35E4` | Sent Items    |

Each entry ID is 24 bytes. The last 4 bytes (LE uint32) contain the folder
identifier, which matches `folder.get_identifier()` from pypff.

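Extracting that identifier is a one-liner with `struct`; `folder_id_from_entryid` is an illustrative helper, not a function from the codebase:

```python
import struct


def folder_id_from_entryid(entry_id: bytes) -> int:
    """Extract the folder identifier from a 24-byte PST entry ID.

    The last 4 bytes are a little-endian uint32 holding the folder NID,
    the same value pypff returns from folder.get_identifier().
    """
    if len(entry_id) != 24:
        raise ValueError("PST entry IDs are expected to be 24 bytes")
    return struct.unpack("<I", entry_id[-4:])[0]
```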
`PR_VALID_FOLDER_MASK` (`0x35DF`) indicates which entry IDs are valid:

| Bit | Folder          |
|-----|-----------------|
| 0   | IPM Subtree     |
| 1   | Inbox           |
| 2   | Outbox          |
| 3   | Deleted Items   |
| 4   | Sent Items      |
| 7   | Finder / Search |

In locally-created Outlook PSTs, all bits are typically set (`0xFF`) and Tier
1 detects all special folders. In Exchange/O365-exported PSTs, only some bits
may be set — for example `0x89` (subtree + wastebasket + finder only), meaning
Sent Items, Outbox, and Inbox entry IDs are absent and **Tier 1 only detects
Deleted Items**. The remaining folders must be detected by Tier 2 or 3.

| PST type                | `PR_VALID_FOLDER_MASK` | Tier 1 detects         | Needs fallback for          |
|-------------------------|------------------------|------------------------|-----------------------------|
| Local Outlook           | `0xFF`                 | All special folders    | Nothing                     |
| Exchange/O365 migration | `0x89`                 | Deleted Items only     | Inbox, Sent, Outbox, Drafts |
| Other/unknown           | varies                 | Whatever IDs are valid | Anything missing            |

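Decoding the mask follows directly from the bit table above; this illustrative helper shows why `0x89` leaves most special folders undetected by Tier 1:

```python
# Bit positions taken from the PR_VALID_FOLDER_MASK table above
MASK_BITS = {
    0: "IPM Subtree",
    1: "Inbox",
    2: "Outbox",
    3: "Deleted Items",
    4: "Sent Items",
    7: "Finder / Search",
}


def folders_with_valid_entryids(mask: int) -> list:
    """Return the special folders whose entry IDs the mask marks as valid."""
    return [name for bit, name in sorted(MASK_BITS.items()) if mask & (1 << bit)]
```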
### Tier 2: SourceWellKnownFolderType (Named Property)

PSTs exported by Microsoft's migration tools (Exchange/O365) contain a named
property that identifies special folders. This covers exactly the gap left by
Tier 1 on Exchange PSTs — Inbox, Sent Items, Outbox, and Drafts:

- **GUID:** `{9137a2fd-2fa5-4409-91aa-2c3ee697350a}`
- **Name:** `SourceWellKnownFolderType`

This is a named property, so its tag varies between PST files (resolved via
the Name-to-ID Map at NID `0x61`). The values are:

| Value | Folder Type   |
|-------|---------------|
| 10    | Inbox         |
| 11    | Sent Items    |
| 12    | Outbox        |
| 14    | Deleted Items |
| 17    | Drafts        |

The resolution process:

1. Read the Entry Stream, GUID Stream, and String Stream from the
   Name-to-ID Map
2. Parse NAMEID records (8 bytes each) to find the one matching the target
   GUID + string name
3. Calculate the NPID: `0x8000 + wPropIdx`
4. Read that property tag from each folder

This tier is only available on Exchange/O365 migration PSTs. It is absent from
locally-created Outlook PSTs (which have all entry IDs via Tier 1 anyway).

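Steps 2 and 3 can be sketched as follows. This is a heavily simplified illustration, assuming the NAMEID layout from the MS-PST specification (4-byte `dwPropertyID`, a 16-bit field with the string/numeric kind flag in bit 0 and the GUID index in the upper 15 bits, then a 2-byte `wPropIdx`) and a String Stream of length-prefixed UTF-16-LE names; it glosses over GUID Stream resolution by taking a precomputed GUID index, and `find_npid` is not a function from the codebase:

```python
import struct
from typing import Optional


def find_npid(entry_stream: bytes, string_stream: bytes,
              target_name: str, target_guid_index: int) -> Optional[int]:
    """Scan 8-byte NAMEID records for a named property and compute its NPID."""
    for off in range(0, len(entry_stream) - 7, 8):
        dw_prop, w_guid, w_prop_idx = struct.unpack_from("<IHH", entry_stream, off)
        is_string_named = w_guid & 0x0001      # kind flag: string vs numeric name
        guid_index = w_guid >> 1               # index into the GUID Stream
        if not is_string_named or guid_index != target_guid_index:
            continue
        # String Stream entry: 4-byte byte count, then UTF-16-LE characters
        (name_len,) = struct.unpack_from("<I", string_stream, dw_prop)
        name = string_stream[dw_prop + 4 : dw_prop + 4 + name_len].decode("utf-16-le")
        if name == target_name:
            return 0x8000 + w_prop_idx         # step 3: NPID
    return None
```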
### Tier 3: Folder Name Matching

As a final fallback, folder names are matched against a dictionary of known
Outlook default folder names in multiple languages (English, French, German,
Spanish, Italian, Dutch, Portuguese, Russian, Polish, Czech, Hungarian, Danish,
Norwegian, Swedish, Finnish, Turkish, Japanese, Chinese, Korean, Arabic,
Hebrew, Ukrainian, Romanian).

This matching is only applied to **direct children of the IPM subtree** to
avoid false positives on user-created subfolders with coincidental names.

### Detection Priority

For each folder, detection is attempted in this order:

1. Entry ID match (Tier 1) — highest priority
2. SourceWellKnownFolderType match (Tier 2)
3. Folder name match (Tier 3) — only for IPM subtree direct children
4. Normal folder (no special handling)

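The priority chain above can be sketched as a simple fall-through. All argument names here are illustrative stand-ins, not the codebase's API: `entryid_map` maps pypff folder identifiers to types (Tier 1), `npid_values` maps identifiers to `SourceWellKnownFolderType` values (Tier 2), and `name_map` maps lowercased localized names to types (Tier 3):

```python
WKF_TYPES = {10: "inbox", 11: "sent", 12: "outbox", 14: "trash", 17: "drafts"}


def detect_special_folder(folder, entryid_map, npid_values, name_map, is_ipm_child):
    """Apply the three detection tiers in priority order (sketch)."""
    ident = folder["identifier"]
    if ident in entryid_map:          # Tier 1: message store entry IDs
        return entryid_map[ident]
    wkf = npid_values.get(ident)
    if wkf in WKF_TYPES:              # Tier 2: SourceWellKnownFolderType
        return WKF_TYPES[wkf]
    if is_ipm_child:                  # Tier 3: only IPM subtree direct children
        match = name_map.get(folder["name"].lower())
        if match:
            return match
    return None                       # normal folder, no special handling
```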
## IPM Subtree

The "Top of Personal Folders" wrapper folder is skipped by locating the IPM
subtree via `PR_IPM_SUBTREE_ENTRYID` on the message store. All folder
iteration starts from the IPM subtree, not the root folder. This excludes
internal folders like "Freebusy Data", "Search Root", etc.

## Message Processing

### EML Reconstruction

Each pypff message is converted to RFC 5322 format by `reconstruct_eml()`:

1. **Transport headers** (if available): Used for threading headers
   (`Message-ID`, `In-Reply-To`, `References`), sender, recipients, date
2. **MAPI properties** (fallback): Sender from `PR_SENDER_*` properties,
   recipients from the recipient table, date from `delivery_time` /
   `client_submit_time`
3. **Body**: Plain text and/or HTML body
4. **Attachments**: Filenames from `PR_ATTACH_LONG_FILENAME` / `PR_ATTACH_FILENAME`
   MAPI properties, with MIME type from `PR_ATTACH_MIME_TAG`

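Python's standard `email` library can illustrate the assembly step. `build_eml` below is a simplified stand-in for `reconstruct_eml()`, assuming parts have already been extracted; it ignores transport-header passthrough, HTML bodies, and threading headers. `attachments` is a list of `(filename, mime_type, payload)` tuples, the shape suggested by the `PR_ATTACH_*` properties above:

```python
from datetime import datetime, timezone
from email.message import EmailMessage
from email.utils import format_datetime


def build_eml(sender, recipients, subject, date, text_body, attachments):
    """Assemble an RFC 5322 message from extracted MAPI parts (sketch)."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg["Subject"] = subject
    msg["Date"] = format_datetime(date)
    msg.set_content(text_body)
    for filename, mime_type, payload in attachments:
        maintype, _, subtype = mime_type.partition("/")
        msg.add_attachment(
            payload, maintype=maintype, subtype=subtype, filename=filename
        )
    return msg.as_bytes()
```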
### Sender Resolution

For Exchange (EX) address types, SMTP addresses are resolved in order:

1. `PR_SENDER_SMTP_ADDRESS`
2. `PR_SENDER_EMAIL_ADDRESS` (if it contains `@`)
3. `sender_name` parsed as email
4. Store owner email from message store `PR_DISPLAY_NAME` (fallback for
   Exchange sent items with no SMTP address)

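The fallback chain is a first-match-wins scan; `resolve_smtp_sender` is an illustrative helper, not the codebase's function:

```python
def resolve_smtp_sender(smtp_address, email_address, sender_name, store_display_name):
    """Pick the first usable SMTP address in the documented order (sketch)."""
    if smtp_address:                                    # 1. PR_SENDER_SMTP_ADDRESS
        return smtp_address
    if email_address and "@" in email_address:          # 2. PR_SENDER_EMAIL_ADDRESS
        return email_address
    if sender_name and "@" in sender_name:              # 3. sender_name as email
        return sender_name.strip("<>")
    if store_display_name and "@" in store_display_name:  # 4. store owner fallback
        return store_display_name
    return None
```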
### Flag and Label Mapping

| Folder Type | IMAP Label  | IMAP Flags | `is_import_sender` |
|-------------|-------------|------------|--------------------|
| Inbox       | *(none)*    |            | `False`            |
| Sent Items  | `Sent`      |            | `True`             |
| Drafts      | *(none)*    | `\Draft`   | `False`            |
| Deleted     | `Trash`     |            | `False`            |
| Outbox      | `OUTBOX`    |            | `True`             |
| Normal      | folder path |            | `False`            |

**Subfolders of special folders** inherit the parent's special type (so they
keep `is_import_sender`, IMAP flags, etc.) and get their own subfolder name as
an additional label. For example, a "Sent Items/Archives 2024" subfolder yields
`is_import_sender=True` + labels `Sent` and `Archives 2024`. Deeper nesting
builds hierarchical paths: "Sent Items/Projects/Work" yields labels `Sent` and
`Projects/Work`.

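The label inheritance described above can be sketched as follows. `labels_for_folder` and its arguments are illustrative: `special_roots` maps a special top-level folder name to its label (or `None` for Inbox/Drafts, which map to no label per the table):

```python
def labels_for_folder(path_parts, special_roots):
    """Compute import labels for a folder path like ["Sent Items", "Projects", "Work"]."""
    root = path_parts[0]
    if root in special_roots:
        labels = []
        mapped = special_roots[root]
        if mapped:                               # Inbox/Drafts map to no label
            labels.append(mapped)
        if len(path_parts) > 1:                  # subfolder becomes an extra label
            labels.append("/".join(path_parts[1:]))
        return labels
    return ["/".join(path_parts)]                # normal folder: full path label
```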
Per-message flags from MAPI properties:

- `MSGFLAG_READ` → `\Seen`
- `MSGFLAG_UNSENT` or Drafts folder → `\Draft`
- `FLAG_STATUS >= 2` (follow-up) → `\Flagged`

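As a sketch, using the standard MAPI bit values for these flags (`MSGFLAG_READ = 0x0001`, `MSGFLAG_UNSENT = 0x0008`); `imap_flags_for_message` is an illustrative helper:

```python
# Standard MAPI message-flag bits
MSGFLAG_READ = 0x0001
MSGFLAG_UNSENT = 0x0008


def imap_flags_for_message(message_flags: int, flag_status: int, in_drafts: bool):
    """Map MAPI per-message properties to the IMAP flags listed above."""
    flags = []
    if message_flags & MSGFLAG_READ:
        flags.append("\\Seen")
    if message_flags & MSGFLAG_UNSENT or in_drafts:
        flags.append("\\Draft")
    if flag_status >= 2:  # marked for follow-up
        flags.append("\\Flagged")
    return flags
```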
### Chronological Ordering

Messages are collected in a first pass (lightweight metadata only), sorted by
`delivery_time` (oldest first, `None` timestamps last), then reconstructed to
EML one at a time in the second pass. This ensures proper threading while
limiting memory usage.

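The "oldest first, `None` last" ordering can be expressed with a tuple sort key; a minimal sketch over illustrative metadata dicts:

```python
from datetime import datetime, timezone

EPOCH = datetime.min.replace(tzinfo=timezone.utc)


def chronological(messages):
    """Sort metadata records oldest-first, pushing None timestamps to the end."""
    # (is_none, timestamp): False sorts before True, so dated messages come first
    return sorted(
        messages,
        key=lambda m: (m["delivery_time"] is None, m["delivery_time"] or EPOCH),
    )
```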
## Key Files

| File | Description |
|------|-------------|
| `core/services/importer/pst.py` | PST parsing, folder detection, EML reconstruction |
| `core/services/importer/pst_tasks.py` | Celery task, flag/label logic, progress reporting |
| `core/services/importer/s3_seekable.py` | S3-backed seekable file reader with LRU cache |
| `core/tests/importer/test_pst_import.py` | Unit and integration tests |

## Dependencies

- **pypff** (libpff-python): Python bindings for reading PST files. Requires
  `build-essential` for compilation in Docker.

@@ -28,7 +28,7 @@ Tasks are routed to specific queues based on their type. Queues are listed in pr
| 2 | `inbound` | Inbound email processing (time-sensitive) |
| 3 | `outbound` | Outbound email sending and retries |
| 4 | `default` | General tasks (fallback for unrouted tasks) |
-| 5 | `imports` | File import processing (MBOX, EML, IMAP) |
+| 5 | `imports` | File import processing (MBOX, EML, PST, IMAP) |
| 6 (lowest) | `reindex` | Search indexing |

### Queue Routing

@@ -39,7 +39,10 @@ Tasks are automatically routed to queues based on their module:
|-------------|-------|
| `core.mda.inbound_tasks.*` | `inbound` |
| `core.mda.outbound_tasks.*` | `outbound` |
| `core.services.importer.tasks.*` | `imports` |
| `core.services.importer.mbox_tasks.*` | `imports` |
| `core.services.importer.eml_tasks.*` | `imports` |
| `core.services.importer.imap_tasks.*` | `imports` |
| `core.services.importer.pst_tasks.*` | `imports` |
| `core.services.search.tasks.*` | `reindex` |
| Everything else | `default` |

@@ -22,7 +22,10 @@ apt-get update
DEBIAN_FRONTEND="noninteractive" apt-get install -y --no-install-recommends \
    curl \
    rdfind \
-    libmagic1
+    libmagic1 \
+    build-essential \
+    python3-dev \
+    zlib1g-dev
rm -rf /var/lib/apt/lists/*
EOR

@@ -559,6 +559,7 @@ class MessageAdmin(admin.ModelAdmin):
                recipient=recipient,
                user=request.user,
                request=request,
+                filename=import_file.name,
            )
            if success:
                return redirect("..")

@@ -1538,7 +1538,7 @@
        "/api/v1.0/import/file/": {
            "post": {
                "operationId": "import_file_create",
-                "description": "\n Import messages by uploading an EML or MBOX file.\n\n The import is processed asynchronously and returns a task ID for tracking.\n The file must be a valid EML or MBOX format. The recipient mailbox must exist\n and the user must have access to it.\n ",
+                "description": "\n Import messages by uploading an EML, MBOX, or PST file.\n\n The import is processed asynchronously and returns a task ID for tracking.\n The file must be a valid EML, MBOX, or PST format. The recipient mailbox must exist\n and the user must have access to it.\n ",
                "tags": [
                    "import"
                ],

@@ -1580,7 +1580,7 @@
                },
                "type": {
                    "type": "string",
-                    "description": "Type of import (eml or mbox)"
+                    "description": "Type of import (eml, mbox, or pst)"
                }
            }
        }

@@ -1334,7 +1334,9 @@ class ImportFileUploadSerializer(ImportBaseSerializer):
    def validate_content_type(self, value):
        """Validate content type."""
        if value not in enums.ARCHIVE_SUPPORTED_MIME_TYPES:
-            raise serializers.ValidationError("Only EML and MBOX files are supported.")
+            raise serializers.ValidationError(
+                "Only EML, MBOX, and PST files are supported."
+            )
        return value

@@ -1,4 +1,4 @@
-"""API ViewSet for importing messages via EML, MBOX, or IMAP."""
+"""API ViewSet for importing messages via EML, MBOX, PST, or IMAP."""

from django.core.files.storage import storages
from django.shortcuts import get_object_or_404

@@ -27,10 +27,10 @@ from ..serializers import (
@extend_schema(tags=["import"])
class ImportViewSet(viewsets.ViewSet):
    """
-    ViewSet for importing messages via EML/MBOX file or IMAP.
+    ViewSet for importing messages via EML/MBOX/PST file or IMAP.

    This ViewSet provides endpoints for importing messages from:
-    - EML/MBOX files uploaded directly
+    - EML/MBOX/PST files uploaded directly
    - IMAP servers with configurable connection settings

    All imports are processed asynchronously and return a task ID for tracking.

@@ -53,7 +53,7 @@ class ImportViewSet(viewsets.ViewSet):
            },
            "type": {
                "type": "string",
-                "description": "Type of import (eml or mbox)",
+                "description": "Type of import (eml, mbox, or pst)",
            },
        },
    },

@@ -65,16 +65,16 @@ class ImportViewSet(viewsets.ViewSet):
            404: OpenApiResponse(description="Specified mailbox not found"),
        },
        description="""
-        Import messages by uploading an EML or MBOX file.
+        Import messages by uploading an EML, MBOX, or PST file.

        The import is processed asynchronously and returns a task ID for tracking.
-        The file must be a valid EML or MBOX format. The recipient mailbox must exist
+        The file must be a valid EML, MBOX, or PST format. The recipient mailbox must exist
        and the user must have access to it.
        """,
    )
    @action(detail=False, methods=["post"], url_path="file")
    def import_file(self, request):
-        """Import messages by uploading an EML or MBOX file."""
+        """Import messages by uploading an EML, MBOX, or PST file."""
        serializer = ImportFileSerializer(data=request.data)
        serializer.is_valid(raise_exception=True)
        recipient_id = serializer.validated_data["recipient"]

@@ -85,6 +85,7 @@ class ImportViewSet(viewsets.ViewSet):
            file_key=file_key,
            recipient=mailbox,
            user=request.user,
+            filename=serializer.validated_data["filename"],
        )

        if not success:

@@ -148,7 +148,10 @@ MBOX_SUPPORTED_MIME_TYPES = [
    "text/plain",
    "application/mbox",
]
-ARCHIVE_SUPPORTED_MIME_TYPES = EML_SUPPORTED_MIME_TYPES + MBOX_SUPPORTED_MIME_TYPES
+PST_SUPPORTED_MIME_TYPES = ["application/vnd.ms-outlook"]
+ARCHIVE_SUPPORTED_MIME_TYPES = (
+    EML_SUPPORTED_MIME_TYPES + MBOX_SUPPORTED_MIME_TYPES + PST_SUPPORTED_MIME_TYPES
+)

BLACKLISTED_PROXY_IMAGE_MIME_TYPES = [
    "image/svg+xml",  # Can contain JavaScript and external references

@@ -6,12 +6,12 @@ from core.models import Mailbox


class MessageImportForm(forms.Form):
-    """Form for importing EML or MBOX files in the admin interface."""
+    """Form for importing EML, MBOX, or PST files in the admin interface."""

    import_file = forms.FileField(
        label="Import File",
-        help_text="Select an EML or MBOX file to import",
-        widget=forms.FileInput(attrs={"accept": ".eml,.mbox,mbox"}),
+        help_text="Select an EML, MBOX, or PST file to import",
+        widget=forms.FileInput(attrs={"accept": ".eml,.mbox,mbox,.pst"}),
    )
    recipient = forms.ModelChoiceField(
        queryset=Mailbox.objects.all(),

@@ -27,9 +27,9 @@ class MessageImportForm(forms.Form):
        if not file:
            return None

-        if not file.name.endswith((".eml", ".mbox", "mbox")):
+        if not file.name.endswith((".eml", ".mbox", "mbox", ".pst")):
            raise forms.ValidationError(
-                "File must be either an EML (.eml) or MBOX (.mbox) file or named 'mbox'"
+                "File must be an EML (.eml), MBOX (.mbox), or PST (.pst) file or named 'mbox'"
            )
        return file

@@ -141,8 +141,15 @@ def set_basic_headers(message_part, jmap_data, in_reply_to=None):
                date, datetime.timezone.utc
            )  # Use Django's timezone utils
        except (ValueError, TypeError):
-            # Default to current time if parsing fails or type is wrong
-            date = datetime.datetime.now(datetime.timezone.utc)
+            # fromisoformat failed — try RFC5322 format (e.g. from PST transport headers)
+            try:
+                date = parsedate_to_datetime(date)
+                # Ensure timezone-aware (date strings without a timezone yield a naive datetime)
+                if date.tzinfo is None or date.tzinfo.utcoffset(date) is None:
+                    date = timezone.make_aware(date, datetime.timezone.utc)
+            except (ValueError, TypeError, IndexError):
+                # Default to current time if all parsing fails
+                date = datetime.datetime.now(datetime.timezone.utc)
    elif isinstance(date, datetime.datetime):
        # Ensure provided datetime is timezone-aware
        if date.tzinfo is None or date.tzinfo.utcoffset(date) is None:

139
src/backend/core/services/importer/eml_tasks.py
Normal file
@@ -0,0 +1,139 @@
"""EML file import task."""

# pylint: disable=broad-exception-caught
from typing import Any, Dict

from django.conf import settings
from django.core.files.storage import storages

from celery.utils.log import get_task_logger
from sentry_sdk import capture_exception

from core.mda.inbound import deliver_inbound_message
from core.mda.rfc5322 import parse_email_message
from core.models import Mailbox

from messages.celery_app import app as celery_app

logger = get_task_logger(__name__)


@celery_app.task(bind=True)
def process_eml_file_task(self, file_key: str, recipient_id: str) -> Dict[str, Any]:
    """
    Process an EML file asynchronously.

    Args:
        file_key: The storage key of the EML file
        recipient_id: The UUID of the recipient mailbox

    Returns:
        Dict with task status and result
    """
    try:
        recipient = Mailbox.objects.get(id=recipient_id)
    except Mailbox.DoesNotExist:
        error_msg = f"Recipient mailbox {recipient_id} not found"
        result = {
            "message_status": "Failed to process message",
            "total_messages": 1,
            "success_count": 0,
            "failure_count": 1,
            "type": "eml",
            "current_message": 0,
        }
        return {
            "status": "FAILURE",
            "result": result,
            "error": error_msg,
        }

    try:
        # Update progress state
        self.update_state(
            state="PROGRESS",
            meta={
                "result": {
                    "message_status": "Processing message 1 of 1",
                    "total_messages": 1,
                    "success_count": 0,
                    "failure_count": 0,
                    "type": "eml",
                    "current_message": 1,
                },
                "error": None,
            },
        )

        # Get storage and read file
        message_imports_storage = storages["message-imports"]
        with message_imports_storage.open(file_key, "rb") as file:
            file_content = file.read()

        # Check message size limit
        if len(file_content) > settings.MAX_INCOMING_EMAIL_SIZE:
            error_msg = f"File too large: {len(file_content)} bytes"
            logger.warning("Skipping oversized EML file: %d bytes", len(file_content))
            result = {
                "message_status": "Failed to process message",
                "total_messages": 1,
                "success_count": 0,
                "failure_count": 1,
                "type": "eml",
                "current_message": 1,
            }
            return {
                "status": "FAILURE",
                "result": result,
                "error": error_msg,
            }

        # Parse the email message
        parsed_email = parse_email_message(file_content)
        # Deliver the message
        success = deliver_inbound_message(
            str(recipient), parsed_email, file_content, is_import=True
        )

        result = {
            "message_status": "Completed processing message",
            "total_messages": 1,
            "success_count": 1 if success else 0,
            "failure_count": 0 if success else 1,
            "type": "eml",
            "current_message": 1,
        }

        if success:
            return {
                "status": "SUCCESS",
                "result": result,
                "error": None,
            }

        return {
            "status": "FAILURE",
            "result": result,
            "error": "Failed to deliver message",
        }

    except Exception as e:
        capture_exception(e)
        logger.exception(
            "Error processing EML file for recipient %s: %s",
            recipient_id,
            e,
        )
        result = {
            "message_status": "Failed to process message",
            "total_messages": 1,
            "success_count": 0,
            "failure_count": 1,
            "type": "eml",
            "current_message": 1,
        }
        return {
            "status": "FAILURE",
            "result": result,
            "error": "An error occurred while processing the EML file.",
        }

@@ -479,25 +479,32 @@ def process_folder_messages(  # pylint: disable=too-many-arguments
                # Fetch message with flags using retry logic
                flags, raw_email = _fetch_message_with_flags_retry(imap_connection, msg_num)

-                # Parse message
-                parsed_email = parse_email_message(raw_email)
-
-                # TODO: better heuristic to determine if the message is from the sender
-                is_sender = parsed_email["from"]["email"].lower() == username.lower()
-
-                # Deliver message
-                if deliver_inbound_message(
-                    str(recipient),
-                    parsed_email,
-                    raw_email,
-                    is_import=True,
-                    is_import_sender=is_sender,
-                    imap_labels=[display_name],
-                    imap_flags=flags,
-                ):
-                    success_count += 1
-                else:
+                # Check message size limit
+                if len(raw_email) > settings.MAX_INCOMING_EMAIL_SIZE:
+                    logger.warning(
+                        "Skipping oversized IMAP message: %d bytes", len(raw_email)
+                    )
+                    failure_count += 1
+                else:
+                    # Parse message
+                    parsed_email = parse_email_message(raw_email)
+
+                    # TODO: better heuristic to determine if the message is from the sender
+                    is_sender = parsed_email["from"]["email"].lower() == username.lower()
+
+                    # Deliver message
+                    if deliver_inbound_message(
+                        str(recipient),
+                        parsed_email,
+                        raw_email,
+                        is_import=True,
+                        is_import_sender=is_sender,
+                        imap_labels=[display_name],
+                        imap_flags=flags,
+                    ):
+                        success_count += 1
+                    else:
                        failure_count += 1

            except Exception as e:
                logger.exception(

156
src/backend/core/services/importer/imap_tasks.py
Normal file
@@ -0,0 +1,156 @@
"""IMAP import task."""

# pylint: disable=broad-exception-caught
from typing import Any, Dict

from celery.utils.log import get_task_logger

from core.models import Mailbox

from messages.celery_app import app as celery_app

from .imap import (
    IMAPConnectionManager,
    create_folder_mapping,
    get_message_numbers,
    get_selectable_folders,
    process_folder_messages,
    select_imap_folder,
)

logger = get_task_logger(__name__)


@celery_app.task(bind=True)
def import_imap_messages_task(
    self,
    imap_server: str,
    imap_port: int,
    username: str,
    password: str,
    use_ssl: bool,
    recipient_id: str,
) -> Dict[str, Any]:
    """Import messages from an IMAP server.

    Args:
        imap_server: IMAP server hostname
        imap_port: IMAP server port
        username: Email address for login
        password: Password for login
        use_ssl: Whether to use SSL
        recipient_id: ID of the recipient mailbox

    Returns:
        Dict with task status and result
    """
    success_count = 0
    failure_count = 0
    total_messages = 0
    current_message = 0

    try:
        # Get recipient mailbox
        recipient = Mailbox.objects.get(id=recipient_id)

        # Connect to IMAP server using context manager
        with IMAPConnectionManager(
            imap_server, imap_port, username, password, use_ssl
        ) as imap:
            # Get selectable folders
            selectable_folders = get_selectable_folders(imap, username, imap_server)

            # Process all folders
            folders_to_process = selectable_folders

            # Create folder mapping
            folder_mapping = create_folder_mapping(
                selectable_folders, username, imap_server
            )

            # Count total messages and cache message lists per folder
            folder_messages = {}
            for folder_name in folders_to_process:
                if select_imap_folder(imap, folder_name):
                    message_list = get_message_numbers(
                        imap, folder_name, username, imap_server
                    )
                    if message_list:
                        folder_messages[folder_name] = message_list
                        total_messages += len(message_list)

            # Process each folder (reusing cached message lists)
            for folder_to_process in folders_to_process:
                if folder_to_process not in folder_messages:
                    continue

                display_name = folder_mapping.get(folder_to_process, folder_to_process)
                message_list = folder_messages[folder_to_process]

                # Re-select folder for processing
                if not select_imap_folder(imap, folder_to_process):
                    logger.warning(
                        "Skipping folder %s - could not select it", folder_to_process
                    )
                    continue

                # Process messages in this folder
                success_count, failure_count, current_message = process_folder_messages(
                    imap_connection=imap,
                    folder=folder_to_process,
                    display_name=display_name,
                    message_list=message_list,
                    recipient=recipient,
                    username=username,
                    task_instance=self,
                    success_count=success_count,
                    failure_count=failure_count,
                    current_message=current_message,
                    total_messages=total_messages,
                )

        # Determine appropriate message status
        if len(folders_to_process) == 1:
            # If only one folder was processed, show which folder it was
            actual_folder = folders_to_process[0]
            message_status = (
                f"Completed processing messages from folder '{actual_folder}'"
            )
        else:
            message_status = "Completed processing messages from all folders"

        result = {
            "message_status": message_status,
            "total_messages": total_messages,
            "success_count": success_count,
            "failure_count": failure_count,
            "type": "imap",
            "current_message": current_message,
        }

        return {"status": "SUCCESS", "result": result, "error": None}

    except Mailbox.DoesNotExist:
        error_msg = f"Recipient mailbox {recipient_id} not found"
        result = {
            "message_status": "Failed to process messages",
            "total_messages": 0,
            "success_count": 0,
            "failure_count": 0,
            "type": "imap",
            "current_message": 0,
        }
        return {"status": "FAILURE", "result": result, "error": error_msg}

    except Exception as e:
        logger.exception("Error in import_imap_messages_task: %s", e)

        result = {
            "message_status": "Failed to process messages",
            "total_messages": total_messages,
            "success_count": success_count,
            "failure_count": failure_count,
            "type": "imap",
            "current_message": current_message,
        }
        return {"status": "FAILURE", "result": result, "error": str(e)}

||||
@@ -104,6 +104,10 @@ def compute_labels_and_flags(
|
||||
if "\\Draft" in imap_flags:
|
||||
message_flags["is_draft"] = True
|
||||
|
||||
# Handle \\Flagged flag (follow-up / starred)
|
||||
if "\\Flagged" in imap_flags:
|
||||
message_flags["is_starred"] = True
|
||||
|
||||
# Special case: if message is sender or draft, it should not be unread
|
||||
if message_flags.get("is_sender") or message_flags.get("is_draft"):
|
||||
message_flags["is_unread"] = False
|
||||
|
||||
352
src/backend/core/services/importer/mbox_tasks.py
Normal file
@@ -0,0 +1,352 @@
"""Mbox file import task."""

# pylint: disable=broad-exception-caught
import io
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Any, Dict, List, Optional

from django.conf import settings
from django.core.files.storage import storages

from celery.utils.log import get_task_logger
from sentry_sdk import capture_exception

from core.mda.inbound import deliver_inbound_message
from core.mda.rfc5322 import parse_email_message
from core.mda.rfc5322.parser import parse_date
from core.models import Mailbox

from messages.celery_app import app as celery_app

from .s3_seekable import BUFFER_CENTERED, S3SeekableReader

logger = get_task_logger(__name__)


@dataclass
class MboxMessageIndex:
    """Index entry for a single message inside an mbox file."""

    start_byte: int
    end_byte: int
    date: Optional[datetime] = None


def extract_date_from_headers(raw_message: bytes) -> Optional[datetime]:
    """Extract the Date header from raw message bytes (headers only, fast).

    Reads only until the first blank line (end of headers) to avoid
    parsing the entire message body. Handles RFC 5322 folded headers
    (continuation lines starting with whitespace).
    """
    # Find the end of headers (first blank line)
    header_end = raw_message.find(b"\r\n\r\n")
    if header_end == -1:
        header_end = raw_message.find(b"\n\n")
        if header_end == -1:
            header_end = len(raw_message)

    headers = raw_message[:header_end]

    # Unfold headers: continuation lines start with whitespace (RFC 5322 §2.2.3)
    unfolded = headers.replace(b"\r\n ", b" ").replace(b"\r\n\t", b" ")
    unfolded = unfolded.replace(b"\n ", b" ").replace(b"\n\t", b" ")

    # Parse the Date header
    for line in unfolded.split(b"\n"):
        line_str = line.decode("utf-8", errors="replace").strip()
        if line_str.lower().startswith("date:"):
            date_value = line_str[5:].strip()
            return parse_date(date_value)

    return None


def index_mbox_messages(
    file,
    chunk_size: int = 65536,
    initial_buffer: bytes = b"",
    initial_offset: int = 0,
) -> List[MboxMessageIndex]:
    """Index all messages in an mbox file by scanning for 'From ' separators.

    Returns a list of MboxMessageIndex with byte offsets and parsed dates.
    The file object must support read() and optionally seek().
    """
    indices: List[MboxMessageIndex] = []
    # We need to scan through the file finding "From " lines at line starts
    buffer = initial_buffer
    file_offset = initial_offset  # tracks where buffer starts in the file
    message_start: Optional[int] = None
    scan_pos = 0  # position within buffer to scan from

    while True:
        # Read more data if needed
        if scan_pos >= len(buffer) - 5:
            new_data = file.read(chunk_size)
            if not new_data:
                break
            # Keep unprocessed tail
            buffer = buffer[scan_pos:] + new_data
            file_offset += scan_pos
            scan_pos = 0

        # Find next newline to process line by line
        nl = buffer.find(b"\n", scan_pos)
        if nl == -1:
            # No complete line yet, read more
            new_data = file.read(chunk_size)
            if not new_data:
                break
            buffer = buffer[scan_pos:] + new_data
            file_offset += scan_pos
            scan_pos = 0
            continue

        line_start_abs = file_offset + scan_pos
        line = buffer[scan_pos : nl + 1]

        if line.startswith(b"From "):
            if message_start is not None:
                # End previous message (exclusive of this From line)
                msg_end = line_start_abs - 1
                # Read headers to extract date
                _extract_and_store_index(
                    file, indices, message_start, msg_end, buffer, file_offset
                )
            # Start new message (content begins after the "From " line)
            message_start = line_start_abs + len(line)

        scan_pos = nl + 1

    # Handle last message
    if message_start is not None:
        # Get file end position
        current_pos = file.tell()
        file.seek(0, io.SEEK_END)
        file_end = file.tell()
||||
total_end = file_end - 1
|
||||
# Restore position for _extract_and_store_index
|
||||
file.seek(current_pos)
|
||||
if total_end >= message_start:
|
||||
_extract_and_store_index(
|
||||
file,
|
||||
indices,
|
||||
message_start,
|
||||
total_end,
|
||||
buffer[scan_pos:] if scan_pos < len(buffer) else b"",
|
||||
file_offset + scan_pos,
|
||||
)
|
||||
|
||||
return indices
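The separator scan can be seen in miniature on an in-memory mbox. This standalone sketch (names are illustrative; it ignores the real function's chunked buffering and any `>From ` quoting concerns) yields the same kind of byte ranges, with each message's content starting just after its `From ` separator line:

```python
import io


def scan_mbox(data: bytes):
    """Return (start, end) inclusive byte ranges of message bodies in a simple mbox."""
    ranges = []
    start = None
    offset = 0
    for line in io.BytesIO(data):
        if line.startswith(b"From "):
            if start is not None:
                ranges.append((start, offset - 1))  # end just before this separator
            start = offset + len(line)  # content begins after the "From " line
        offset += len(line)
    if start is not None:
        ranges.append((start, len(data) - 1))  # last message runs to end of file
    return ranges


mbox = (
    b"From a@x Mon Jan 1\nSubject: one\n\nhi\n"
    b"From b@x Tue Jan 2\nSubject: two\n\nbye\n"
)
ranges = scan_mbox(mbox)
```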


def _extract_and_store_index(
    file, indices, msg_start, msg_end, buffer, buf_file_offset
):
    """Extract date from a message and add an index entry."""
    # Try to read the first 2048 bytes of the message for header parsing
    header_size = min(2048, msg_end - msg_start + 1)

    # Check if the header bytes are in our buffer
    buf_start = buf_file_offset
    buf_end = buf_start + len(buffer) - 1

    if buf_start <= msg_start and msg_start + header_size - 1 <= buf_end:
        offset_in_buf = msg_start - buf_start
        header_bytes = buffer[offset_in_buf : offset_in_buf + header_size]
    else:
        # Need to seek and read
        current_pos = file.tell() if hasattr(file, "tell") else None
        try:
            file.seek(msg_start)
            header_bytes = file.read(header_size)
        finally:
            if current_pos is not None:
                file.seek(current_pos)

    date = extract_date_from_headers(header_bytes)
    indices.append(MboxMessageIndex(start_byte=msg_start, end_byte=msg_end, date=date))


@celery_app.task(bind=True)
def process_mbox_file_task(self, file_key: str, recipient_id: str) -> Dict[str, Any]:
    """
    Process an MBOX file asynchronously using a 2-pass approach.

    Pass 1: Index messages with byte offsets and dates.
    Pass 2: Process messages in chronological order (oldest first).

    Args:
        file_key: The storage key of the MBOX file
        recipient_id: The UUID of the recipient mailbox

    Returns:
        Dict with task status and result
    """
    success_count = 0
    failure_count = 0
    total_messages = 0
    current_message = 0

    try:
        recipient = Mailbox.objects.get(id=recipient_id)
    except Mailbox.DoesNotExist:
        error_msg = f"Recipient mailbox {recipient_id} not found"
        result = {
            "message_status": "Failed to process messages",
            "total_messages": 0,
            "success_count": 0,
            "failure_count": 0,
            "type": "mbox",
            "current_message": 0,
        }
        return {
            "status": "FAILURE",
            "result": result,
            "error": error_msg,
        }

    try:
        # Get storage and create S3 seekable reader
        message_imports_storage = storages["message-imports"]
        s3_client = message_imports_storage.connection.meta.client

        with S3SeekableReader(
            s3_client,
            message_imports_storage.bucket_name,
            file_key,
            buffer_strategy=BUFFER_CENTERED,
        ) as reader:
            self.update_state(
                state="PROGRESS",
                meta={
                    "result": {
                        "message_status": "Indexing messages",
                        "total_messages": None,
                        "success_count": 0,
                        "failure_count": 0,
                        "type": "mbox",
                        "current_message": 0,
                    },
                    "error": None,
                },
            )

            # Pass 1: Index messages
            message_indices = index_mbox_messages(reader)
            total_messages = len(message_indices)

            if total_messages == 0:
                return {
                    "status": "SUCCESS",
                    "result": {
                        "message_status": "Completed processing messages",
                        "total_messages": 0,
                        "success_count": 0,
                        "failure_count": 0,
                        "type": "mbox",
                        "current_message": 0,
                    },
                    "error": None,
                }

            # Sort by date (oldest first, messages without dates go last)
            # Normalize naive datetimes to UTC for safe comparison
            # (parsedate_to_datetime returns naive for "-0000" timezone, aware otherwise)
            _utc = timezone.utc
            _max_date = datetime.max.replace(tzinfo=_utc)
            message_indices.sort(
                key=lambda m: (
                    m.date is None,
                    m.date.replace(tzinfo=_utc)
                    if m.date and m.date.tzinfo is None
                    else (m.date or _max_date),
                )
            )

            # Pass 2: Process messages in chronological order
            for i, msg_index in enumerate(message_indices, 1):
                current_message = i
                try:
                    result = {
                        "message_status": f"Processing message {i} of {total_messages}",
                        "total_messages": total_messages,
                        "success_count": success_count,
                        "failure_count": failure_count,
                        "type": "mbox",
                        "current_message": i,
                    }
                    self.update_state(
                        state="PROGRESS",
                        meta={
                            "result": result,
                            "error": None,
                        },
                    )

                    reader.seek(msg_index.start_byte)
                    message_content = reader.read(
                        msg_index.end_byte - msg_index.start_byte + 1
                    )

                    if len(message_content) > settings.MAX_INCOMING_EMAIL_SIZE:
                        logger.warning(
                            "Skipping oversized message: %d bytes",
                            len(message_content),
                        )
                        failure_count += 1
                        continue

                    parsed_email = parse_email_message(message_content)
                    if deliver_inbound_message(
                        str(recipient), parsed_email, message_content, is_import=True
                    ):
                        success_count += 1
                    else:
                        failure_count += 1
                except Exception as e:
                    capture_exception(e)
                    logger.exception(
                        "Error processing message from mbox file for recipient %s: %s",
                        recipient_id,
                        e,
                    )
                    failure_count += 1

        result = {
            "message_status": "Completed processing messages",
            "total_messages": total_messages,
            "success_count": success_count,
            "failure_count": failure_count,
            "type": "mbox",
            "current_message": current_message,
        }

        return {
            "status": "SUCCESS",
            "result": result,
            "error": None,
        }

    except Exception as e:
        capture_exception(e)
        logger.exception(
            "Error processing MBOX file for recipient %s: %s",
            recipient_id,
            e,
        )
        result = {
            "message_status": "Failed to process messages",
            "total_messages": total_messages,
            "success_count": success_count,
            "failure_count": failure_count,
            "type": "mbox",
            "current_message": current_message,
        }
        return {
            "status": "FAILURE",
            "result": result,
            "error": "An error occurred while processing the MBOX file.",
        }

src/backend/core/services/importer/pst.py (new file, 1035 lines)
File diff suppressed because it is too large.

src/backend/core/services/importer/pst_tasks.py (new file, 257 lines)
@@ -0,0 +1,257 @@
"""PST file import task."""

# pylint: disable=broad-exception-caught
from typing import Any, Dict

from django.conf import settings
from django.core.files.storage import storages

import pypff
from celery.utils.log import get_task_logger
from sentry_sdk import capture_exception

from core.mda.inbound import deliver_inbound_message
from core.mda.rfc5322 import parse_email_message
from core.models import Mailbox

from messages.celery_app import app as celery_app

from .pst import (
    FLAG_STATUS_FOLLOWUP,
    FOLDER_TYPE_DELETED,
    FOLDER_TYPE_DRAFTS,
    FOLDER_TYPE_INBOX,
    FOLDER_TYPE_NORMAL,
    FOLDER_TYPE_OUTBOX,
    FOLDER_TYPE_SENT,
    MSGFLAG_READ,
    MSGFLAG_UNSENT,
    build_special_folder_map,
    count_pst_messages,
    get_store_owner_email,
    sanitize_folder_name,
    walk_pst_messages,
)
from .s3_seekable import BUFFER_NONE, S3SeekableReader

logger = get_task_logger(__name__)


@celery_app.task(bind=True)
def process_pst_file_task(self, file_key: str, recipient_id: str) -> Dict[str, Any]:
    """
    Process a PST file asynchronously.

    Args:
        file_key: The storage key of the PST file
        recipient_id: The UUID of the recipient mailbox

    Returns:
        Dict with task status and result
    """
    success_count = 0
    failure_count = 0
    total_messages = 0
    current_message = 0

    try:
        recipient = Mailbox.objects.get(id=recipient_id)
    except Mailbox.DoesNotExist:
        error_msg = f"Recipient mailbox {recipient_id} not found"
        result = {
            "message_status": "Failed to process messages",
            "total_messages": 0,
            "success_count": 0,
            "failure_count": 0,
            "type": "pst",
            "current_message": 0,
        }
        return {
            "status": "FAILURE",
            "result": result,
            "error": error_msg,
        }

    try:
        message_imports_storage = storages["message-imports"]

        self.update_state(
            state="PROGRESS",
            meta={
                "result": {
                    "message_status": "Initializing import",
                    "total_messages": None,
                    "success_count": 0,
                    "failure_count": 0,
                    "type": "pst",
                    "current_message": 0,
                },
                "error": None,
            },
        )

        # Create S3 seekable reader with block-aligned LRU cache
        # for pypff's random-access B-tree traversal pattern.
        # 64 KB blocks x 2048 cache slots = 128 MB max cache.
        s3_client = message_imports_storage.connection.meta.client
        with S3SeekableReader(
            s3_client,
            message_imports_storage.bucket_name,
            file_key,
            buffer_strategy=BUFFER_NONE,
            buffer_size=64 * 1024,
            buffer_count=2048,
        ) as reader:
            # Open PST file
            pst = pypff.file()
            pst.open_file_object(reader)

            try:
                # Build special folder map and get store owner email
                special_folder_map = build_special_folder_map(pst)
                store_email = get_store_owner_email(pst)

                # Count messages
                total_messages = count_pst_messages(pst, special_folder_map)

                # Iterate messages chronologically
                for (
                    folder_type,
                    folder_path,
                    message_flags,
                    flag_status,
                    eml_bytes,
                ) in walk_pst_messages(
                    pst, special_folder_map, store_email=store_email
                ):
                    current_message += 1
                    try:
                        # Check message size limit
                        if len(eml_bytes) > settings.MAX_INCOMING_EMAIL_SIZE:
                            logger.warning(
                                "Skipping oversized message: %d bytes",
                                len(eml_bytes),
                            )
                            failure_count += 1
                            continue

                        result = {
                            "message_status": (
                                f"Processing message {current_message}"
                                f" of {total_messages}"
                            ),
                            "total_messages": total_messages,
                            "success_count": success_count,
                            "failure_count": failure_count,
                            "type": "pst",
                            "current_message": current_message,
                        }
                        self.update_state(
                            state="PROGRESS",
                            meta={
                                "result": result,
                                "error": None,
                            },
                        )

                        parsed_email = parse_email_message(eml_bytes)

                        # Compute IMAP-compatible flags from PST message flags
                        imap_flags = []
                        if message_flags & MSGFLAG_READ:
                            imap_flags.append("\\Seen")
                        if (
                            message_flags & MSGFLAG_UNSENT
                            or folder_type == FOLDER_TYPE_DRAFTS
                        ):
                            imap_flags.append("\\Draft")
                        if (
                            flag_status is not None
                            and flag_status >= FLAG_STATUS_FOLLOWUP
                        ):
                            imap_flags.append("\\Flagged")

                        # Compute IMAP-compatible labels from folder type
                        imap_labels = []
                        if folder_type == FOLDER_TYPE_SENT:
                            imap_labels.append("Sent")
                        elif folder_type == FOLDER_TYPE_DELETED:
                            imap_labels.append("Trash")
                        elif folder_type == FOLDER_TYPE_OUTBOX:
                            imap_labels.append("OUTBOX")
                        elif folder_type in (
                            FOLDER_TYPE_INBOX,
                            FOLDER_TYPE_DRAFTS,
                        ):
                            pass  # No label for inbox/drafts
                        elif folder_path:
                            imap_labels.append(sanitize_folder_name(folder_path))

                        # Subfolders of special folders also get their
                        # subfolder name as an additional label.
                        if folder_path and folder_type != FOLDER_TYPE_NORMAL:
                            imap_labels.append(sanitize_folder_name(folder_path))

                        is_sender = folder_type in (
                            FOLDER_TYPE_SENT,
                            FOLDER_TYPE_OUTBOX,
                        )

                        if deliver_inbound_message(
                            str(recipient),
                            parsed_email,
                            eml_bytes,
                            is_import=True,
                            is_import_sender=is_sender,
                            imap_labels=imap_labels,
                            imap_flags=imap_flags,
                        ):
                            success_count += 1
                        else:
                            failure_count += 1
                    except Exception as e:
                        capture_exception(e)
                        logger.exception(
                            "Error processing message from PST file for recipient %s: %s",
                            recipient_id,
                            e,
                        )
                        failure_count += 1
            finally:
                pst.close()

        result = {
            "message_status": "Completed processing messages",
            "total_messages": total_messages,
            "success_count": success_count,
            "failure_count": failure_count,
            "type": "pst",
            "current_message": current_message,
        }

        return {
            "status": "SUCCESS",
            "result": result,
            "error": None,
        }

    except Exception as e:
        capture_exception(e)
        logger.exception(
            "Error processing PST file for recipient %s: %s",
            recipient_id,
            e,
        )
        result = {
            "message_status": "Failed to process messages",
            "total_messages": total_messages,
            "success_count": success_count,
            "failure_count": failure_count,
            "type": "pst",
            "current_message": current_message,
        }
        return {
            "status": "FAILURE",
            "result": result,
            "error": "An error occurred while processing the PST file.",
        }
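The flag mapping inside the task reduces to a small pure function. A sketch with assumed constant values — `MSGFLAG_READ` and `MSGFLAG_UNSENT` use the standard MAPI bit values here, and the followup/folder constants are placeholders for the real ones exported by `.pst`:

```python
MSGFLAG_READ = 0x1        # assumed: standard MAPI PR_MESSAGE_FLAGS bit
MSGFLAG_UNSENT = 0x8      # assumed: standard MAPI PR_MESSAGE_FLAGS bit
FLAG_STATUS_FOLLOWUP = 2  # assumed: "follow up" value of the flag-status property
FOLDER_TYPE_DRAFTS = "drafts"  # placeholder for the real constant


def map_imap_flags(message_flags, flag_status, folder_type):
    """Translate PST/MAPI message state into IMAP system flags."""
    flags = []
    if message_flags & MSGFLAG_READ:
        flags.append("\\Seen")
    if message_flags & MSGFLAG_UNSENT or folder_type == FOLDER_TYPE_DRAFTS:
        flags.append("\\Draft")
    if flag_status is not None and flag_status >= FLAG_STATUS_FOLLOWUP:
        flags.append("\\Flagged")
    return flags
```

An unsent message in a Drafts folder and a read, flagged message map the way the task's inline version does.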

src/backend/core/services/importer/s3_seekable.py (new file, 245 lines)
@@ -0,0 +1,245 @@
"""Seekable file-like object backed by S3 range requests."""

import io
import logging
from collections import OrderedDict

logger = logging.getLogger(__name__)

BUFFER_FORWARD = "forward"
BUFFER_CENTERED = "centered"
BUFFER_NONE = "none"


class S3SeekableReader:
    """Seekable file-like object that reads from S3 using range requests.

    Maintains a read-ahead buffer (default 100MB) and makes HTTP Range
    requests as the caller seeks through the file. Supports read(), seek(),
    tell() as required by pypff.open_file_object().

    Buffer strategies:
    - "forward": buffer starts at the read position and extends forward.
      Best for purely sequential access.
    - "centered": buffer is centered around the read position.
      Best for bidirectional or random access (e.g. reading mbox messages
      in chronological order from a file stored in reverse order).
    - "none": block-aligned LRU cache. Each read is served from cached
      blocks of ``buffer_size`` bytes; up to ``buffer_count`` blocks are
      kept in an LRU cache. Best for highly random access patterns
      (e.g. pypff traversing a PST file's B-tree structures).
    """

    def __init__(
        self,
        s3_client,
        bucket,
        key,
        buffer_size=100 * 1024 * 1024,
        buffer_count=1,
        buffer_strategy=BUFFER_FORWARD,
    ):
        if buffer_strategy not in (BUFFER_FORWARD, BUFFER_CENTERED, BUFFER_NONE):
            raise ValueError(
                f"Unknown buffer_strategy: {buffer_strategy!r}. "
                f"Use {BUFFER_FORWARD!r}, {BUFFER_CENTERED!r} or {BUFFER_NONE!r}."
            )
        self._s3_client = s3_client
        self._bucket = bucket
        self._key = key
        self._buffer_size = buffer_size
        self._buffer_count = buffer_count
        self._buffer_strategy = buffer_strategy
        self._position = 0

        # Get file size
        head = s3_client.head_object(Bucket=bucket, Key=key)
        self._size = head["ContentLength"]
        self._fetch_count = 0
        self._cache_hit_count = 0

        # Buffer state (forward/centered use a single buffer, none uses LRU)
        self._buffer = b""
        self._buffer_start = 0
        self._cache = OrderedDict()  # block_index -> bytes

        if buffer_strategy == BUFFER_NONE:
            logger.info(
                "S3SeekableReader opened: %s/%s (%d MB, strategy=%s, "
                "block=%d KB, cache=%d blocks = %d MB)",
                bucket,
                key,
                self._size // (1024 * 1024),
                buffer_strategy,
                buffer_size // 1024,
                buffer_count,
                (buffer_size * buffer_count) // (1024 * 1024),
            )
        else:
            logger.info(
                "S3SeekableReader opened: %s/%s (%d MB, strategy=%s, buffer=%d MB)",
                bucket,
                key,
                self._size // (1024 * 1024),
                buffer_strategy,
                buffer_size // (1024 * 1024),
            )

    @property
    def size(self):
        """Return the total size of the S3 object."""
        return self._size

    def read(self, size=-1):
        """Read up to size bytes from the current position."""
        if size == -1 or size is None:
            size = self._size - self._position

        if self._position >= self._size:
            return b""

        # Clamp to remaining bytes
        size = min(size, self._size - self._position)

        if self._buffer_strategy == BUFFER_NONE:
            return self._read_direct(size)

        result = b""
        remaining = size
        while remaining > 0:
            buffer_end = self._buffer_start + len(self._buffer)
            if not (self._buffer and self._buffer_start <= self._position < buffer_end):
                self._fill_buffer(self._position)
                buffer_end = self._buffer_start + len(self._buffer)
            offset = self._position - self._buffer_start
            available = min(remaining, buffer_end - self._position)
            result += self._buffer[offset : offset + available]
            self._position += available
            remaining -= available
        return result

    def _read_direct(self, size):
        """Read using block-aligned LRU cache."""
        if size == 0:
            return b""

        result = b""
        remaining = size
        while remaining > 0:
            block_index = self._position // self._buffer_size
            block_start = block_index * self._buffer_size
            block_data = self._cache_get(block_index)

            offset = self._position - block_start
            available = min(remaining, len(block_data) - offset)
            result += block_data[offset : offset + available]
            self._position += available
            remaining -= available

        return result

    def _cache_get(self, block_index):
        """Get a block from the LRU cache, fetching from S3 if missing."""
        if block_index in self._cache:
            self._cache_hit_count += 1
            self._cache.move_to_end(block_index)
            return self._cache[block_index]

        # Fetch block from S3
        block_start = block_index * self._buffer_size
        block_end = min(block_start + self._buffer_size - 1, self._size - 1)
        self._fetch_count += 1
        logger.debug(
            "S3SeekableReader fetch #%d: block %d (bytes %d-%d, %d KB)",
            self._fetch_count,
            block_index,
            block_start,
            block_end,
            (block_end - block_start + 1) // 1024,
        )
        range_header = f"bytes={block_start}-{block_end}"
        response = self._s3_client.get_object(
            Bucket=self._bucket, Key=self._key, Range=range_header
        )
        data = response["Body"].read()

        # Store in cache, evict oldest if full
        self._cache[block_index] = data
        self._cache.move_to_end(block_index)
        if len(self._cache) > self._buffer_count:
            self._cache.popitem(last=False)

        return data

    def _fill_buffer(self, position):
        """Fetch a buffer_size chunk from S3 around the given position."""
        if self._buffer_strategy == BUFFER_CENTERED:
            half = self._buffer_size // 2
            start = max(0, position - half)
        else:
            start = position
        end = min(start + self._buffer_size - 1, self._size - 1)
        self._fetch_count += 1
        logger.info(
            "S3SeekableReader fetch #%d: bytes %d-%d (%d MB) for read at position %d",
            self._fetch_count,
            start,
            end,
            (end - start + 1) // (1024 * 1024),
            position,
        )
        range_header = f"bytes={start}-{end}"
        response = self._s3_client.get_object(
            Bucket=self._bucket, Key=self._key, Range=range_header
        )
        self._buffer = response["Body"].read()
        self._buffer_start = start

    def seek(self, offset, whence=io.SEEK_SET):
        """Seek to a position in the file."""
        if whence == io.SEEK_SET:
            self._position = offset
        elif whence == io.SEEK_CUR:
            self._position += offset
        elif whence == io.SEEK_END:
            self._position = self._size + offset
        else:
            raise ValueError(f"Invalid whence value: {whence}")

        self._position = max(0, min(self._position, self._size))
        return self._position

    def tell(self):
        """Return the current position."""
        return self._position

    def seekable(self):
        """Return True - this object supports seeking."""
        return True

    def readable(self):
        """Return True - this object supports reading."""
        return True

    def get_size(self):
        """Return the total size of the S3 object. Used by pypff."""
        return self._size

    def close(self):
        """Release the buffer memory."""
        if self._cache:
            total = self._fetch_count + self._cache_hit_count
            logger.info(
                "S3SeekableReader closed: %d fetches, %d cache hits (%d%% hit rate)",
                self._fetch_count,
                self._cache_hit_count,
                (self._cache_hit_count * 100 // total) if total else 0,
            )
        self._cache.clear()
        self._buffer = b""

    def __enter__(self):
        return self

    def __exit__(self, *args):
        self.close()
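The reader only depends on two S3 operations: `head_object` for the size and `get_object` with an HTTP `Range` header. A minimal in-memory stand-in (all names here are illustrative, not part of the real module) shows that protocol plus the block-aligned LRU math used by the `BUFFER_NONE` strategy:

```python
from collections import OrderedDict


class FakeRangeStore:
    """In-memory stand-in for the two S3 calls the reader depends on."""

    def __init__(self, data: bytes):
        self.data = data
        self.fetches = 0

    def head(self) -> int:
        return len(self.data)  # like HeadObject's ContentLength

    def get_range(self, start: int, end: int) -> bytes:
        self.fetches += 1  # inclusive range, like an HTTP Range request
        return self.data[start : end + 1]


def read_with_block_cache(store, pos, size, block_size, cache, max_blocks):
    """Serve a read from block-aligned chunks, LRU-evicting beyond max_blocks."""
    out = b""
    while size > 0:
        block = pos // block_size
        if block in cache:
            cache.move_to_end(block)  # mark as most recently used
        else:
            start = block * block_size
            end = min(start + block_size - 1, store.head() - 1)
            cache[block] = store.get_range(start, end)
            if len(cache) > max_blocks:
                cache.popitem(last=False)  # evict least recently used
        offset = pos - block * block_size
        chunk = cache[block][offset : offset + size]
        out += chunk
        pos += len(chunk)
        size -= len(chunk)
    return out


store = FakeRangeStore(bytes(range(256)) * 64)  # 16 KB object
cache = OrderedDict()
first = read_with_block_cache(store, 100, 10, 4096, cache, 2)
second = read_with_block_cache(store, 100, 10, 4096, cache, 2)  # cache hit, no fetch
```

A read that straddles a block boundary transparently touches two cached blocks, which is exactly the case pypff's B-tree traversal hits constantly.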
@@ -1,4 +1,4 @@
-"""Service layer for importing messages via EML, MBOX, or IMAP."""
+"""Service layer for importing messages via EML, MBOX, PST, or IMAP."""
 
 import logging
 from typing import Any, Dict, Optional, Tuple
@@ -7,15 +7,17 @@ from django.contrib import messages
 from django.core.files.storage import storages
 from django.http import HttpRequest
 
+import magic
+from sentry_sdk import capture_exception
 
 from core import enums
 from core.api.viewsets.task import register_task_owner
 from core.models import Mailbox
 
-from .tasks import (
-    import_imap_messages_task,
-    process_eml_file_task,
-    process_mbox_file_task,
-)
+from .eml_tasks import process_eml_file_task
+from .imap_tasks import import_imap_messages_task
+from .mbox_tasks import process_mbox_file_task
+from .pst_tasks import process_pst_file_task
 
 logger = logging.getLogger(__name__)
@@ -29,14 +31,16 @@ class ImportService:
         recipient: Mailbox,
         user: Any,
         request: Optional[HttpRequest] = None,
+        filename: Optional[str] = None,
     ) -> Tuple[bool, Dict[str, Any]]:
-        """Import messages from an EML or MBOX file.
+        """Import messages from an EML, MBOX, or PST file.
 
         Args:
-            file: The uploaded file (EML or MBOX)
+            file_key: The storage key of the uploaded file
             recipient: The recipient mailbox
             user: The user performing the import
             request: Optional HTTP request for admin messages
+            filename: Original filename for MIME type disambiguation
 
         Returns:
             Tuple of (success, response_data)
@@ -55,25 +59,50 @@ class ImportService:
         if not message_imports_storage.exists(file_key):
             return False, {"detail": "File not found."}
 
-        # We retrieve the content type from the file metadata as we need to make a quick check
-        # but this is not guaranteed to be correct so we have to check the file content again in the task
+        # Detect content type from actual file bytes using python-magic
         s3_client = message_imports_storage.connection.meta.client
-        unsafe_content_type = s3_client.head_object(
-            Bucket=message_imports_storage.bucket_name, Key=file_key
-        ).get("ContentType")
+        head = s3_client.get_object(
+            Bucket=message_imports_storage.bucket_name,
+            Key=file_key,
+            Range="bytes=0-2047",
+        )["Body"].read()
+        content_type = magic.from_buffer(head, mime=True)
 
-        if unsafe_content_type not in enums.ARCHIVE_SUPPORTED_MIME_TYPES:
+        # Disambiguate ambiguous MIME types using filename extension
+        if content_type in ("text/plain", "application/octet-stream") and filename:
+            ext = filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
+            extension_map = {
+                "eml": "message/rfc822",
+                "mbox": "application/mbox",
+                "pst": "application/vnd.ms-outlook",
+            }
+            if ext in extension_map:
+                content_type = extension_map[ext]
+
+        if content_type not in enums.ARCHIVE_SUPPORTED_MIME_TYPES:
             return False, {
                 "detail": (
-                    "Invalid file format. Only EML (message/rfc822) and MBOX "
-                    "(application/octet-stream, application/mbox, or text/plain) files are supported. "
+                    "Invalid file format. Only EML, MBOX, "
+                    "and PST files are supported. "
                     "Detected content type: {content_type}"
-                ).format(content_type=unsafe_content_type)
+                ).format(content_type=content_type)
             }
 
         try:
+            # Check MIME type for PST
+            if content_type in enums.PST_SUPPORTED_MIME_TYPES:
+                task = process_pst_file_task.delay(file_key, str(recipient.id))
+                register_task_owner(task.id, user.id)
+                response_data = {"task_id": task.id, "type": "pst"}
+                if request:
+                    messages.info(
+                        request,
+                        f"Started processing PST file for recipient {recipient}. "
+                        "This may take a while. You can check the status in the Celery task monitor.",
+                    )
+                return True, response_data
             # Check MIME type for MBOX
-            if unsafe_content_type in enums.MBOX_SUPPORTED_MIME_TYPES:
+            elif content_type in enums.MBOX_SUPPORTED_MIME_TYPES:
                 # Process MBOX file asynchronously
                 task = process_mbox_file_task.delay(file_key, str(recipient.id))
                 register_task_owner(task.id, user.id)
@@ -86,7 +115,7 @@ class ImportService:
                 )
                 return True, response_data
             # Check MIME type for EML
-            elif unsafe_content_type in enums.EML_SUPPORTED_MIME_TYPES:
+            elif content_type in enums.EML_SUPPORTED_MIME_TYPES:
                 # Process EML file asynchronously
                 task = process_eml_file_task.delay(file_key, str(recipient.id))
                 register_task_owner(task.id, user.id)
@@ -98,12 +127,15 @@ class ImportService:
                     "This may take a while. You can check the status in the Celery task monitor.",
                 )
                 return True, response_data
+            else:
+                return False, {"detail": f"Unsupported file format: {content_type}"}
         except Exception as e:
+            capture_exception(e)
             logger.exception("Error processing file: %s", e)
             if request:
-                messages.error(request, f"Error processing file: {str(e)}")
+                messages.error(request, "Error processing file.")
 
-            return False, {"detail": str(e)}
+            return False, {"detail": "An error occurred while processing the file."}
 
     @staticmethod
     def import_imap(
@@ -161,7 +193,10 @@ class ImportService:
             return True, response_data
 
         except Exception as e:
+            capture_exception(e)
             logger.exception("Error starting IMAP import: %s", e)
             if request:
-                messages.error(request, f"Error starting IMAP import: {str(e)}")
-            return False, {"detail": str(e)}
+                messages.error(request, "Error starting IMAP import.")
+            return False, {
+                "detail": "An error occurred while starting the IMAP import."
+            }
|
||||
@@ -1,539 +0,0 @@
"""Import-related tasks."""

# pylint: disable=unused-argument, broad-exception-raised, broad-exception-caught, too-many-lines
from typing import Any, Dict, Generator

from django.core.files.storage import storages

import magic
from celery.utils.log import get_task_logger

from core import enums
from core.mda.inbound import deliver_inbound_message
from core.mda.rfc5322 import parse_email_message
from core.models import Mailbox

from messages.celery_app import app as celery_app

from .imap import (
    IMAPConnectionManager,
    create_folder_mapping,
    get_message_numbers,
    get_selectable_folders,
    process_folder_messages,
    select_imap_folder,
)

logger = get_task_logger(__name__)


@celery_app.task(bind=True)
def process_mbox_file_task(self, file_key: str, recipient_id: str) -> Dict[str, Any]:
    """
    Process a MBOX file asynchronously.

    Args:
        file_key: The storage key of the MBOX file
        recipient_id: The UUID of the recipient mailbox

    Returns:
        Dict with task status and result
    """
    success_count = 0
    failure_count = 0
    total_messages = 0
    current_message = 0

    try:
        recipient = Mailbox.objects.get(id=recipient_id)
    except Mailbox.DoesNotExist:
        error_msg = f"Recipient mailbox {recipient_id} not found"
        result = {
            "message_status": "Failed to process messages",
            "total_messages": 0,
            "success_count": 0,
            "failure_count": 0,
            "type": "mbox",
            "current_message": 0,
        }
        self.update_state(
            state="FAILURE",
            meta={
                "result": result,
                "error": error_msg,
            },
        )
        return {
            "status": "FAILURE",
            "result": result,
            "error": error_msg,
        }

    try:
        # Get storage and open file
        message_imports_storage = storages["message-imports"]

        with message_imports_storage.open(file_key, "rb") as file:
            self.update_state(
                state="PROGRESS",
                meta={
                    "result": {
                        "message_status": "Initializing import",
                        "type": "mbox",
                    },
                    "error": None,
                },
            )
            # Ensure file is a valid mbox file
            content_type = magic.from_buffer(file.read(2048), mime=True)
            if content_type not in enums.MBOX_SUPPORTED_MIME_TYPES:
                raise Exception(f"Expected MBOX file, got {content_type}")

            # First pass: scan for message positions (also gives us the count)
            file.seek(0)
            message_positions, file_end = scan_mbox_messages(file)
            total_messages = len(message_positions)

            # Second pass: process messages using pre-computed positions
            for i, message_content in enumerate(
                stream_mbox_messages(file, message_positions, file_end), 1
            ):
                current_message = i
                try:
                    # Update task state with progress
                    result = {
                        "message_status": f"Processing message {i} of {total_messages}",
                        "total_messages": total_messages,
                        "success_count": success_count,
                        "failure_count": failure_count,
                        "type": "mbox",
                        "current_message": i,
                    }
                    self.update_state(
                        state="PROGRESS",
                        meta={
                            "result": result,
                            "error": None,
                        },
                    )

                    # Parse the email message
                    parsed_email = parse_email_message(message_content)
                    # Deliver the message
                    if deliver_inbound_message(
                        str(recipient), parsed_email, message_content, is_import=True
                    ):
                        success_count += 1
                    else:
                        failure_count += 1
                except Exception as e:  # pylint: disable=broad-exception-caught
                    logger.exception(
                        "Error processing message from mbox file for recipient %s: %s",
                        recipient_id,
                        e,
                    )
                    failure_count += 1

        result = {
            "message_status": "Completed processing messages",
            "total_messages": total_messages,
            "success_count": success_count,
            "failure_count": failure_count,
            "type": "mbox",
            "current_message": current_message,
        }

        self.update_state(
            state="SUCCESS",
            meta={
                "result": result,
                "error": None,
            },
        )

        return {
            "status": "SUCCESS",
            "result": result,
            "error": None,
        }

    except Exception as e:
        logger.exception(
            "Error processing MBOX file for recipient %s: %s",
            recipient_id,
            e,
        )
        error_msg = str(e)
        result = {
            "message_status": "Failed to process messages",
            "total_messages": total_messages,
            "success_count": success_count,
            "failure_count": failure_count,
            "type": "mbox",
            "current_message": current_message,
        }
        self.update_state(
            state="FAILURE",
            meta={
                "result": result,
                "error": error_msg,
            },
        )
        return {
            "status": "FAILURE",
            "result": result,
            "error": error_msg,
        }


def scan_mbox_messages(file) -> tuple[list[int], int]:
    """
    Scan an MBOX file and return message positions without loading content into memory.

    This function performs a single pass through the file to record the byte offset
    where each message starts. The count of messages is simply len(positions).

    Args:
        file: File-like object to read from

    Returns:
        A tuple of (message_positions, file_end) where:
        - message_positions: list of byte offsets where each message starts
        - file_end: byte offset of end of file (needed to compute last message size)
    """
    message_positions = []
    position = 0

    for line in file:
        if line.startswith(b"From "):
            message_positions.append(position)
        position += len(line)

    return message_positions, position


def stream_mbox_messages(
    file, message_positions: list[int], file_end: int | None = None
) -> Generator[bytes, None, None]:
    """
    Stream individual email messages from an MBOX file without loading everything into memory.

    Yields messages in reverse order (oldest first) for proper reply threading,
    since mbox files store messages with the most recent first.

    Args:
        file: File-like object to read from (must support seek)
        message_positions: Pre-computed list of byte offsets where messages start.
        file_end: Byte offset of end of file.

    Yields:
        Individual email messages as bytes
    """
    if message_positions is None or file_end is None:
        logger.warning(
            "Cannot stream MBOX messages: message positions or file end not provided"
        )
        return

    # Read messages in reverse order for chronological processing
    # Process from last message to first (oldest to newest in real time)
    for i in range(len(message_positions) - 1, -1, -1):
        start_pos = message_positions[i]
        # End position is either the next message start or end of file
        end_pos = (
            message_positions[i + 1] if i + 1 < len(message_positions) else file_end
        )

        # Seek to message start
        file.seek(start_pos)

        # Read the message content (excluding the "From " line)
        first_line = file.readline()  # Skip the "From " separator line
        content_start = start_pos + len(first_line)
        content_length = end_pos - content_start

        # Read just this message's content
        message_content = file.read(content_length)
        yield message_content


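The two-pass design above (first record the byte offset of each `From ` separator, then stream byte slices in reverse) can be exercised standalone. This is a simplified sketch of the same logic over an in-memory file, not the module itself:

```python
from io import BytesIO


def scan_positions(f):
    # Pass 1: record the byte offset of every "From " separator line.
    positions, pos = [], 0
    for line in f:
        if line.startswith(b"From "):
            positions.append(pos)
        pos += len(line)
    return positions, pos  # pos is now the end-of-file offset


def stream_reversed(f, positions, end):
    # Pass 2: yield each message's bytes, last separator first.
    for i in range(len(positions) - 1, -1, -1):
        start = positions[i]
        stop = positions[i + 1] if i + 1 < len(positions) else end
        f.seek(start)
        sep = f.readline()  # skip the "From " separator line itself
        yield f.read(stop - (start + len(sep)))


mbox = BytesIO(
    b"From a@x Mon Jan 1 00:00:00 2024\nSubject: two\n\nnewest\n"
    b"From b@x Mon Jan 1 00:00:00 2024\nSubject: one\n\noldest\n"
)
positions, end = scan_positions(mbox)
msgs = list(stream_reversed(mbox, positions, end))
```

Note that, like the scanner above, this keys on lines beginning with `From `; conforming mbox writers escape such lines inside message bodies as `>From `, so the heuristic holds for well-formed files.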
@celery_app.task(bind=True)
def import_imap_messages_task(
    self,
    imap_server: str,
    imap_port: int,
    username: str,
    password: str,
    use_ssl: bool,
    recipient_id: str,
) -> Dict[str, Any]:
    """Import messages from an IMAP server.

    Args:
        imap_server: IMAP server hostname
        imap_port: IMAP server port
        username: Email address for login
        password: Password for login
        use_ssl: Whether to use SSL
        recipient_id: ID of the recipient mailbox

    Returns:
        Dict with task status and result
    """
    success_count = 0
    failure_count = 0
    total_messages = 0
    current_message = 0

    try:
        # Get recipient mailbox
        recipient = Mailbox.objects.get(id=recipient_id)

        # Connect to IMAP server using context manager
        with IMAPConnectionManager(
            imap_server, imap_port, username, password, use_ssl
        ) as imap:
            # Get selectable folders
            selectable_folders = get_selectable_folders(imap, username, imap_server)

            # Process all folders
            folders_to_process = selectable_folders

            # Create folder mapping
            folder_mapping = create_folder_mapping(
                selectable_folders, username, imap_server
            )

            # Calculate total messages across all folders
            for folder_name in folders_to_process:
                if select_imap_folder(imap, folder_name):
                    message_list = get_message_numbers(
                        imap, folder_name, username, imap_server
                    )
                    total_messages += len(message_list)

            # Process each folder

            for folder_to_process in folders_to_process:
                display_name = folder_mapping.get(folder_to_process, folder_to_process)

                # Select folder
                if not select_imap_folder(imap, folder_to_process):
                    logger.warning(
                        "Skipping folder %s - could not select it", folder_to_process
                    )
                    continue

                # Get message numbers
                message_list = get_message_numbers(
                    imap, folder_to_process, username, imap_server
                )
                if not message_list:
                    logger.info("No messages found in folder %s", folder_to_process)
                    continue

                # Process messages in this folder
                success_count, failure_count, current_message = process_folder_messages(
                    imap_connection=imap,
                    folder=folder_to_process,
                    display_name=display_name,
                    message_list=message_list,
                    recipient=recipient,
                    username=username,
                    task_instance=self,
                    success_count=success_count,
                    failure_count=failure_count,
                    current_message=current_message,
                    total_messages=total_messages,
                )

        # Determine appropriate message status
        if len(folders_to_process) == 1:
            # If only one folder was processed, show which folder it was
            actual_folder = folders_to_process[0]
            message_status = (
                f"Completed processing messages from folder '{actual_folder}'"
            )
        else:
            message_status = "Completed processing messages from all folders"

        result = {
            "message_status": message_status,
            "total_messages": total_messages,
            "success_count": success_count,
            "failure_count": failure_count,
            "type": "imap",
            "current_message": current_message,
        }

        self.update_state(
            state="SUCCESS",
            meta={"status": "SUCCESS", "result": result, "error": None},
        )

        return {"status": "SUCCESS", "result": result, "error": None}

    except Mailbox.DoesNotExist:
        error_msg = f"Recipient mailbox {recipient_id} not found"
        result = {
            "message_status": "Failed to process messages",
            "total_messages": 0,
            "success_count": 0,
            "failure_count": 0,
            "type": "imap",
            "current_message": 0,
        }
        self.update_state(state="FAILURE", meta={"result": result, "error": error_msg})
        return {"status": "FAILURE", "result": result, "error": error_msg}

    except Exception as e:
        logger.exception("Error in import_imap_messages_task: %s", e)

        error_msg = str(e)
        result = {
            "message_status": "Failed to process messages",
            "total_messages": total_messages,
            "success_count": success_count,
            "failure_count": failure_count,
            "type": "imap",
            "current_message": current_message,
        }
        self.update_state(state="FAILURE", meta={"result": result, "error": error_msg})
        return {"status": "FAILURE", "result": result, "error": error_msg}


@celery_app.task(bind=True)
def process_eml_file_task(self, file_key: str, recipient_id: str) -> Dict[str, Any]:
    """
    Process an EML file asynchronously.

    Args:
        file_key: The storage key of the EML file
        recipient_id: The UUID of the recipient mailbox

    Returns:
        Dict with task status and result
    """
    try:
        recipient = Mailbox.objects.get(id=recipient_id)
    except Mailbox.DoesNotExist:
        error_msg = f"Recipient mailbox {recipient_id} not found"
        result = {
            "message_status": "Failed to process message",
            "total_messages": 1,
            "success_count": 0,
            "failure_count": 0,
            "type": "eml",
            "current_message": 0,
        }
        self.update_state(
            state="FAILURE",
            meta={
                "result": result,
                "error": error_msg,
            },
        )
        return {
            "result": result,
            "error": error_msg,
        }

    try:
        # Update progress state
        progress_result = {
            "message_status": "Processing message 1 of 1",
            "total_messages": 1,
            "success_count": 0,
            "failure_count": 0,
            "type": "eml",
            "current_message": 1,
        }
        self.update_state(
            state="PROGRESS",
            meta={
                "result": progress_result,
                "error": None,
            },
        )

        # Get storage and read file
        message_imports_storage = storages["message-imports"]
        with message_imports_storage.open(file_key, "rb") as file:
            content_type = magic.from_buffer(file.read(2048), mime=True)
            if content_type not in enums.EML_SUPPORTED_MIME_TYPES:
                raise Exception(f"Expected EML file, got {content_type}")

            file.seek(0)
            file_content = file.read()

        # Parse the email message
        parsed_email = parse_email_message(file_content)
        # Deliver the message
        success = deliver_inbound_message(
            str(recipient), parsed_email, file_content, is_import=True
        )

        result = {
            "message_status": "Completed processing message",
            "total_messages": 1,
            "success_count": 1 if success else 0,
            "failure_count": 0 if success else 1,
            "type": "eml",
            "current_message": 1,
        }

        if success:
            self.update_state(
                state="SUCCESS",
                meta={
                    "result": result,
                    "error": None,
                },
            )
            return {
                "status": "SUCCESS",
                "result": result,
                "error": None,
            }

        error_msg = "Failed to deliver message"
        self.update_state(
            state="FAILURE",
            meta={
                "result": result,
                "error": error_msg,
            },
        )
        return {
            "status": "FAILURE",
            "result": result,
            "error": error_msg,
        }

    except Exception as e:
        logger.exception(
            "Error processing EML file for recipient %s: %s",
            recipient_id,
            e,
        )
        error_msg = str(e)
        result = {
            "message_status": "Failed to process message",
            "total_messages": 1,
            "success_count": 0,
            "failure_count": 1,
            "type": "eml",
            "current_message": 1,
        }
        self.update_state(
            state="FAILURE",
            meta={
                "result": result,
                "error": error_msg,
            },
        )
        return {
            "status": "FAILURE",
            "result": result,
            "error": error_msg,
        }

@@ -4,5 +4,8 @@
from core.mda.inbound_tasks import *  # noqa: F403
from core.mda.outbound_tasks import *  # noqa: F403
from core.services.dns.tasks import *  # noqa: F403
from core.services.importer.tasks import *  # noqa: F403
from core.services.importer.eml_tasks import *  # noqa: F403
from core.services.importer.imap_tasks import *  # noqa: F403
from core.services.importer.mbox_tasks import *  # noqa: F403
from core.services.importer.pst_tasks import *  # noqa: F403
from core.services.search.tasks import *  # noqa: F403

@@ -100,8 +100,13 @@ class TestImportViewSetPermissions:
        with mock.patch("core.services.importer.service.storages") as mock_storages:
            mock_storage = mock.MagicMock()
            mock_storage.exists.return_value = True
            mock_storage.connection.meta.client.head_object.return_value = {
                "ContentType": "message/rfc822"
            # Mock get_object to return EML-like content for magic detection
            eml_body = mock.MagicMock()
            eml_body.read.return_value = (
                b"From: test@example.com\r\nSubject: Test\r\n\r\n"
            )
            mock_storage.connection.meta.client.get_object.return_value = {
                "Body": eml_body,
            }
            mock_storages.__getitem__.return_value = mock_storage

@@ -187,7 +192,7 @@ class TestMessagesArchiveUploadViewSet:
        response = api_client.post(url, data, format="json")
        assert response.status_code == status.HTTP_400_BAD_REQUEST
        assert response.data["content_type"] == [
            "Only EML and MBOX files are supported."
            "Only EML, MBOX, and PST files are supported."
        ]

    def test_api_messages_archive_create_upload_missing_filename(self, api_client):

@@ -14,7 +14,8 @@ from core import factories
from core.api.utils import get_file_key
from core.enums import MailboxRoleChoices, MessageDeliveryStatusChoices
from core.models import Mailbox, MailDomain, Message, Thread
from core.services.importer.tasks import process_eml_file_task, process_mbox_file_task
from core.services.importer.eml_tasks import process_eml_file_task
from core.services.importer.mbox_tasks import process_mbox_file_task

pytestmark = pytest.mark.django_db

@@ -186,7 +187,7 @@ def test_api_import_mbox_async(api_client, user, mailbox, mbox_file):
    # add access to mailbox
    mailbox.accesses.create(user=user, role=MailboxRoleChoices.ADMIN)
    with patch(
        "core.services.importer.tasks.process_mbox_file_task.delay"
        "core.services.importer.mbox_tasks.process_mbox_file_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id"
        mock_task.return_value.status = "PENDING"
@@ -201,6 +202,78 @@ def test_api_import_mbox_async(api_client, user, mailbox, mbox_file):
        assert mock_task.call_args[0][1] == str(mailbox.id)


def test_api_import_pst_file_async(api_client, user, mailbox):
    """Test import of PST file via API dispatches the PST task."""
    mailbox.accesses.create(user=user, role=MailboxRoleChoices.ADMIN)

    # PST magic bytes: '!BDN' signature
    pst_body = b"\x21\x42\x44\x4e" + b"\x00" * 100

    # Create a fake PST file in S3
    storage = storages["message-imports"]
    s3_client = storage.connection.meta.client
    file_key = get_file_key(user.id, "test.pst")
    s3_client.put_object(
        Bucket=storage.bucket_name,
        Key=file_key,
        Body=pst_body,
        ContentType="application/vnd.ms-outlook",
    )

    try:
        with patch(
            "core.services.importer.pst_tasks.process_pst_file_task.delay"
        ) as mock_task:
            mock_task.return_value.id = "fake-pst-task-id"
            mock_task.return_value.status = "PENDING"
            response = api_client.post(
                IMPORT_FILE_URL,
                {"filename": "test.pst", "recipient": str(mailbox.id)},
                format="multipart",
            )
            assert response.status_code == 202
            assert response.data["type"] == "pst"
            assert mock_task.call_count == 1
            assert mock_task.call_args[0][1] == str(mailbox.id)
    finally:
        s3_client.delete_object(Bucket=storage.bucket_name, Key=file_key)


def test_api_import_pst_autodetect(api_client, user, mailbox):
    """Test that PST files are autodetected by magic bytes regardless of S3 content type."""
    mailbox.accesses.create(user=user, role=MailboxRoleChoices.ADMIN)

    # PST magic bytes: '!BDN' signature, uploaded with generic content type
    pst_body = b"\x21\x42\x44\x4e" + b"\x00" * 100

    storage = storages["message-imports"]
    s3_client = storage.connection.meta.client
    file_key = get_file_key(user.id, "test.pst")
    s3_client.put_object(
        Bucket=storage.bucket_name,
        Key=file_key,
        Body=pst_body,
        ContentType="application/octet-stream",
    )

    try:
        with patch(
            "core.services.importer.pst_tasks.process_pst_file_task.delay"
        ) as mock_task:
            mock_task.return_value.id = "fake-pst-task-id"
            mock_task.return_value.status = "PENDING"
            response = api_client.post(
                IMPORT_FILE_URL,
                {"filename": "test.pst", "recipient": str(mailbox.id)},
                format="multipart",
            )
            assert response.status_code == 202
            assert response.data["type"] == "pst"
            mock_task.assert_called_once()
    finally:
        s3_client.delete_object(Bucket=storage.bucket_name, Key=file_key)


def test_api_import_mailbox_no_access(api_client, domain, eml_file):
    """Test import of EML file without access to mailbox."""
    # Create a mailbox the user does NOT have access to

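The `!BDN` signature used by these tests is the 4-byte magic number at the start of PST/OST files, which is what magic-based content detection keys on. A minimal sniff check looks like:

```python
PST_MAGIC = b"\x21\x42\x44\x4e"  # ASCII "!BDN", the PST/OST header signature


def looks_like_pst(header: bytes) -> bool:
    # Compare only the first four bytes, independent of any declared
    # content type (which is why the autodetect test uploads with
    # "application/octet-stream" and still expects PST handling).
    return header[:4] == PST_MAGIC
```

`looks_like_pst` is an illustrative helper, not an identifier from the codebase; the real service relies on libmagic for the same decision.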
@@ -218,7 +291,7 @@ def test_api_import_imap_task(api_client, user, mailbox):
    """Test import of IMAP messages."""
    mailbox.accesses.create(user=user, role=MailboxRoleChoices.ADMIN)
    with patch(
        "core.services.importer.tasks.import_imap_messages_task.delay"
        "core.services.importer.imap_tasks.import_imap_messages_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id"
        data = {
@@ -336,7 +409,9 @@ def test_api_import_duplicate_eml_file(api_client, user, mailbox, eml_file):
    assert Thread.objects.count() == 0

    # First import
    with patch("core.services.importer.tasks.process_eml_file_task.delay") as mock_task:
    with patch(
        "core.services.importer.eml_tasks.process_eml_file_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id-1"
        response = api_client.post(
            IMPORT_FILE_URL,
@@ -361,7 +436,9 @@ def test_api_import_duplicate_eml_file(api_client, user, mailbox, eml_file):
    assert Thread.objects.count() == 1

    # Import again the same file
    with patch("core.services.importer.tasks.process_eml_file_task.delay") as mock_task:
    with patch(
        "core.services.importer.eml_tasks.process_eml_file_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id-2"
        response = api_client.post(
            IMPORT_FILE_URL,
@@ -396,7 +473,7 @@ def test_api_import_duplicate_mbox_file(api_client, user, mailbox, mbox_file):

    # First import
    with patch(
        "core.services.importer.tasks.process_mbox_file_task.delay"
        "core.services.importer.mbox_tasks.process_mbox_file_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id-1"
        response = api_client.post(
@@ -426,7 +503,7 @@ def test_api_import_duplicate_mbox_file(api_client, user, mailbox, mbox_file):

    # Second import of the same file
    with patch(
        "core.services.importer.tasks.process_mbox_file_task.delay"
        "core.services.importer.mbox_tasks.process_mbox_file_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id-2"
        response = api_client.post(
@@ -467,7 +544,9 @@ def test_api_import_eml_same_message_different_mailboxes(api_client, user, eml_f
    assert Message.objects.count() == 0

    # Import to first mailbox
    with patch("core.services.importer.tasks.process_eml_file_task.delay") as mock_task:
    with patch(
        "core.services.importer.eml_tasks.process_eml_file_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id-1"
        response = api_client.post(
            IMPORT_FILE_URL,
@@ -491,7 +570,9 @@ def test_api_import_eml_same_message_different_mailboxes(api_client, user, eml_f
    assert Message.objects.count() == 1

    # Import to second mailbox
    with patch("core.services.importer.tasks.process_eml_file_task.delay") as mock_task:
    with patch(
        "core.services.importer.eml_tasks.process_eml_file_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id-2"
        response = api_client.post(
            IMPORT_FILE_URL,
@@ -539,7 +620,7 @@ def test_api_import_mbox_same_message_different_mailboxes(api_client, user, mbox

    # Import to first mailbox
    with patch(
        "core.services.importer.tasks.process_mbox_file_task.delay"
        "core.services.importer.mbox_tasks.process_mbox_file_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id-1"
        response = api_client.post(
@@ -565,7 +646,7 @@ def test_api_import_mbox_same_message_different_mailboxes(api_client, user, mbox

    # Import to second mailbox
    with patch(
        "core.services.importer.tasks.process_mbox_file_task.delay"
        "core.services.importer.mbox_tasks.process_mbox_file_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id-2"
        response = api_client.post(

@@ -3,8 +3,13 @@

import datetime
from io import BytesIO
from unittest.mock import MagicMock, Mock, patch
from unittest.mock import (
    MagicMock,
    Mock,
    patch,
)

from django.core.files.storage import storages
from django.core.files.uploadedfile import SimpleUploadedFile
from django.urls import reverse

@@ -12,7 +17,8 @@ import pytest

from core import factories
from core.models import Mailbox, MailDomain, Message, Thread
from core.services.importer.tasks import process_eml_file_task, process_mbox_file_task
from core.services.importer.eml_tasks import process_eml_file_task
from core.services.importer.mbox_tasks import process_mbox_file_task


def mock_storage_open(content: bytes):
@@ -107,7 +113,9 @@ def test_import_eml_file(admin_client, eml_file, mailbox):
    mock_task.update_state = MagicMock()

    with (
        patch("core.services.importer.tasks.process_eml_file_task.delay") as mock_delay,
        patch(
            "core.services.importer.eml_tasks.process_eml_file_task.delay"
        ) as mock_delay,
        patch.object(process_eml_file_task, "update_state", mock_task.update_state),
    ):
        mock_delay.return_value.id = "fake-task-id"
@@ -127,7 +135,7 @@ def test_import_eml_file(admin_client, eml_file, mailbox):
    # Mock storage for running task synchronously
    mock_storage = mock_storage_open(eml_file)

    with patch("core.services.importer.tasks.storages") as mock_storages:
    with patch("core.services.importer.eml_tasks.storages") as mock_storages:
        mock_storages.__getitem__.return_value = mock_storage
        # Run the task synchronously for testing
        task_result = process_eml_file_task(
@@ -143,10 +151,11 @@ def test_import_eml_file(admin_client, eml_file, mailbox):
    assert task_result["result"]["failure_count"] == 0
    assert task_result["result"]["current_message"] == 1

    # Verify progress updates were called correctly
    assert mock_task.update_state.call_count == 2
    # Verify only PROGRESS update_state was called (no SUCCESS —
    # Celery infers SUCCESS from normal return)
    assert mock_task.update_state.call_count == 1

    mock_task.update_state.assert_any_call(
    mock_task.update_state.assert_called_once_with(
        state="PROGRESS",
        meta={
            "result": {
@@ -160,13 +169,6 @@ def test_import_eml_file(admin_client, eml_file, mailbox):
            "error": None,
        },
    )
    mock_task.update_state.assert_called_with(
        state="SUCCESS",
        meta={
            "result": task_result["result"],
            "error": None,
        },
    )

    # check that the message was created
    assert Message.objects.count() == 1

@@ -181,96 +183,95 @@ def test_import_eml_file(admin_client, eml_file, mailbox):
|
||||
)
|
||||
|
||||
|
||||
def _upload_to_s3(content, file_key="test-mbox-key"):
|
||||
"""Upload content to the message-imports S3 bucket (real MinIO)."""
|
||||
storage = storages["message-imports"]
|
||||
s3_client = storage.connection.meta.client
|
||||
s3_client.put_object(
|
||||
Bucket=storage.bucket_name,
|
||||
Key=file_key,
|
||||
Body=content,
|
||||
)
|
||||
return file_key, storage, s3_client
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
def test_process_mbox_file_task(mailbox, mbox_file):
|
||||
"""Test the Celery task that processes MBOX files."""
|
||||
# Create a mock task instance
|
||||
mock_task = MagicMock()
|
||||
mock_task.update_state = MagicMock()
|
||||
file_key, storage, s3_client = _upload_to_s3(mbox_file)
|
||||
|
||||
# Mock storage
|
||||
mock_storage = mock_storage_open(mbox_file)
|
||||
try:
|
||||
mock_task = MagicMock()
|
||||
mock_task.update_state = MagicMock()
|
||||
|
||||
# Mock the task's update_state method to avoid database operations
|
||||
with (
|
||||
patch.object(process_mbox_file_task, "update_state", mock_task.update_state),
|
||||
patch("core.services.importer.tasks.storages") as mock_storages,
|
||||
):
|
||||
mock_storages.__getitem__.return_value = mock_storage
|
||||
# Run the task synchronously for testing
|
||||
task_result = process_mbox_file_task(
|
||||
file_key="test-file-key.mbox", recipient_id=str(mailbox.id)
|
||||
)
|
||||
assert task_result["status"] == "SUCCESS"
|
||||
assert (
|
||||
task_result["result"]["message_status"] == "Completed processing messages"
|
||||
)
|
||||
assert task_result["result"]["type"] == "mbox"
|
||||
assert (
|
||||
task_result["result"]["total_messages"] == 3
|
||||
) # Three messages in the test MBOX file
|
||||
assert task_result["result"]["success_count"] == 3
|
||||
assert task_result["result"]["failure_count"] == 0
|
||||
assert task_result["result"]["current_message"] == 3
|
||||
with patch.object(
|
||||
process_mbox_file_task, "update_state", mock_task.update_state
|
||||
):
|
||||
task_result = process_mbox_file_task(
|
||||
file_key=file_key, recipient_id=str(mailbox.id)
|
||||
)
|
||||
assert task_result["status"] == "SUCCESS"
|
||||
assert (
|
||||
task_result["result"]["message_status"]
|
||||
== "Completed processing messages"
|
||||
)
|
||||
assert task_result["result"]["type"] == "mbox"
|
||||
assert task_result["result"]["total_messages"] == 3
|
||||
assert task_result["result"]["success_count"] == 3
|
||||
assert task_result["result"]["failure_count"] == 0
|
||||
assert task_result["result"]["current_message"] == 3
|
||||
|
||||
# Verify progress updates were called correctly
|
||||
assert mock_task.update_state.call_count == 5 # 4 PROGRESS + 1 SUCCESS
|
||||
# 1 indexing + 3 per-message PROGRESS = 4
|
||||
assert mock_task.update_state.call_count == 4
|
||||
|
||||
-        # Verify progress updates
+        # Verify per-message progress updates
        for i in range(1, 4):
            mock_task.update_state.assert_any_call(
                state="PROGRESS",
                meta={
                    "result": {
                        "message_status": f"Processing message {i} of 3",
                        "total_messages": 3,
-                        "success_count": i - 1,  # Previous messages were successful
+                        "success_count": i - 1,
                        "failure_count": 0,
                        "type": "mbox",
                        "current_message": i,
                    },
                    "error": None,
                },
            )

        # Verify messages were created
        assert Message.objects.count() == 3
        messages = Message.objects.order_by("created_at")

-        # Verify success update
-        mock_task.update_state.assert_any_call(
-            state="SUCCESS",
-            meta={
-                "result": task_result["result"],
-                "error": None,
-            },
-        )
-
        # Check messages
        assert messages[0].subject == "Mon mail avec joli pj"
        assert messages[0].has_attachments is True

        assert messages[1].subject == "Je t'envoie encore un message..."
        body1 = messages[1].get_parsed_field("textBody")[0]["content"]
        assert "Lorem ipsum dolor sit amet" in body1

        assert messages[2].subject == "Re: Je t'envoie encore un message..."
        body2 = messages[2].get_parsed_field("textBody")[0]["content"]
        assert "Yes !" in body2
        assert "Lorem ipsum dolor sit amet" in body2

        # Check thread for each message
        assert messages[0].thread is not None
        assert messages[1].thread is not None
        assert messages[2].thread is not None
        assert messages[2].thread.messages.count() == 2
        assert messages[1].thread == messages[2].thread
        # Check created_at dates match between messages and threads
        assert messages[0].sent_at == messages[0].thread.messaged_at
        assert messages[2].sent_at == messages[1].thread.messaged_at
        assert messages[2].sent_at == (
            datetime.datetime(2025, 5, 26, 20, 18, 4, tzinfo=datetime.timezone.utc)
        )
    finally:
        s3_client.delete_object(Bucket=storage.bucket_name, Key=file_key)


def test_upload_mbox_file(admin_client, mailbox, mbox_file):
@@ -316,7 +317,7 @@ def test_import_form_invalid_file(admin_client, mailbox):
    # The form should still be displayed with an error
    assert "Import Messages" in response.content.decode()
    assert (
-        "File must be either an EML (.eml) or MBOX (.mbox) file"
+        "File must be an EML (.eml), MBOX (.mbox), or PST (.pst) file"
        in response.content.decode()
    )

@@ -389,7 +390,7 @@ This is a test message addressed to mailbox_b.
    # Import from mailbox_a
    with (
        patch.object(process_eml_file_task, "update_state", mock_task.update_state),
-        patch("core.services.importer.tasks.storages") as mock_storages,
+        patch("core.services.importer.eml_tasks.storages") as mock_storages,
    ):
        mock_storages.__getitem__.return_value = mock_storage
        # Run the task synchronously for testing, importing from mailbox_a
@@ -460,7 +461,7 @@ This is a test message sent from the mailbox.
    # Import the message
    with (
        patch.object(process_eml_file_task, "update_state", mock_task.update_state),
-        patch("core.services.importer.tasks.storages") as mock_storages,
+        patch("core.services.importer.eml_tasks.storages") as mock_storages,
    ):
        mock_storages.__getitem__.return_value = mock_storage
        # Run the task synchronously for testing

@@ -12,7 +12,7 @@ import pytest
from core import enums, factories
from core.forms import IMAPImportForm
from core.models import Mailbox, MailDomain, Message, Thread
-from core.services.importer.tasks import import_imap_messages_task
+from core.services.importer.imap_tasks import import_imap_messages_task

from messages.celery_app import app as celery_app

@@ -163,7 +163,7 @@ def test_imap_import_form_view(admin_client, mailbox):
    }

    with patch(
-        "core.services.importer.tasks.import_imap_messages_task.delay"
+        "core.services.importer.imap_tasks.import_imap_messages_task.delay"
    ) as mock_task:
        response = admin_client.post(url, form_data, follow=True)
        assert response.status_code == 200
@@ -212,7 +212,7 @@ def test_imap_import_task_success(
    assert task["result"]["current_message"] == 3

    # Verify progress updates were called correctly
-    assert mock_task.update_state.call_count == 4  # 3 PROGRESS + 1 SUCCESS
+    assert mock_task.update_state.call_count == 3  # 3 PROGRESS

    # Verify progress updates
    for i in range(1, 4):
@@ -231,11 +231,8 @@ def test_imap_import_task_success(
        },
    )

-    # Verify success update
-    mock_task.update_state.assert_any_call(
-        state="SUCCESS",
-        meta=task,
-    )
+    # No SUCCESS update_state — Celery infers SUCCESS from normal return;
+    # status is in the returned dict

    # Verify messages were created
    assert Message.objects.count() == 3
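The hunks above all switch the tests to the same convention: intermediate progress is pushed with `update_state`, and the final status travels in the task's return value, because Celery marks a task SUCCESS when it returns normally. A minimal sketch of that pattern, using a stand-in for the bound Celery task object (the names below are illustrative, not the project's actual task code):

```python
class FakeTask:
    """Stand-in for a bound Celery task; records update_state calls."""

    def __init__(self):
        self.state_calls = []

    def update_state(self, state, meta):
        self.state_calls.append((state, meta))


def process_messages(task, messages):
    """Report PROGRESS per message; the return value carries the outcome."""
    total = len(messages)
    success = 0
    for i, _msg in enumerate(messages, start=1):
        task.update_state(
            state="PROGRESS",
            meta={
                "result": {
                    "current_message": i,
                    "total_messages": total,
                    "success_count": success,
                },
                "error": None,
            },
        )
        success += 1  # pretend delivery succeeded
    # No explicit SUCCESS update_state: a normal return is enough.
    return {
        "status": "SUCCESS",
        "result": {"success_count": success, "total_messages": total},
    }


task = FakeTask()
result = process_messages(task, ["m1", "m2", "m3"])
```

With this shape, the tests only need `assert_not_called()` (or a PROGRESS-only call count) on `update_state`, plus assertions on the returned dict.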
@@ -295,15 +292,8 @@ def test_imap_import_task_login_failure(mailbox):
    assert task_result["result"]["current_message"] == 0
    assert "Login failed" in task_result["error"]

-    # Verify only failure update was called
-    assert mock_task.update_state.call_count == 1
-    mock_task.update_state.assert_called_once_with(
-        state="FAILURE",
-        meta={
-            "result": task_result["result"],
-            "error": task_result["error"],
-        },
-    )
+    # No update_state calls — failure status is in the returned dict
+    mock_task.update_state.assert_not_called()

    # Verify no messages were created
    assert Message.objects.count() == 0
@@ -360,7 +350,7 @@ def test_imap_import_task_message_fetch_failure(
    assert task["result"]["current_message"] == 3

    # Verify progress updates were called correctly
-    assert mock_task.update_state.call_count == 4  # 3 PROGRESS + 1 SUCCESS
+    assert mock_task.update_state.call_count == 3  # 3 PROGRESS

    # Verify progress updates
    for i in range(1, 4):
@@ -379,11 +369,8 @@ def test_imap_import_task_message_fetch_failure(
        },
    )

-    # Verify success update
-    mock_task.update_state.assert_any_call(
-        state="SUCCESS",
-        meta=task,
-    )
+    # No SUCCESS update_state — Celery infers SUCCESS from normal return;
+    # status is in the returned dict


@patch("core.mda.inbound.logger")

@@ -16,8 +16,8 @@ from core.api.utils import get_file_key
from core.enums import MailboxRoleChoices
from core.mda.inbound import deliver_inbound_message
from core.models import Mailbox, MailDomain, Message
+from core.services.importer.eml_tasks import process_eml_file_task
from core.services.importer.service import ImportService
-from core.services.importer.tasks import process_eml_file_task


@pytest.fixture
@@ -135,7 +135,9 @@ def mbox_key(user, mbox_file):
@pytest.mark.django_db
def test_import_file_eml_by_superuser(admin_user, mailbox, eml_key, mock_request):
    """Test successful EML file import for superuser."""
-    with patch("core.services.importer.tasks.process_eml_file_task.delay") as mock_task:
+    with patch(
+        "core.services.importer.eml_tasks.process_eml_file_task.delay"
+    ) as mock_task:
        mock_task.return_value.id = "fake-task-id"
        success, response_data = ImportService.import_file(
            file_key=eml_key,
@@ -194,9 +196,9 @@ def test_import_file_eml_by_superuser_sync(admin_user, mailbox, eml_key):
    assert task_result["result"]["current_message"] == 1
    assert task_result["error"] is None

-    # Verify progress updates
-    assert mock_task.update_state.call_count == 2  # 1 PROGRESS + 1 SUCCESS
-    mock_task.update_state.assert_any_call(
+    # Verify progress update (no SUCCESS update_state — Celery infers
+    # SUCCESS from normal return; status is in the returned dict)
+    mock_task.update_state.assert_called_once_with(
        state="PROGRESS",
        meta={
            "result": {
@@ -211,15 +213,6 @@ def test_import_file_eml_by_superuser_sync(admin_user, mailbox, eml_key):
        },
    )

-    # Verify success update
-    mock_task.update_state.assert_called_with(
-        state="SUCCESS",
-        meta={
-            "result": task_result["result"],
-            "error": None,
-        },
-    )
-
    # Verify message was created
    assert Message.objects.count() == 1
    message = Message.objects.first()
@@ -235,7 +228,9 @@ def test_import_file_eml_by_user_with_access_task(user, mailbox, eml_key, mock_r
    # Add access to mailbox
    mailbox.accesses.create(user=user, role=MailboxRoleChoices.ADMIN)

-    with patch("core.services.importer.tasks.process_eml_file_task.delay") as mock_task:
+    with patch(
+        "core.services.importer.eml_tasks.process_eml_file_task.delay"
+    ) as mock_task:
        mock_task.return_value.id = "fake-task-id"
        success, response_data = ImportService.import_file(
            file_key=eml_key,
@@ -297,9 +292,9 @@ def test_import_file_eml_by_user_with_access_sync(user, mailbox, eml_key, mock_r
    assert task_result["result"]["current_message"] == 1
    assert task_result["error"] is None

-    # Verify progress updates
-    assert mock_task.update_state.call_count == 2  # 1 PROGRESS + 1 SUCCESS
-    mock_task.update_state.assert_any_call(
+    # Verify progress update (no SUCCESS update_state — Celery infers
+    # SUCCESS from normal return; status is in the returned dict)
+    mock_task.update_state.assert_called_once_with(
        state="PROGRESS",
        meta={
            "result": {
@@ -314,15 +309,6 @@ def test_import_file_eml_by_user_with_access_sync(user, mailbox, eml_key, mock_r
        },
    )

-    # Verify success update
-    mock_task.update_state.assert_called_with(
-        state="SUCCESS",
-        meta={
-            "result": task_result["result"],
-            "error": None,
-        },
-    )
-
    # Verify message was created
    assert Message.objects.count() == 1
    message = Message.objects.first()
@@ -339,7 +325,7 @@ def test_import_file_mbox_by_superuser_task(
    """Test successful MBOX file import by superuser."""

    with patch(
-        "core.services.importer.tasks.process_mbox_file_task.delay"
+        "core.services.importer.mbox_tasks.process_mbox_file_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id"
        success, response_data = ImportService.import_file(
@@ -364,7 +350,7 @@ def test_import_file_mbox_by_user_with_access_task(
    mailbox.accesses.create(user=user, role=MailboxRoleChoices.ADMIN)

    with patch(
-        "core.services.importer.tasks.process_mbox_file_task.delay"
+        "core.services.importer.mbox_tasks.process_mbox_file_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id"
        success, response_data = ImportService.import_file(
@@ -427,9 +413,10 @@ def test_import_file_no_access(user, domain, eml_key, mock_request):
def test_import_file_invalid_file(admin_user, mailbox, mock_request):
    """Test import with an invalid file."""
    # Create an invalid file (not EML or MBOX)
-    invalid_content = b"Invalid file content"
+    # Use real PDF magic bytes so python-magic detects it as application/pdf
+    invalid_content = b"%PDF-1.4 invalid content"
    invalid_file = SimpleUploadedFile(
-        "test.pdf", invalid_content, content_type="application/pdf"
+        "test.mbox", invalid_content, content_type="application/mbox"
    )
    invalid_file_key = get_file_key(admin_user.id, invalid_file.name)
    storage = storages["message-imports"]
@@ -443,11 +430,8 @@ def test_import_file_invalid_file(admin_user, mailbox, mock_request):

    try:
        with patch(
-            "core.services.importer.tasks.process_eml_file_task.delay"
+            "core.services.importer.eml_tasks.process_eml_file_task.delay"
        ) as mock_task:
-            # The task should not be called for invalid files
-            mock_task.assert_not_called()
-
            success, response_data = ImportService.import_file(
                file_key=invalid_file_key,
                recipient=mailbox,
@@ -459,6 +443,8 @@ def test_import_file_invalid_file(admin_user, mailbox, mock_request):
        assert "detail" in response_data
        assert "Invalid file format" in response_data["detail"]
        assert Message.objects.count() == 0
+        # The task should not be called for invalid files
+        mock_task.assert_not_called()
    finally:
        # Clean up: delete the file from S3 after the test
        s3_client.delete_object(
@@ -470,7 +456,7 @@ def test_import_file_invalid_file(admin_user, mailbox, mock_request):
def test_import_imap_by_superuser(admin_user, mailbox, mock_request):
    """Test successful IMAP import."""
    with patch(
-        "core.services.importer.tasks.import_imap_messages_task.delay"
+        "core.services.importer.imap_tasks.import_imap_messages_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id"
        success, response_data = ImportService.import_imap(
@@ -504,7 +490,7 @@ def test_import_imap_by_user_with_access(user, mailbox, mock_request, role):
    mailbox.accesses.create(user=user, role=role)

    with patch(
-        "core.services.importer.tasks.import_imap_messages_task.delay"
+        "core.services.importer.imap_tasks.import_imap_messages_task.delay"
    ) as mock_task:
        mock_task.return_value.id = "fake-task-id"
        success, response_data = ImportService.import_imap(
@@ -550,7 +536,7 @@ def test_import_imap_task_error(admin_user, mailbox, mock_request):
    mailbox.accesses.create(user=admin_user, role=MailboxRoleChoices.ADMIN)

    with patch(
-        "core.services.importer.tasks.import_imap_messages_task.delay"
+        "core.services.importer.imap_tasks.import_imap_messages_task.delay"
    ) as mock_task:
        mock_task.side_effect = Exception("Task error")
        success, response_data = ImportService.import_imap(
@@ -566,7 +552,7 @@ def test_import_imap_task_error(admin_user, mailbox, mock_request):

    assert success is False
    assert "detail" in response_data
-    assert "Task error" in response_data["detail"]
+    assert "error" in response_data["detail"].lower()


def test_import_imap_messages_by_superuser(admin_user, mailbox, mock_request):
@@ -805,3 +791,119 @@ Test message body 1"""
    assert mock_is_auto_labels_enabled.call_count == 0
    assert mock_assign_label_to_thread.call_count == 0
    assert mock_summarize_thread.call_count == 0


# --- Filename disambiguation tests ---


@pytest.mark.django_db
def test_import_file_eml_disambiguated_by_filename(admin_user, mailbox, mock_request):
    """Test that a .eml file detected as text/plain is routed to EML task via filename."""
    # Create a file that magic detects as text/plain but has .eml extension
    eml_content = b"From: sender@example.com\r\nTo: recipient@example.com\r\nSubject: Test\r\n\r\nBody"
    storage = storages["message-imports"]
    s3_client = storage.connection.meta.client
    file_key = get_file_key(admin_user.id, "test.eml")
    s3_client.put_object(
        Bucket=storage.bucket_name,
        Key=file_key,
        Body=eml_content,
        ContentType="text/plain",
    )

    try:
        with (
            patch(
                "core.services.importer.eml_tasks.process_eml_file_task.delay"
            ) as mock_eml_task,
            patch(
                "core.services.importer.mbox_tasks.process_mbox_file_task.delay"
            ) as mock_mbox_task,
        ):
            mock_eml_task.return_value.id = "fake-eml-task-id"
            success, response_data = ImportService.import_file(
                file_key=file_key,
                recipient=mailbox,
                user=admin_user,
                request=mock_request,
                filename="test.eml",
            )

            assert success is True
            assert response_data["type"] == "eml"
            mock_eml_task.assert_called_once()
            mock_mbox_task.assert_not_called()
    finally:
        s3_client.delete_object(Bucket=storage.bucket_name, Key=file_key)


@pytest.mark.django_db
def test_import_file_mbox_disambiguated_by_filename(admin_user, mailbox, mock_request):
    """Test that a .mbox file detected as text/plain is routed to MBOX task via filename."""
    # text/plain content with .mbox extension
    mbox_content = (
        b"From sender@example.com Mon Jan 1 00:00:00 2025\r\n"
        b"From: sender@example.com\r\nSubject: Test\r\n\r\nBody"
    )
    storage = storages["message-imports"]
    s3_client = storage.connection.meta.client
    file_key = get_file_key(admin_user.id, "test.mbox")
    s3_client.put_object(
        Bucket=storage.bucket_name,
        Key=file_key,
        Body=mbox_content,
        ContentType="text/plain",
    )

    try:
        with patch(
            "core.services.importer.mbox_tasks.process_mbox_file_task.delay"
        ) as mock_mbox_task:
            mock_mbox_task.return_value.id = "fake-mbox-task-id"
            success, response_data = ImportService.import_file(
                file_key=file_key,
                recipient=mailbox,
                user=admin_user,
                request=mock_request,
                filename="test.mbox",
            )

            assert success is True
            assert response_data["type"] == "mbox"
            mock_mbox_task.assert_called_once()
    finally:
        s3_client.delete_object(Bucket=storage.bucket_name, Key=file_key)


@pytest.mark.django_db
def test_import_file_without_filename_falls_back_to_mime(
    admin_user, mailbox, mock_request
):
    """Test that without filename, text/plain files still work (fall through to MBOX/EML)."""
    eml_content = b"From: sender@example.com\r\nTo: recipient@example.com\r\nSubject: Test\r\n\r\nBody"
    storage = storages["message-imports"]
    s3_client = storage.connection.meta.client
    file_key = get_file_key(admin_user.id, "noext")
    s3_client.put_object(
        Bucket=storage.bucket_name,
        Key=file_key,
        Body=eml_content,
        ContentType="text/plain",
    )

    try:
        with patch(
            "core.services.importer.mbox_tasks.process_mbox_file_task.delay"
        ) as mock_mbox_task:
            mock_mbox_task.return_value.id = "fake-task-id"
            # Without filename, text/plain hits MBOX first (as before)
            success, _response_data = ImportService.import_file(
                file_key=file_key,
                recipient=mailbox,
                user=admin_user,
                request=mock_request,
            )

            assert success is True
    finally:
        s3_client.delete_object(Bucket=storage.bucket_name, Key=file_key)
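The three tests above exercise filename-based disambiguation when MIME detection returns an ambiguous `text/plain`. One plausible shape of that routing decision (illustrative only, not the actual `ImportService` code; the function name and return values are assumptions):

```python
import os
from typing import Optional


def pick_import_task(mime_type: str, filename: Optional[str]) -> str:
    """Pick an import task type from a detected MIME type plus an optional
    filename. The extension disambiguates ambiguous text/plain content."""
    ext = os.path.splitext(filename)[1].lower() if filename else ""
    if mime_type == "message/rfc822" or ext == ".eml":
        return "eml"
    if mime_type == "application/mbox" or ext == ".mbox":
        return "mbox"
    if ext == ".pst":
        return "pst"
    # Ambiguous text/plain without a filename: fall through to MBOX first,
    # matching the pre-existing behaviour the last test asserts.
    if mime_type == "text/plain":
        return "mbox"
    raise ValueError("Invalid file format")
```

Under this sketch, `text/plain` + `test.eml` routes to the EML task, `text/plain` + `test.mbox` to the MBOX task, and `text/plain` with no filename falls back to MBOX.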
1355  src/backend/core/tests/importer/test_pst_import.py  Normal file
File diff suppressed because it is too large
306  src/backend/core/tests/importer/test_s3_seekable.py  Normal file
@@ -0,0 +1,306 @@
"""Tests for S3SeekableReader."""

import io
from unittest.mock import Mock

import pytest

from core.services.importer.s3_seekable import (
    BUFFER_CENTERED,
    BUFFER_FORWARD,
    S3SeekableReader,
)


class TestS3SeekableReader:
    """Tests for S3SeekableReader."""

    def test_read_basic(self):
        """Test basic read operation."""
        test_data = b"Hello, World! This is a test file."
        mock_s3 = Mock()
        mock_s3.head_object.return_value = {"ContentLength": len(test_data)}

        body_mock = Mock()
        body_mock.read.return_value = test_data
        mock_s3.get_object.return_value = {"Body": body_mock}

        reader = S3SeekableReader(mock_s3, "test-bucket", "test-key", buffer_size=1024)

        assert reader.size == len(test_data)
        data = reader.read(5)
        assert data == b"Hello"
        assert reader.tell() == 5

    def test_seek_and_read(self):
        """Test seek followed by read."""
        test_data = b"Hello, World!"
        mock_s3 = Mock()
        mock_s3.head_object.return_value = {"ContentLength": len(test_data)}

        # Forward buffer: seeking to position 7 fetches from 7 onward
        body_mock = Mock()
        body_mock.read.return_value = test_data[7:]
        mock_s3.get_object.return_value = {"Body": body_mock}

        reader = S3SeekableReader(mock_s3, "test-bucket", "test-key", buffer_size=1024)

        reader.seek(7)
        assert reader.tell() == 7
        data = reader.read(6)
        assert data == b"World!"

    def test_seek_from_end(self):
        """Test seeking from the end of file."""
        test_data = b"Hello, World!"
        mock_s3 = Mock()
        mock_s3.head_object.return_value = {"ContentLength": len(test_data)}

        # Forward buffer: seeking to position 7 fetches from 7 onward
        body_mock = Mock()
        body_mock.read.return_value = test_data[7:]
        mock_s3.get_object.return_value = {"Body": body_mock}

        reader = S3SeekableReader(mock_s3, "test-bucket", "test-key", buffer_size=1024)

        reader.seek(-6, io.SEEK_END)
        assert reader.tell() == 7

    def test_read_past_end(self):
        """Test reading past the end of the file."""
        test_data = b"Hi"
        mock_s3 = Mock()
        mock_s3.head_object.return_value = {"ContentLength": len(test_data)}

        body_mock = Mock()
        body_mock.read.return_value = test_data
        mock_s3.get_object.return_value = {"Body": body_mock}

        reader = S3SeekableReader(mock_s3, "test-bucket", "test-key", buffer_size=1024)

        data = reader.read(100)
        assert data == b"Hi"
        assert reader.tell() == 2

    def test_read_empty_at_end(self):
        """Test reading when already at end of file."""
        mock_s3 = Mock()
        mock_s3.head_object.return_value = {"ContentLength": 5}

        reader = S3SeekableReader(mock_s3, "test-bucket", "test-key", buffer_size=1024)
        reader.seek(0, io.SEEK_END)
        data = reader.read(10)
        assert data == b""

    def test_seekable_and_readable(self):
        """Test seekable() and readable() return True."""
        mock_s3 = Mock()
        mock_s3.head_object.return_value = {"ContentLength": 10}

        reader = S3SeekableReader(mock_s3, "test-bucket", "test-key")
        assert reader.seekable() is True
        assert reader.readable() is True

    def test_buffer_reuse(self):
        """Test that buffered data is reused without extra S3 calls."""
        test_data = b"ABCDEFGHIJ"
        mock_s3 = Mock()
        mock_s3.head_object.return_value = {"ContentLength": len(test_data)}

        body_mock = Mock()
        body_mock.read.return_value = test_data
        mock_s3.get_object.return_value = {"Body": body_mock}

        reader = S3SeekableReader(mock_s3, "test-bucket", "test-key", buffer_size=1024)

        reader.read(3)  # Reads "ABC", fetches buffer
        reader.read(3)  # Reads "DEF", should reuse buffer

        # Only one get_object call should have been made
        assert mock_s3.get_object.call_count == 1

    def test_forward_buffer_sequential(self):
        """Test that forward buffer fetches from the read position onward."""
        test_data = b"A" * 100 + b"B" * 100 + b"C" * 100
        mock_s3 = Mock()
        mock_s3.head_object.return_value = {"ContentLength": 300}

        recorded_ranges = []

        def mock_get_object(**kwargs):
            recorded_ranges.append(kwargs.get("Range"))
            _, range_spec = kwargs["Range"].split("=")
            start, end = range_spec.split("-")
            body = Mock()
            body.read.return_value = test_data[int(start) : int(end) + 1]
            return {"Body": body}

        mock_s3.get_object = Mock(side_effect=mock_get_object)

        reader = S3SeekableReader(
            mock_s3,
            "test-bucket",
            "test-key",
            buffer_size=150,
            buffer_strategy=BUFFER_FORWARD,
        )

        # Read from start — buffer covers [0, 149]
        assert reader.read(100) == b"A" * 100
        assert recorded_ranges[-1] == "bytes=0-149"

        # Read next 100 — position 100, buffer covers [100, 149] = 50 bytes available
        # Serves 50 from buffer, then fetches from 150 onward for remaining 50
        assert reader.read(100) == b"B" * 100
        assert recorded_ranges[-1] == "bytes=150-299"

    def test_invalid_buffer_strategy(self):
        """Test that an invalid buffer strategy raises ValueError."""
        mock_s3 = Mock()
        with pytest.raises(ValueError, match="Unknown buffer_strategy"):
            S3SeekableReader(
                mock_s3, "test-bucket", "test-key", buffer_strategy="invalid"
            )

    def test_centered_buffer_bidirectional(self):
        """Test centered buffer with bidirectional access pattern.

        Simulates a 500-byte file with 5 messages. Buffer size = 200.
        Pass 1 reads sequentially (0→499), Pass 2 reads in reverse (400→0).
        Centered buffering should reduce total S3 requests vs forward-only.
        """
        # Create 500 bytes of predictable data (5 x 100-byte blocks)
        test_data = b""
        for i in range(5):
            block = f"MSG{i}".encode().ljust(100, b".")
            test_data += block
        assert len(test_data) == 500

        recorded_ranges = []
        mock_s3 = Mock()
        mock_s3.head_object.return_value = {"ContentLength": 500}

        def mock_get_object(**kwargs):
            recorded_ranges.append(kwargs.get("Range"))
            _, range_spec = kwargs["Range"].split("=")
            start, end = range_spec.split("-")
            start, end = int(start), int(end)
            body = Mock()
            body.read.return_value = test_data[start : end + 1]
            return {"Body": body}

        mock_s3.get_object = Mock(side_effect=mock_get_object)

        reader = S3SeekableReader(
            mock_s3,
            "test-bucket",
            "test-key",
            buffer_size=200,
            buffer_strategy=BUFFER_CENTERED,
        )

        # Pass 1: Sequential read (simulating indexing)
        reader.seek(0)
        d0 = reader.read(100)  # pos 0-99
        assert d0[:4] == b"MSG0"

        d1 = reader.read(100)  # pos 100-199
        assert d1[:4] == b"MSG1"

        d2 = reader.read(100)  # pos 200-299
        assert d2[:4] == b"MSG2"

        d3 = reader.read(100)  # pos 300-399
        assert d3[:4] == b"MSG3"

        d4 = reader.read(100)  # pos 400-499
        assert d4[:4] == b"MSG4"

        # Pass 2: Reverse read (simulating chronological processing)
        reader.seek(400)
        r4 = reader.read(100)
        assert r4[:4] == b"MSG4"

        reader.seek(300)
        r3 = reader.read(100)
        assert r3[:4] == b"MSG3"

        reader.seek(200)
        r2 = reader.read(100)
        assert r2[:4] == b"MSG2"

        reader.seek(100)
        r1 = reader.read(100)
        assert r1[:4] == b"MSG1"

        reader.seek(0)
        r0 = reader.read(100)
        assert r0[:4] == b"MSG0"

        # All messages were read correctly
        assert d0 == r0
        assert d1 == r1
        assert d2 == r2
        assert d3 == r3
        assert d4 == r4

        # Verify the number of S3 requests is reasonable
        # Without centering (forward-only buffer) reverse pass would need 5 requests
        # With centering, some reverse reads hit the buffer
        assert len(recorded_ranges) < 10  # Reasonable upper bound

    def test_context_manager(self):
        """Test that S3SeekableReader works as a context manager."""
        test_data = b"Hello"
        mock_s3 = Mock()
        mock_s3.head_object.return_value = {"ContentLength": len(test_data)}

        body_mock = Mock()
        body_mock.read.return_value = test_data
        mock_s3.get_object.return_value = {"Body": body_mock}

        with S3SeekableReader(mock_s3, "test-bucket", "test-key") as reader:
            data = reader.read(5)
            assert data == b"Hello"

        # After context exit, buffer should be released
        assert reader._buffer == b""  # pylint: disable=protected-access

    def test_read_larger_than_buffer(self):
        """Test reading more bytes than the buffer size spans multiple fills."""
        test_data = b"A" * 100 + b"B" * 100 + b"C" * 100  # 300 bytes
        mock_s3 = Mock()
        mock_s3.head_object.return_value = {"ContentLength": 300}

        def mock_get_object(**kwargs):
            _, range_spec = kwargs["Range"].split("=")
            start, end = range_spec.split("-")
            body = Mock()
            body.read.return_value = test_data[int(start) : int(end) + 1]
            return {"Body": body}

        mock_s3.get_object = Mock(side_effect=mock_get_object)

        reader = S3SeekableReader(mock_s3, "test-bucket", "test-key", buffer_size=100)

        # Read all 300 bytes with buffer_size=100 — requires 3 fills
        data = reader.read(300)
        assert len(data) == 300
        assert data == test_data
        assert mock_s3.get_object.call_count == 3

    def test_read_all_default(self):
        """Test read() with no size argument reads entire file."""
        test_data = b"Hello, World!"
        mock_s3 = Mock()
        mock_s3.head_object.return_value = {"ContentLength": len(test_data)}

        body_mock = Mock()
        body_mock.read.return_value = test_data
        mock_s3.get_object.return_value = {"Body": body_mock}

        reader = S3SeekableReader(mock_s3, "test-bucket", "test-key", buffer_size=1024)

        data = reader.read()
        assert data == test_data
        assert reader.tell() == len(test_data)
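The test file above pins down the reader's seek/read contract. A minimal sketch of the core idea, substituting a generic range-fetch callable for a real S3 client (the actual `S3SeekableReader` adds buffering strategies and a block-aligned LRU cache on top of this; the class and parameter names below are illustrative):

```python
import io


class RangeReader(io.RawIOBase):
    """Seekable, read-only view over a source that serves byte ranges."""

    def __init__(self, fetch_range, size):
        self._fetch = fetch_range  # callable (start, end_inclusive) -> bytes
        self._size = size
        self._pos = 0

    def seekable(self):
        return True

    def readable(self):
        return True

    def seek(self, offset, whence=io.SEEK_SET):
        if whence == io.SEEK_SET:
            self._pos = offset
        elif whence == io.SEEK_CUR:
            self._pos += offset
        elif whence == io.SEEK_END:
            self._pos = self._size + offset
        return self._pos

    def tell(self):
        return self._pos

    def read(self, size=-1):
        # Clamp to the remaining bytes; a negative size means "read all".
        if size < 0 or self._pos + size > self._size:
            size = self._size - self._pos
        if size <= 0:
            return b""
        data = self._fetch(self._pos, self._pos + size - 1)
        self._pos += len(data)
        return data


blob = b"Hello, World!"
reader = RangeReader(lambda start, end: blob[start : end + 1], len(blob))
reader.seek(-6, io.SEEK_END)
```

Swapping the lambda for an `s3.get_object(Range="bytes=start-end")` call gives the uncached version of the reader; the buffering strategies then decide how much extra data each fetch pulls in around the requested range.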
BIN  src/backend/core/tests/resources/Outlook.pst  Normal file
Binary file not shown.
BIN  src/backend/core/tests/resources/sample.pst  Normal file
Binary file not shown.
@@ -1,12 +1,12 @@
|
||||
"""Tests for importer tasks."""
|
||||
# pylint: disable=redefined-outer-name, no-value-for-parameter
|
||||
|
||||
import logging
|
||||
import uuid
|
||||
from io import BytesIO
|
||||
from unittest.mock import MagicMock, Mock, patch
|
||||
from unittest.mock import MagicMock, patch
|
||||
|
||||
from django.core.exceptions import ValidationError
|
||||
from django.core.files.storage import storages
|
||||
|
||||
import pytest
|
||||
|
||||
@@ -14,14 +14,12 @@ from core import models
from core.factories import MailboxFactory, UserFactory
from core.mda.inbound import deliver_inbound_message
from core.models import Message
from core.services.importer.tasks import (
from core.services.importer.mbox_tasks import (
extract_date_from_headers,
index_mbox_messages,
process_mbox_file_task,
scan_mbox_messages,
stream_mbox_messages,
)

logger = logging.getLogger(__name__)


@pytest.fixture
def mailbox(user):
@@ -39,27 +37,38 @@ def user():

@pytest.fixture
def sample_mbox_content():
"""Create a sample MBOX file content."""
return b"""From user@example.com Thu Jan 1 00:00:00 2024
Subject: Test Message 1
From: sender1@example.com
To: recipient@example.com
"""Create a sample MBOX file content with dates and message IDs.

This is test message 1.

From user@example.com Thu Jan 1 00:00:01 2024
Subject: Test Message 2
From: sender2@example.com
To: recipient@example.com

This is test message 2.

From user@example.com Thu Jan 1 00:00:02 2024
Messages are intentionally out of chronological order to test sorting.
"""
return b"""From user@example.com Thu Jan 3 00:00:00 2024
Message-ID: <msg3@example.com>
Subject: Test Message 3
From: sender3@example.com
To: recipient@example.com
Date: Wed, 3 Jan 2024 00:00:00 +0000

This is test message 3.

From user@example.com Thu Jan 1 00:00:00 2024
Message-ID: <msg1@example.com>
Subject: Test Message 1
From: sender1@example.com
To: recipient@example.com
Date: Mon, 1 Jan 2024 00:00:00 +0000

This is test message 1.

From user@example.com Thu Jan 2 00:00:00 2024
Message-ID: <msg2@example.com>
Subject: Test Message 2
From: sender2@example.com
To: recipient@example.com
Date: Tue, 2 Jan 2024 00:00:00 +0000
In-Reply-To: <msg1@example.com>
References: <msg1@example.com>

This is test message 2.
"""


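The fixture above deliberately stores its messages out of chronological order (msg3, msg1, msg2), so the import path must sort by the `Date` header before delivering. A stdlib-only sketch of such a sort; `sort_raw_messages_by_date` is an illustrative name, not the project's helper:

```python
from email.parser import BytesHeaderParser
from email.utils import parsedate_to_datetime


def sort_raw_messages_by_date(raw_messages):
    """Sort raw RFC 5322 messages chronologically by their Date header.

    Messages without a parseable Date sort last, keeping their relative order.
    (Illustrative helper, not the project's actual implementation.)
    """
    parser = BytesHeaderParser()

    def key(item):
        index, raw = item
        try:
            date = parsedate_to_datetime(parser.parsebytes(raw)["Date"])
            return (0, date, index)
        except (TypeError, ValueError):
            # Missing or malformed Date header
            return (1, None, index)

    return [raw for _, raw in sorted(enumerate(raw_messages), key=key)]
```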
@@ -71,19 +80,123 @@ def mock_task():
return task


def mock_storage_open(content: bytes):
"""Helper to create a mock storage that returns the given content.
def _upload_to_s3(content, file_key="test-mbox-key"):
"""Upload content to the message-imports S3 bucket (real MinIO)."""
storage = storages["message-imports"]
s3_client = storage.connection.meta.client
s3_client.put_object(
Bucket=storage.bucket_name,
Key=file_key,
Body=content,
)
return file_key, storage, s3_client

The mock allows multiple opens since the task opens the file twice:
once for counting and once for processing.
"""

def create_file(*args, **kwargs):
return BytesIO(content)
@pytest.mark.django_db
class TestExtractDateFromHeaders:
"""Test the extract_date_from_headers function."""

mock_storage = Mock()
mock_storage.open = Mock(side_effect=create_file)
return mock_storage
def test_extract_valid_date(self):
"""Test extracting a valid RFC5322 date."""
raw = b"From: a@b.com\r\nDate: Mon, 1 Jan 2024 00:00:00 +0000\r\n\r\nBody"
result = extract_date_from_headers(raw)
assert result is not None
assert result.year == 2024
assert result.month == 1
assert result.day == 1

def test_extract_no_date_header(self):
"""Test message without Date header returns None."""
raw = b"From: a@b.com\r\nSubject: Test\r\n\r\nBody"
result = extract_date_from_headers(raw)
assert result is None

def test_extract_invalid_date(self):
"""Test message with invalid date returns None."""
raw = b"From: a@b.com\r\nDate: not-a-date\r\n\r\nBody"
result = extract_date_from_headers(raw)
assert result is None

def test_extract_date_only_reads_headers(self):
"""Test that only headers are parsed, not body."""
raw = b"Subject: Test\r\n\r\nDate: Mon, 1 Jan 2024 00:00:00 +0000"
result = extract_date_from_headers(raw)
assert result is None # Date in body should be ignored

def test_extract_date_lf_only(self):
"""Test with LF-only line endings."""
raw = b"From: a@b.com\nDate: Tue, 2 Jan 2024 10:00:00 +0000\n\nBody"
result = extract_date_from_headers(raw)
assert result is not None
assert result.day == 2


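The tests above pin down the contract of `extract_date_from_headers`: parse only the header block, return a `datetime` for a valid RFC 5322 `Date`, and `None` for a missing or malformed one. A stdlib-only sketch of that contract (an illustrative stand-in; the real helper may differ):

```python
from email.parser import BytesHeaderParser
from email.utils import parsedate_to_datetime


def extract_date_sketch(raw: bytes):
    """Parse only the header block of a raw message and return its Date, or None.

    Illustrative stand-in for extract_date_from_headers, not the project code.
    """
    # BytesHeaderParser stops at the first blank line, so a "Date:" in the
    # body is never treated as a header.
    headers = BytesHeaderParser().parsebytes(raw)
    value = headers["Date"]
    if value is None:
        return None
    try:
        return parsedate_to_datetime(value)
    except (TypeError, ValueError):
        # Unparseable date value
        return None
```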
@pytest.mark.django_db
class TestIndexMboxMessages:
"""Test the index_mbox_messages function."""

def test_index_basic(self, sample_mbox_content):
"""Test basic indexing of mbox content."""
file = BytesIO(sample_mbox_content)
indices = index_mbox_messages(file)
assert len(indices) == 3

def test_index_has_dates(self, sample_mbox_content):
"""Test that dates are extracted during indexing."""
file = BytesIO(sample_mbox_content)
indices = index_mbox_messages(file)
# All 3 messages have dates
for idx in indices:
assert idx.date is not None

def test_index_byte_offsets(self, sample_mbox_content):
"""Test that byte offsets allow correct message extraction."""
file = BytesIO(sample_mbox_content)
indices = index_mbox_messages(file)
# Each message should be extractable
for idx in indices:
file.seek(idx.start_byte)
content = file.read(idx.end_byte - idx.start_byte + 1)
assert b"Subject: " in content

def test_index_empty_file(self):
"""Test indexing an empty file."""
file = BytesIO(b"")
indices = index_mbox_messages(file)
assert len(indices) == 0

def test_index_no_from_lines(self):
"""Test indexing content without From separators."""
file = BytesIO(b"Subject: Test\nFrom: a@b.com\n\nBody\n")
indices = index_mbox_messages(file)
assert len(indices) == 0

def test_index_single_message(self):
"""Test indexing a single message."""
content = b"""From user@example.com Thu Jan 1 00:00:00 2024
Subject: Single
From: a@b.com
Date: Mon, 1 Jan 2024 00:00:00 +0000

Body
"""
file = BytesIO(content)
indices = index_mbox_messages(file)
assert len(indices) == 1
assert indices[0].date is not None

def test_index_message_without_date(self):
"""Test indexing a message without a Date header."""
content = b"""From user@example.com Thu Jan 1 00:00:00 2024
Subject: No Date
From: a@b.com

Body
"""
file = BytesIO(content)
indices = index_mbox_messages(file)
assert len(indices) == 1
assert indices[0].date is None


@pytest.mark.django_db
@@ -92,29 +205,19 @@ class TestProcessMboxFileTask:

def test_task_process_mbox_file_success(self, mailbox, sample_mbox_content):
"""Test successful MBOX file processing."""
# Mock deliver_inbound_message to always succeed
with patch("core.mda.inbound.deliver_inbound_message", return_value=True):
# Create a mock task instance
file_key, storage, s3_client = _upload_to_s3(sample_mbox_content)

try:
mock_task = MagicMock()
mock_task.update_state = MagicMock()

# Mock storage
mock_storage = mock_storage_open(sample_mbox_content)

with (
patch.object(
process_mbox_file_task, "update_state", mock_task.update_state
),
patch("core.services.importer.tasks.storages") as mock_storages,
with patch.object(
process_mbox_file_task, "update_state", mock_task.update_state
):
mock_storages.__getitem__.return_value = mock_storage

# Run the task
task_result = process_mbox_file_task(
file_key="test-file-key.mbox", recipient_id=str(mailbox.id)
file_key=file_key, recipient_id=str(mailbox.id)
)

# Verify task result
assert task_result["status"] == "SUCCESS"
assert (
task_result["result"]["message_status"]
@@ -126,215 +229,115 @@ class TestProcessMboxFileTask:
assert task_result["result"]["failure_count"] == 0
assert task_result["result"]["current_message"] == 3

# Verify progress updates
assert mock_task.update_state.call_count == 5 # 4 PROGRESS + 1 SUCCESS
# 1 "Indexing" + 3 per-message PROGRESS = 4
assert mock_task.update_state.call_count == 4

# First message
# Verify "Indexing messages" update
mock_task.update_state.assert_any_call(
state="PROGRESS",
meta={
"result": {
"message_status": "Processing message 1 of 3",
"total_messages": 3,
"message_status": "Indexing messages",
"total_messages": None,
"success_count": 0,
"failure_count": 0,
"type": "mbox",
"current_message": 1,
"current_message": 0,
},
"error": None,
},
)

# Second message
mock_task.update_state.assert_any_call(
state="PROGRESS",
meta={
"result": {
"message_status": "Processing message 2 of 3",
"total_messages": 3,
"success_count": 1,
"failure_count": 0,
"type": "mbox",
"current_message": 2,
# Verify per-message progress
for i in range(1, 4):
mock_task.update_state.assert_any_call(
state="PROGRESS",
meta={
"result": {
"message_status": f"Processing message {i} of 3",
"total_messages": 3,
"success_count": i - 1,
"failure_count": 0,
"type": "mbox",
"current_message": i,
},
"error": None,
},
"error": None,
},
)
)

# Third message
mock_task.update_state.assert_any_call(
state="PROGRESS",
meta={
"result": {
"message_status": "Processing message 3 of 3",
"total_messages": 3,
"success_count": 2,
"failure_count": 0,
"type": "mbox",
"current_message": 3,
},
"error": None,
},
)

# Verify success update
mock_task.update_state.assert_called_with(
state="SUCCESS",
meta={
"result": task_result["result"],
"error": None,
},
)

# Verify messages were created
# Verify messages were created in chronological order
message_count = Message.objects.count()
assert message_count == 3, f"Expected 3 messages, got {message_count}"
messages = Message.objects.order_by("created_at")
assert messages[0].subject == "Test Message 3"
# Sorted by date: Jan 1, Jan 2, Jan 3
assert messages[0].subject == "Test Message 1"
assert messages[1].subject == "Test Message 2"
assert messages[2].subject == "Test Message 1"
assert messages[2].subject == "Test Message 3"

# Verify threading: msg2 replies to msg1
assert messages[1].thread == messages[0].thread
finally:
s3_client.delete_object(Bucket=storage.bucket_name, Key=file_key)

def test_task_process_mbox_file_partial_success(self, mailbox, sample_mbox_content):
"""Test MBOX processing with some messages failing."""

# Mock deliver_inbound_message to fail for the second message
original_deliver = deliver_inbound_message

def mock_deliver(recipient_email, parsed_email, raw_data, **kwargs):
# Get the subject from the parsed email dictionary
subject = parsed_email.get("headers", {}).get("subject", "")

# Return False for Test Message 2 without creating the message
if subject == "Test Message 2":
return False

# For other messages, call the original function to create the message
return original_deliver(recipient_email, parsed_email, raw_data, **kwargs)

# Create a mock task instance
mock_task = MagicMock()
mock_task.update_state = MagicMock()
file_key, storage, s3_client = _upload_to_s3(sample_mbox_content)

# Mock storage
mock_storage = mock_storage_open(sample_mbox_content)
try:
mock_task = MagicMock()
mock_task.update_state = MagicMock()

with (
patch.object(
process_mbox_file_task, "update_state", mock_task.update_state
),
patch(
"core.services.importer.tasks.deliver_inbound_message",
side_effect=mock_deliver,
),
patch("core.services.importer.tasks.storages") as mock_storages,
):
mock_storages.__getitem__.return_value = mock_storage
# Call the task once
task_result = process_mbox_file_task("test-file-key.mbox", str(mailbox.id))
with (
patch.object(
process_mbox_file_task, "update_state", mock_task.update_state
),
patch(
"core.services.importer.mbox_tasks.deliver_inbound_message",
side_effect=mock_deliver,
),
):
task_result = process_mbox_file_task(file_key, str(mailbox.id))

# Verify task result
assert task_result["status"] == "SUCCESS"
assert (
task_result["result"]["message_status"]
== "Completed processing messages"
)
assert task_result["result"]["type"] == "mbox"
assert task_result["result"]["total_messages"] == 3
assert task_result["result"]["success_count"] == 2
assert task_result["result"]["failure_count"] == 1
assert task_result["result"]["current_message"] == 3
assert task_result["status"] == "SUCCESS"
assert task_result["result"]["total_messages"] == 3
assert task_result["result"]["success_count"] == 2
assert task_result["result"]["failure_count"] == 1
assert task_result["result"]["current_message"] == 3

# Verify progress updates
assert mock_task.update_state.call_count == 5 # 4 PROGRESS + 1 SUCCESS
# 1 indexing + 3 per-message PROGRESS = 4
assert mock_task.update_state.call_count == 4

# First message (success)
mock_task.update_state.assert_any_call(
state="PROGRESS",
meta={
"result": {
"message_status": "Processing message 1 of 3",
"total_messages": 3,
"success_count": 0,
"failure_count": 0,
"type": "mbox",
"current_message": 1,
},
"error": None,
},
)

# Second message (failure)
mock_task.update_state.assert_any_call(
state="PROGRESS",
meta={
"result": {
"message_status": "Processing message 2 of 3",
"total_messages": 3,
"success_count": 1,
"failure_count": 0,
"type": "mbox",
"current_message": 2,
},
"error": None,
},
)

# Third message (success)
mock_task.update_state.assert_any_call(
state="PROGRESS",
meta={
"result": {
"message_status": "Processing message 3 of 3",
"total_messages": 3,
"success_count": 1,
"failure_count": 1,
"type": "mbox",
"current_message": 3,
},
"error": None,
},
)

# Verify success update
mock_task.update_state.assert_called_with(
state="SUCCESS",
meta={
"result": task_result["result"],
"error": None,
},
)

# Verify messages were created
assert Message.objects.count() == 2
messages = Message.objects.order_by("-created_at")
assert messages[0].subject == "Test Message 1"
assert messages[1].subject == "Test Message 3"
# Verify messages: msg1 and msg3 created, msg2 failed
assert Message.objects.count() == 2
subjects = sorted(Message.objects.values_list("subject", flat=True))
assert "Test Message 1" in subjects
assert "Test Message 3" in subjects
finally:
s3_client.delete_object(Bucket=storage.bucket_name, Key=file_key)

def test_task_process_mbox_file_mailbox_not_found(self, sample_mbox_content):
"""Test MBOX processing with non-existent mailbox."""
# Create a mock task instance
mock_task = MagicMock()
mock_task.update_state = MagicMock()

# Use a valid UUID that doesn't exist in the database
non_existent_id = str(uuid.uuid4())

# Mock storage
mock_storage = mock_storage_open(sample_mbox_content)

with (
patch.object(
process_mbox_file_task, "update_state", mock_task.update_state
),
patch("core.services.importer.tasks.storages") as mock_storages,
with patch.object(
process_mbox_file_task, "update_state", mock_task.update_state
):
mock_storages.__getitem__.return_value = mock_storage
# Run the task with non-existent mailbox
task_result = process_mbox_file_task(
file_key="test-file-key.mbox", recipient_id=non_existent_id
)

# Verify task result
assert task_result["status"] == "FAILURE"
assert (
task_result["result"]["message_status"] == "Failed to process messages"
@@ -348,350 +351,89 @@ class TestProcessMboxFileTask:
|
||||
f"Recipient mailbox {non_existent_id} not found" in task_result["error"]
|
||||
)
|
||||
|
||||
# Verify only failure update was called
|
||||
assert mock_task.update_state.call_count == 1
|
||||
mock_task.update_state.assert_called_once_with(
|
||||
state="FAILURE",
|
||||
meta={
|
||||
"result": task_result["result"],
|
||||
"error": task_result["error"],
|
||||
},
|
||||
)
|
||||
# No update_state calls — failure status is in the returned dict
|
||||
mock_task.update_state.assert_not_called()
|
||||
|
||||
# Verify no messages were created
|
||||
assert Message.objects.count() == 0
|
||||
|
||||
def test_task_process_mbox_file_parse_error(self, mailbox, sample_mbox_content):
|
||||
"""Test MBOX processing with message parsing error."""
|
||||
|
||||
# Mock parse_email_message to raise an exception for all messages
|
||||
def mock_parse(*args, **kwargs):
|
||||
raise ValidationError("Invalid message format")
|
||||
|
||||
# Create a mock task instance
|
||||
mock_task = MagicMock()
|
||||
mock_task.update_state = MagicMock()
|
||||
file_key, storage, s3_client = _upload_to_s3(sample_mbox_content)
|
||||
|
||||
# Mock storage
|
||||
mock_storage = mock_storage_open(sample_mbox_content)
|
||||
try:
|
||||
mock_task = MagicMock()
|
||||
mock_task.update_state = MagicMock()
|
||||
|
||||
with (
|
||||
patch(
|
||||
"core.services.importer.tasks.parse_email_message",
|
||||
side_effect=mock_parse,
|
||||
),
|
||||
patch.object(
|
||||
process_mbox_file_task, "update_state", mock_task.update_state
|
||||
),
|
||||
patch("core.services.importer.tasks.storages") as mock_storages,
|
||||
):
|
||||
mock_storages.__getitem__.return_value = mock_storage
|
||||
# Call the task
|
||||
task_result = process_mbox_file_task("test-file-key.mbox", str(mailbox.id))
|
||||
with (
|
||||
patch(
|
||||
"core.services.importer.mbox_tasks.parse_email_message",
|
||||
side_effect=mock_parse,
|
||||
),
|
||||
patch.object(
|
||||
process_mbox_file_task, "update_state", mock_task.update_state
|
||||
),
|
||||
):
|
||||
task_result = process_mbox_file_task(file_key, str(mailbox.id))
|
||||
|
||||
# Verify the result
|
||||
assert task_result["status"] == "SUCCESS"
|
||||
assert task_result["result"]["total_messages"] == 3
|
||||
assert (
|
||||
task_result["result"]["success_count"] == 0
|
||||
) # All messages should fail
|
||||
assert (
|
||||
task_result["result"]["failure_count"] == 3
|
||||
) # All messages should fail
|
||||
assert task_result["result"]["type"] == "mbox"
|
||||
assert task_result["status"] == "SUCCESS"
|
||||
assert task_result["result"]["total_messages"] == 3
|
||||
assert task_result["result"]["success_count"] == 0
|
||||
assert task_result["result"]["failure_count"] == 3
|
||||
|
||||
# Verify progress updates were called for all messages
|
||||
assert mock_task.update_state.call_count == 5 # 4 PROGRESS + 1 SUCCESS
|
||||
# 1 indexing + 3 per-message PROGRESS = 4
|
||||
assert mock_task.update_state.call_count == 4
|
||||
|
||||
# The first update should be for message 1 with failure_count 0
|
||||
mock_task.update_state.assert_any_call(
|
||||
state="PROGRESS",
|
||||
meta={
|
||||
"result": {
|
||||
"message_status": "Processing message 1 of 3",
|
||||
"total_messages": 3,
|
||||
"success_count": 0,
|
||||
"failure_count": 0, # No failures yet
|
||||
"type": "mbox",
|
||||
"current_message": 1,
|
||||
},
|
||||
"error": None,
|
||||
},
|
||||
)
|
||||
|
||||
# The second update should be for message 2 with failure_count 1
|
||||
mock_task.update_state.assert_any_call(
|
||||
state="PROGRESS",
|
||||
meta={
|
||||
"result": {
|
||||
"message_status": "Processing message 2 of 3",
|
||||
"total_messages": 3,
|
||||
"success_count": 0,
|
||||
"failure_count": 1, # One failure from message 1
|
||||
"type": "mbox",
|
||||
"current_message": 2,
|
||||
},
|
||||
"error": None,
|
||||
},
|
||||
)
|
||||
|
||||
# The third update should be for message 3 with failure_count 2
|
||||
mock_task.update_state.assert_any_call(
|
||||
state="PROGRESS",
|
||||
meta={
|
||||
"result": {
|
||||
"message_status": "Processing message 3 of 3",
|
||||
"total_messages": 3,
|
||||
"success_count": 0,
|
||||
"failure_count": 2, # Two failures from messages 1 and 2
|
||||
"type": "mbox",
|
||||
"current_message": 3,
|
||||
},
|
||||
"error": None,
|
||||
},
|
||||
)
|
||||
|
||||
# Verify final success update
|
||||
mock_task.update_state.assert_called_with(
|
||||
state="SUCCESS",
|
||||
meta={
|
||||
"result": task_result["result"],
|
||||
"error": None,
|
||||
},
|
||||
)
|
||||
|
||||
# Verify no messages were created
|
||||
assert Message.objects.count() == 0
|
||||
assert Message.objects.count() == 0
|
||||
finally:
|
||||
s3_client.delete_object(Bucket=storage.bucket_name, Key=file_key)
|
||||
|
||||
def test_task_process_mbox_file_empty(self, mailbox):
|
||||
"""Test processing an empty MBOX file."""
|
||||
# Create a mock task instance
|
||||
mock_task = MagicMock()
|
||||
mock_task.update_state = MagicMock()
|
||||
"""Test processing an empty MBOX file — returns success with zero messages."""
|
||||
file_key, storage, s3_client = _upload_to_s3(b"")
|
||||
|
||||
# Mock storage with empty content
|
||||
mock_storage = mock_storage_open(b"")
|
||||
try:
|
||||
mock_task = MagicMock()
|
||||
mock_task.update_state = MagicMock()
|
||||
|
||||
with (
|
||||
patch.object(
|
||||
with patch.object(
|
||||
process_mbox_file_task, "update_state", mock_task.update_state
|
||||
),
|
||||
patch("core.services.importer.tasks.storages") as mock_storages,
|
||||
patch("magic.Magic.from_buffer") as mock_magic_from_buffer,
|
||||
):
|
||||
mock_magic_from_buffer.return_value = "application/mbox"
|
||||
mock_storages.__getitem__.return_value = mock_storage
|
||||
# Run the task with empty content
|
||||
task_result = process_mbox_file_task(
|
||||
file_key="test-file-key.mbox", recipient_id=str(mailbox.id)
|
||||
)
|
||||
):
|
||||
task_result = process_mbox_file_task(
|
||||
file_key=file_key, recipient_id=str(mailbox.id)
|
||||
)
|
||||
|
||||
# Verify task result
|
||||
assert task_result["status"] == "SUCCESS"
|
||||
assert (
|
||||
task_result["result"]["message_status"]
|
||||
== "Completed processing messages"
|
||||
)
|
||||
assert task_result["result"]["type"] == "mbox"
|
||||
assert task_result["result"]["total_messages"] == 0
|
||||
assert task_result["result"]["success_count"] == 0
|
||||
assert task_result["result"]["failure_count"] == 0
|
||||
assert task_result["result"]["current_message"] == 0
|
||||
|
||||
# Verify 2 updates were called: 1 PROGRESS TO COUNT MESSAGES + 1 SUCCESS
|
||||
assert mock_task.update_state.call_count == 2
|
||||
mock_task.update_state.assert_called_with(
|
||||
state="SUCCESS",
|
||||
meta={
|
||||
"result": task_result["result"],
|
||||
"error": None,
|
||||
},
|
||||
)
|
||||
|
||||
# Verify no messages were created
|
||||
assert Message.objects.count() == 0
|
||||
assert task_result["status"] == "SUCCESS"
|
||||
assert task_result["result"]["total_messages"] == 0
|
||||
assert Message.objects.count() == 0
|
||||
finally:
|
||||
s3_client.delete_object(Bucket=storage.bucket_name, Key=file_key)
|
||||
|
||||
def test_task_process_mbox_invalid_file(self, mailbox):
|
||||
"""Test processing an invalid MBOX file."""
|
||||
# Create a mock task instance
|
||||
mock_task = MagicMock()
|
||||
mock_task.update_state = MagicMock()
|
||||
"""Test processing a non-text file (JPEG) — no messages found, returns success."""
|
||||
# JPEG magic bytes
|
||||
jpeg_content = b"\xff\xd8\xff\xe0" + b"\x00" * 100
|
||||
|
||||
# Mock storage with empty content
|
||||
mock_storage = mock_storage_open(b"")
|
||||
file_key, storage, s3_client = _upload_to_s3(jpeg_content)
|
||||
|
||||
with (
|
||||
patch.object(
|
||||
try:
|
||||
mock_task = MagicMock()
|
||||
mock_task.update_state = MagicMock()
|
||||
|
||||
with patch.object(
|
||||
process_mbox_file_task, "update_state", mock_task.update_state
|
||||
),
|
||||
patch("core.services.importer.tasks.storages") as mock_storages,
|
||||
):
|
||||
mock_storages.__getitem__.return_value = mock_storage
|
||||
# Run the task with empty content
|
||||
task_result = process_mbox_file_task(
|
||||
file_key="test-file-key.mbox", recipient_id=str(mailbox.id)
|
||||
)
|
||||
):
|
||||
task_result = process_mbox_file_task(
|
||||
file_key=file_key, recipient_id=str(mailbox.id)
|
||||
)
|
||||
|
||||
# Verify task result
|
||||
assert task_result["status"] == "FAILURE"
|
||||
assert (
|
||||
task_result["result"]["message_status"] == "Failed to process messages"
|
||||
)
|
||||
assert task_result["result"]["type"] == "mbox"
|
||||
assert task_result["result"]["total_messages"] == 0
|
||||
assert task_result["result"]["success_count"] == 0
|
||||
assert task_result["result"]["failure_count"] == 0
|
||||
assert task_result["result"]["current_message"] == 0
|
||||
assert task_result["error"] == "Expected MBOX file, got application/x-empty"
|
||||
|
||||
# Verify 2 updates were called: 1 PROGRESS TO COUNT MESSAGES + 1 FAILURE
|
||||
assert mock_task.update_state.call_count == 2
|
||||
mock_task.update_state.assert_called_with(
|
||||
state="FAILURE",
|
||||
meta={
|
||||
"result": task_result["result"],
|
||||
"error": task_result["error"],
|
||||
},
|
||||
)
|
||||
|
||||
# Verify no messages were created
|
||||
assert Message.objects.count() == 0
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
class TestStreamMboxMessages:
|
||||
"""Test the stream_mbox_messages function."""
|
||||
|
||||
def test_task_stream_mbox_messages_success(self, sample_mbox_content):
|
||||
"""Test successful streaming of MBOX file."""
|
||||
file = BytesIO(sample_mbox_content)
|
||||
message_positions, file_end = scan_mbox_messages(file)
|
||||
messages = list(stream_mbox_messages(file, message_positions, file_end))
|
||||
assert len(messages) == 3
|
||||
# Messages are in reverse order (newest first) due to the reversing in stream_mbox_messages
|
||||
assert b"Test Message 3" in messages[0]
|
||||
assert b"Test Message 2" in messages[1]
|
||||
assert b"Test Message 1" in messages[2]
|
||||
|
||||
def test_task_stream_mbox_messages_empty(self):
|
||||
"""Test streaming an empty MBOX file."""
|
||||
file = BytesIO(b"")
|
||||
message_positions, file_end = scan_mbox_messages(file)
|
||||
messages = list(stream_mbox_messages(file, message_positions, file_end))
|
||||
assert len(messages) == 0
|
||||
|
||||
def test_task_stream_mbox_messages_single_message(self):
|
||||
"""Test streaming a MBOX file with a single message."""
|
||||
content = b"""From user@example.com Thu Jan 1 00:00:00 2024
|
||||
Subject: Single Message
|
||||
From: sender@example.com
|
||||
To: recipient@example.com
|
||||
|
||||
This is a single message.
|
||||
"""
|
||||
file = BytesIO(content)
|
||||
message_positions, file_end = scan_mbox_messages(file)
|
||||
messages = list(stream_mbox_messages(file, message_positions, file_end))
|
||||
assert len(messages) == 1
|
||||
assert b"Single Message" in messages[0]
|
||||
|
||||
def test_task_stream_mbox_messages_malformed(self):
|
||||
"""Test streaming a malformed MBOX file."""
|
||||
# Content without proper From headers
|
||||
content = b"""Subject: Malformed Message
|
||||
From: sender@example.com
|
||||
To: recipient@example.com
|
||||
|
||||
This is a malformed message.
|
||||
"""
|
||||
file = BytesIO(content)
|
||||
message_positions, file_end = scan_mbox_messages(file)
|
||||
messages = list(stream_mbox_messages(file, message_positions, file_end))
|
||||
assert len(messages) == 0 # No valid messages should be found
|
||||
|
||||
def test_task_stream_mbox_messages_not_fully_loaded_in_memory(
|
||||
self, sample_mbox_content
|
||||
):
|
||||
"""Test that mbox processing uses a memory-efficient two-pass approach.
|
||||
|
||||
The workflow should:
|
||||
1. First pass (scan_mbox_messages): iterate line by line, no seek, only stores integers
|
||||
2. Second pass (stream_mbox_messages): seek to each position and read one message at a time
|
||||
|
||||
This test verifies the file is processed efficiently without loading all content.
|
||||
"""
|
||||
|
||||
class SpyFile:
|
||||
"""A file wrapper that tracks seek and read operations."""
|
||||
|
||||
def __init__(self, content: bytes):
|
||||
self._file = BytesIO(content)
|
||||
self.seek_calls = []
|
||||
self.read_calls = []
|
||||
self.readline_calls = []
|
||||
self.iter_count = 0
|
||||
|
||||
def __iter__(self):
|
||||
self.iter_count += 1
|
||||
return iter(self._file)
|
||||
|
||||
def seek(self, pos, *args):
|
||||
self.seek_calls.append(pos)
|
||||
return self._file.seek(pos, *args)
|
||||
|
||||
def read(self, size=-1):
|
||||
result = self._file.read(size)
|
||||
self.read_calls.append(len(result))
|
||||
return result
|
||||
|
||||
def readline(self):
|
||||
result = self._file.readline()
|
||||
self.readline_calls.append(len(result))
|
||||
return result
|
||||
|
||||
# Test scan_mbox_messages (first pass)
|
||||
scan_spy = SpyFile(sample_mbox_content)
|
||||
positions, file_end = scan_mbox_messages(scan_spy)
|
||||
|
||||
# Verify scan only iterates once and doesn't seek
|
||||
assert scan_spy.iter_count == 1, "scan_mbox_messages should iterate once"
|
||||
assert len(scan_spy.seek_calls) == 0, (
|
||||
"scan_mbox_messages should not seek - it only scans line by line"
|
||||
)
|
||||
assert len(positions) == 3, (
|
||||
f"Expected 3 message positions, got {len(positions)}"
|
||||
)
|
||||
|
||||
# Test stream_mbox_messages with pre-computed positions (second pass)
stream_spy = SpyFile(sample_mbox_content)
messages = list(stream_mbox_messages(stream_spy, positions, file_end))

# Verify we got all 3 messages
assert len(messages) == 3

# Verify stream didn't iterate (positions were pre-computed)
assert stream_spy.iter_count == 0, (
    "stream_mbox_messages should not iterate when positions are provided"
)

# Verify seek was called for each message
assert len(stream_spy.seek_calls) == 3, (
    f"Expected 3 seek calls (one per message), got {len(stream_spy.seek_calls)}. "
    "This suggests the file might be fully loaded into memory."
)

# Verify read was called for each message individually
assert len(stream_spy.read_calls) == 3, (
    f"Expected 3 read calls (one per message), got {len(stream_spy.read_calls)}. "
    "This suggests messages might be accumulated in memory."
)

# Verify readline was called for each message (to skip "From " separator)
assert len(stream_spy.readline_calls) == 3, (
    f"Expected 3 readline calls, got {len(stream_spy.readline_calls)}."
)

# Verify messages are still in correct order (oldest first for threading)
assert b"Test Message 3" in messages[0]
assert b"Test Message 2" in messages[1]
assert b"Test Message 1" in messages[2]
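The two-pass contract that these assertions pin down (one line-by-line scan that records byte offsets without seeking, then one seek, one readline, and one bounded read per message) can be sketched as follows. This is a toy reconstruction of the behavior the test exercises, not the importer's actual `scan_mbox_messages`/`stream_mbox_messages` implementation; real mbox framing details such as `>From ` quoting are ignored here.

```python
import io

def scan_mbox_messages(f):
    """First pass: iterate once, never seek; record the byte offset of
    every "From " separator line and return the offsets plus file end."""
    positions, offset = [], 0
    for line in f:
        if line.startswith(b"From "):
            positions.append(offset)
        offset += len(line)
    return positions, offset

def stream_mbox_messages(f, positions, file_end):
    """Second pass: per message, one seek(), one readline() to skip the
    "From " separator, and one read() bounded by the next offset, so no
    message body is accumulated beyond the one being yielded."""
    for start, end in zip(positions, positions[1:] + [file_end]):
        f.seek(start)
        separator = f.readline()
        yield f.read(end - start - len(separator))

mbox = (b"From a@example.com\nSubject: One\n\nBody 1\n"
        b"From b@example.com\nSubject: Two\n\nBody 2\n")
positions, file_end = scan_mbox_messages(io.BytesIO(mbox))
messages = list(stream_mbox_messages(io.BytesIO(mbox), positions, file_end))
```

Because the second pass only needs `seek`/`readline`/`read`, it works equally well over an in-memory buffer or a remote seekable source such as the `S3SeekableReader` described in the docs.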
# MIME validation is done upstream in service.py;
# the task just finds zero messages in invalid content
assert task_result["status"] == "SUCCESS"
assert task_result["result"]["total_messages"] == 0
assert Message.objects.count() == 0
finally:
    s3_client.delete_object(Bucket=storage.bucket_name, Key=file_key)
@@ -45,8 +45,14 @@ class TestWorkerQueueConfiguration:
        assert "core.mda.outbound_tasks.*" in routes
        assert routes["core.mda.outbound_tasks.*"]["queue"] == "outbound"

        assert "core.services.importer.tasks.*" in routes
        assert routes["core.services.importer.tasks.*"]["queue"] == "imports"
        assert "core.services.importer.mbox_tasks.*" in routes
        assert routes["core.services.importer.mbox_tasks.*"]["queue"] == "imports"
        assert "core.services.importer.eml_tasks.*" in routes
        assert routes["core.services.importer.eml_tasks.*"]["queue"] == "imports"
        assert "core.services.importer.imap_tasks.*" in routes
        assert routes["core.services.importer.imap_tasks.*"]["queue"] == "imports"
        assert "core.services.importer.pst_tasks.*" in routes
        assert routes["core.services.importer.pst_tasks.*"]["queue"] == "imports"

        assert "core.services.search.tasks.*" in routes
        assert routes["core.services.search.tasks.*"]["queue"] == "reindex"
@@ -624,6 +624,7 @@ class Base(Configuration):
    CELERY_RESULT_EXTENDED = True
    CELERY_TASK_RESULT_EXPIRES = 60 * 60 * 24 * 30  # 30 days
    CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"
    CELERY_WORKER_HIJACK_ROOT_LOGGER = False

    # Default queue for tasks without explicit routing
    CELERY_TASK_DEFAULT_QUEUE = "default"
@@ -636,7 +637,10 @@ class Base(Configuration):
        # Outbound email sending - high priority
        "core.mda.outbound_tasks.*": {"queue": "outbound"},
        # Import tasks - lower priority than regular tasks
        "core.services.importer.tasks.*": {"queue": "imports"},
        "core.services.importer.mbox_tasks.*": {"queue": "imports"},
        "core.services.importer.eml_tasks.*": {"queue": "imports"},
        "core.services.importer.imap_tasks.*": {"queue": "imports"},
        "core.services.importer.pst_tasks.*": {"queue": "imports"},
        # Search indexing - lowest priority, can be delayed
        "core.services.search.tasks.*": {"queue": "reindex"},
    }
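The routing table above maps glob patterns over fully qualified task names to queue names; Celery matches each task against these patterns when it is sent. The matching semantics can be illustrated with plain `fnmatch` (this is a stand-in for illustration, not Celery's own router; the `resolve_queue` helper is hypothetical):

```python
from fnmatch import fnmatch

# Mirrors the CELERY_TASK_ROUTES entries from the settings diff above.
TASK_ROUTES = {
    "core.mda.outbound_tasks.*": {"queue": "outbound"},
    "core.services.importer.tasks.*": {"queue": "imports"},
    "core.services.importer.mbox_tasks.*": {"queue": "imports"},
    "core.services.importer.pst_tasks.*": {"queue": "imports"},
    "core.services.search.tasks.*": {"queue": "reindex"},
}

def resolve_queue(task_name, routes=TASK_ROUTES, default="default"):
    """First matching glob pattern wins; unrouted tasks fall back to the
    default queue (CELERY_TASK_DEFAULT_QUEUE in the settings above)."""
    for pattern, route in routes.items():
        if fnmatch(task_name, pattern):
            return route["queue"]
    return default
```

For example, `resolve_queue("core.services.importer.pst_tasks.process_pst_file_task")` resolves to the `imports` queue, which is why adding the PST task module only required one new routing entry.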
@@ -855,6 +859,41 @@ class Base(Configuration):
            ),
            "propagate": False,
        },
        "botocore": {
            "handlers": ["console"],
            "level": "WARNING",
            "propagate": False,
        },
        "urllib3": {
            "handlers": ["console"],
            "level": "WARNING",
            "propagate": False,
        },
        "opensearch": {
            "handlers": ["console"],
            "level": "WARNING",
            "propagate": False,
        },
        "opensearchpy": {
            "handlers": ["console"],
            "level": "WARNING",
            "propagate": False,
        },
        "opensearchpy.trace": {
            "handlers": ["console"],
            "level": "WARNING",
            "propagate": False,
        },
        "elastic_transport": {
            "handlers": ["console"],
            "level": "WARNING",
            "propagate": False,
        },
        "flanker": {
            "handlers": ["console"],
            "level": "WARNING",
            "propagate": False,
        },
    },
}

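Each new entry silences a noisy third-party logger by capping it at WARNING and disabling propagation to the root logger. A trimmed, standalone version of that pattern, applied via `logging.config.dictConfig` (names and handler choice here are illustrative, not the project's full LOGGING dict):

```python
import logging
import logging.config

# Route chatty third-party loggers to the console handler at WARNING
# and stop propagation, so their DEBUG/INFO output never reaches root.
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {"console": {"class": "logging.StreamHandler"}},
    "loggers": {
        name: {"handlers": ["console"], "level": "WARNING", "propagate": False}
        for name in ("botocore", "urllib3", "opensearchpy", "flanker")
    },
}

logging.config.dictConfig(LOGGING)
# botocore now logs at WARNING and does not propagate to the root logger
```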
277 src/backend/poetry.lock (generated)
@@ -284,84 +284,101 @@ files = [

[[package]]
name = "cffi"
version = "1.17.1"
version = "2.0.0"
description = "Foreign Function Interface for Python calling C code."
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main"]
markers = "platform_python_implementation != \"PyPy\""
files = [
    {file = "cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14"},
    {file = "cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67"},
    {file = "cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382"},
    {file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702"},
    {file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3"},
    {file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6"},
    {file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17"},
    {file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8"},
    {file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e"},
    {file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be"},
    {file = "cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c"},
    {file = "cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15"},
    {file = "cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401"},
    {file = "cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf"},
    {file = "cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4"},
    {file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41"},
    {file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1"},
    {file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6"},
    {file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d"},
    {file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6"},
    {file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f"},
    {file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b"},
    {file = "cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655"},
    {file = "cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0"},
    {file = "cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4"},
    {file = "cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c"},
    {file = "cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36"},
    {file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5"},
    {file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff"},
    {file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99"},
    {file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93"},
    {file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3"},
    {file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8"},
    {file = "cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65"},
    {file = "cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903"},
    {file = "cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e"},
    {file = "cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2"},
    {file = "cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3"},
    {file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683"},
    {file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5"},
    {file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4"},
    {file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd"},
    {file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed"},
    {file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9"},
    {file = "cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d"},
    {file = "cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a"},
    {file = "cffi-1.17.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:636062ea65bd0195bc012fea9321aca499c0504409f413dc88af450b57ffd03b"},
    {file = "cffi-1.17.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7eac2ef9b63c79431bc4b25f1cd649d7f061a28808cbc6c47b534bd789ef964"},
    {file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e221cf152cff04059d011ee126477f0d9588303eb57e88923578ace7baad17f9"},
    {file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:31000ec67d4221a71bd3f67df918b1f88f676f1c3b535a7eb473255fdc0b83fc"},
    {file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6f17be4345073b0a7b8ea599688f692ac3ef23ce28e5df79c04de519dbc4912c"},
    {file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2b1fac190ae3ebfe37b979cc1ce69c81f4e4fe5746bb401dca63a9062cdaf1"},
    {file = "cffi-1.17.1-cp38-cp38-win32.whl", hash = "sha256:7596d6620d3fa590f677e9ee430df2958d2d6d6de2feeae5b20e82c00b76fbf8"},
    {file = "cffi-1.17.1-cp38-cp38-win_amd64.whl", hash = "sha256:78122be759c3f8a014ce010908ae03364d00a1f81ab5c7f4a7a5120607ea56e1"},
    {file = "cffi-1.17.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b2ab587605f4ba0bf81dc0cb08a41bd1c0a5906bd59243d56bad7668a6fc6c16"},
    {file = "cffi-1.17.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:28b16024becceed8c6dfbc75629e27788d8a3f9030691a1dbf9821a128b22c36"},
    {file = "cffi-1.17.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1d599671f396c4723d016dbddb72fe8e0397082b0a77a4fab8028923bec050e8"},
    {file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca74b8dbe6e8e8263c0ffd60277de77dcee6c837a3d0881d8c1ead7268c9e576"},
    {file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f7f5baafcc48261359e14bcd6d9bff6d4b28d9103847c9e136694cb0501aef87"},
    {file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98e3969bcff97cae1b2def8ba499ea3d6f31ddfdb7635374834cf89a1a08ecf0"},
    {file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cdf5ce3acdfd1661132f2a9c19cac174758dc2352bfe37d98aa7512c6b7178b3"},
    {file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9755e4345d1ec879e3849e62222a18c7174d65a6a92d5b346b1863912168b595"},
    {file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f1e22e8c4419538cb197e4dd60acc919d7696e5ef98ee4da4e01d3f8cfa4cc5a"},
    {file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c03e868a0b3bc35839ba98e74211ed2b05d2119be4e8a0f224fba9384f1fe02e"},
    {file = "cffi-1.17.1-cp39-cp39-win32.whl", hash = "sha256:e31ae45bc2e29f6b2abd0de1cc3b9d5205aa847cafaecb8af1476a609a2f6eb7"},
    {file = "cffi-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662"},
    {file = "cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824"},
    {file = "cffi-2.0.0-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:0cf2d91ecc3fcc0625c2c530fe004f82c110405f101548512cce44322fa8ac44"},
    {file = "cffi-2.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f73b96c41e3b2adedc34a7356e64c8eb96e03a3782b535e043a986276ce12a49"},
    {file = "cffi-2.0.0-cp310-cp310-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:53f77cbe57044e88bbd5ed26ac1d0514d2acf0591dd6bb02a3ae37f76811b80c"},
    {file = "cffi-2.0.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3e837e369566884707ddaf85fc1744b47575005c0a229de3327f8f9a20f4efeb"},
    {file = "cffi-2.0.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:5eda85d6d1879e692d546a078b44251cdd08dd1cfb98dfb77b670c97cee49ea0"},
    {file = "cffi-2.0.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9332088d75dc3241c702d852d4671613136d90fa6881da7d770a483fd05248b4"},
    {file = "cffi-2.0.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc7de24befaeae77ba923797c7c87834c73648a05a4bde34b3b7e5588973a453"},
    {file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:cf364028c016c03078a23b503f02058f1814320a56ad535686f90565636a9495"},
    {file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e11e82b744887154b182fd3e7e8512418446501191994dbf9c9fc1f32cc8efd5"},
    {file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8ea985900c5c95ce9db1745f7933eeef5d314f0565b27625d9a10ec9881e1bfb"},
    {file = "cffi-2.0.0-cp310-cp310-win32.whl", hash = "sha256:1f72fb8906754ac8a2cc3f9f5aaa298070652a0ffae577e0ea9bd480dc3c931a"},
    {file = "cffi-2.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:b18a3ed7d5b3bd8d9ef7a8cb226502c6bf8308df1525e1cc676c3680e7176739"},
    {file = "cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe"},
    {file = "cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c"},
    {file = "cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92"},
    {file = "cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93"},
    {file = "cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5"},
    {file = "cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664"},
    {file = "cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26"},
    {file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9"},
    {file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414"},
    {file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743"},
    {file = "cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5"},
    {file = "cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5"},
    {file = "cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d"},
    {file = "cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d"},
    {file = "cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c"},
    {file = "cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe"},
    {file = "cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062"},
    {file = "cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e"},
    {file = "cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037"},
    {file = "cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba"},
    {file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94"},
    {file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187"},
    {file = "cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18"},
    {file = "cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5"},
    {file = "cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6"},
    {file = "cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb"},
    {file = "cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca"},
    {file = "cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b"},
    {file = "cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b"},
    {file = "cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2"},
    {file = "cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3"},
    {file = "cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26"},
    {file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c"},
    {file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b"},
    {file = "cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27"},
    {file = "cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75"},
    {file = "cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91"},
    {file = "cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5"},
    {file = "cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13"},
    {file = "cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b"},
    {file = "cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c"},
    {file = "cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef"},
    {file = "cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775"},
    {file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205"},
    {file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1"},
    {file = "cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f"},
    {file = "cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25"},
    {file = "cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad"},
    {file = "cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9"},
    {file = "cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d"},
    {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c"},
    {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8"},
    {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc"},
    {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592"},
    {file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512"},
    {file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4"},
    {file = "cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e"},
    {file = "cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6"},
    {file = "cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9"},
    {file = "cffi-2.0.0-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:fe562eb1a64e67dd297ccc4f5addea2501664954f2692b69a76449ec7913ecbf"},
    {file = "cffi-2.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:de8dad4425a6ca6e4e5e297b27b5c824ecc7581910bf9aee86cb6835e6812aa7"},
    {file = "cffi-2.0.0-cp39-cp39-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:4647afc2f90d1ddd33441e5b0e85b16b12ddec4fca55f0d9671fef036ecca27c"},
    {file = "cffi-2.0.0-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3f4d46d8b35698056ec29bca21546e1551a205058ae1a181d871e278b0b28165"},
    {file = "cffi-2.0.0-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:e6e73b9e02893c764e7e8d5bb5ce277f1a009cd5243f8228f75f842bf937c534"},
    {file = "cffi-2.0.0-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:cb527a79772e5ef98fb1d700678fe031e353e765d1ca2d409c92263c6d43e09f"},
    {file = "cffi-2.0.0-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:61d028e90346df14fedc3d1e5441df818d095f3b87d286825dfcbd6459b7ef63"},
    {file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:0f6084a0ea23d05d20c3edcda20c3d006f9b6f3fefeac38f59262e10cef47ee2"},
    {file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:1cd13c99ce269b3ed80b417dcd591415d3372bcac067009b6e0f59c7d4015e65"},
    {file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:89472c9762729b5ae1ad974b777416bfda4ac5642423fa93bd57a09204712322"},
    {file = "cffi-2.0.0-cp39-cp39-win32.whl", hash = "sha256:2081580ebb843f759b9f617314a24ed5738c51d2aee65d31e02f6f7a2b97707a"},
    {file = "cffi-2.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:b882b3df248017dba09d6b16defe9b5c407fe32fc7c65a9c69798e6175601be9"},
    {file = "cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529"},
]

[package.dependencies]
pycparser = "*"
pycparser = {version = "*", markers = "implementation_name != \"PyPy\""}

[[package]]
name = "chardet"
@@ -667,62 +684,74 @@ test = ["pytest"]

[[package]]
name = "cryptography"
version = "45.0.5"
version = "46.0.5"
description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
optional = false
python-versions = "!=3.9.0,!=3.9.1,>=3.7"
python-versions = "!=3.9.0,!=3.9.1,>=3.8"
groups = ["main"]
files = [
    {file = "cryptography-45.0.5-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:101ee65078f6dd3e5a028d4f19c07ffa4dd22cce6a20eaa160f8b5219911e7d8"},
    {file = "cryptography-45.0.5-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3a264aae5f7fbb089dbc01e0242d3b67dffe3e6292e1f5182122bdf58e65215d"},
    {file = "cryptography-45.0.5-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:e74d30ec9c7cb2f404af331d5b4099a9b322a8a6b25c4632755c8757345baac5"},
    {file = "cryptography-45.0.5-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:3af26738f2db354aafe492fb3869e955b12b2ef2e16908c8b9cb928128d42c57"},
    {file = "cryptography-45.0.5-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e6c00130ed423201c5bc5544c23359141660b07999ad82e34e7bb8f882bb78e0"},
    {file = "cryptography-45.0.5-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:dd420e577921c8c2d31289536c386aaa30140b473835e97f83bc71ea9d2baf2d"},
    {file = "cryptography-45.0.5-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:d05a38884db2ba215218745f0781775806bde4f32e07b135348355fe8e4991d9"},
    {file = "cryptography-45.0.5-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:ad0caded895a00261a5b4aa9af828baede54638754b51955a0ac75576b831b27"},
    {file = "cryptography-45.0.5-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:9024beb59aca9d31d36fcdc1604dd9bbeed0a55bface9f1908df19178e2f116e"},
    {file = "cryptography-45.0.5-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:91098f02ca81579c85f66df8a588c78f331ca19089763d733e34ad359f474174"},
    {file = "cryptography-45.0.5-cp311-abi3-win32.whl", hash = "sha256:926c3ea71a6043921050eaa639137e13dbe7b4ab25800932a8498364fc1abec9"},
    {file = "cryptography-45.0.5-cp311-abi3-win_amd64.whl", hash = "sha256:b85980d1e345fe769cfc57c57db2b59cff5464ee0c045d52c0df087e926fbe63"},
    {file = "cryptography-45.0.5-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:f3562c2f23c612f2e4a6964a61d942f891d29ee320edb62ff48ffb99f3de9ae8"},
    {file = "cryptography-45.0.5-cp37-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3fcfbefc4a7f332dece7272a88e410f611e79458fab97b5efe14e54fe476f4fd"},
    {file = "cryptography-45.0.5-cp37-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:460f8c39ba66af7db0545a8c6f2eabcbc5a5528fc1cf6c3fa9a1e44cec33385e"},
    {file = "cryptography-45.0.5-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:9b4cf6318915dccfe218e69bbec417fdd7c7185aa7aab139a2c0beb7468c89f0"},
    {file = "cryptography-45.0.5-cp37-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2089cc8f70a6e454601525e5bf2779e665d7865af002a5dec8d14e561002e135"},
    {file = "cryptography-45.0.5-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:0027d566d65a38497bc37e0dd7c2f8ceda73597d2ac9ba93810204f56f52ebc7"},
    {file = "cryptography-45.0.5-cp37-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:be97d3a19c16a9be00edf79dca949c8fa7eff621763666a145f9f9535a5d7f42"},
    {file = "cryptography-45.0.5-cp37-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:7760c1c2e1a7084153a0f68fab76e754083b126a47d0117c9ed15e69e2103492"},
    {file = "cryptography-45.0.5-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:6ff8728d8d890b3dda5765276d1bc6fb099252915a2cd3aff960c4c195745dd0"},
    {file = "cryptography-45.0.5-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:7259038202a47fdecee7e62e0fd0b0738b6daa335354396c6ddebdbe1206af2a"},
    {file = "cryptography-45.0.5-cp37-abi3-win32.whl", hash = "sha256:1e1da5accc0c750056c556a93c3e9cb828970206c68867712ca5805e46dc806f"},
    {file = "cryptography-45.0.5-cp37-abi3-win_amd64.whl", hash = "sha256:90cb0a7bb35959f37e23303b7eed0a32280510030daba3f7fdfbb65defde6a97"},
    {file = "cryptography-45.0.5-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:206210d03c1193f4e1ff681d22885181d47efa1ab3018766a7b32a7b3d6e6afd"},
    {file = "cryptography-45.0.5-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:c648025b6840fe62e57107e0a25f604db740e728bd67da4f6f060f03017d5097"},
    {file = "cryptography-45.0.5-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:b8fa8b0a35a9982a3c60ec79905ba5bb090fc0b9addcfd3dc2dd04267e45f25e"},
    {file = "cryptography-45.0.5-pp310-pypy310_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:14d96584701a887763384f3c47f0ca7c1cce322aa1c31172680eb596b890ec30"},
    {file = "cryptography-45.0.5-pp310-pypy310_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:57c816dfbd1659a367831baca4b775b2a5b43c003daf52e9d57e1d30bc2e1b0e"},
    {file = "cryptography-45.0.5-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:b9e38e0a83cd51e07f5a48ff9691cae95a79bea28fe4ded168a8e5c6c77e819d"},
    {file = "cryptography-45.0.5-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:8c4a6ff8a30e9e3d38ac0539e9a9e02540ab3f827a3394f8852432f6b0ea152e"},
    {file = "cryptography-45.0.5-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:bd4c45986472694e5121084c6ebbd112aa919a25e783b87eb95953c9573906d6"},
    {file = "cryptography-45.0.5-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:982518cd64c54fcada9d7e5cf28eabd3ee76bd03ab18e08a48cad7e8b6f31b18"},
|
||||
{file = "cryptography-45.0.5-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:12e55281d993a793b0e883066f590c1ae1e802e3acb67f8b442e721e475e6463"},
|
||||
{file = "cryptography-45.0.5-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:5aa1e32983d4443e310f726ee4b071ab7569f58eedfdd65e9675484a4eb67bd1"},
|
||||
{file = "cryptography-45.0.5-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:e357286c1b76403dd384d938f93c46b2b058ed4dfcdce64a770f0537ed3feb6f"},
|
||||
{file = "cryptography-45.0.5.tar.gz", hash = "sha256:72e76caa004ab63accdf26023fccd1d087f6d90ec6048ff33ad0445abf7f605a"},
|
||||
{file = "cryptography-46.0.5-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:351695ada9ea9618b3500b490ad54c739860883df6c1f555e088eaf25b1bbaad"},
{file = "cryptography-46.0.5-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c18ff11e86df2e28854939acde2d003f7984f721eba450b56a200ad90eeb0e6b"},
{file = "cryptography-46.0.5-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:4d7e3d356b8cd4ea5aff04f129d5f66ebdc7b6f8eae802b93739ed520c47c79b"},
{file = "cryptography-46.0.5-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:50bfb6925eff619c9c023b967d5b77a54e04256c4281b0e21336a130cd7fc263"},
{file = "cryptography-46.0.5-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:803812e111e75d1aa73690d2facc295eaefd4439be1023fefc4995eaea2af90d"},
{file = "cryptography-46.0.5-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:3ee190460e2fbe447175cda91b88b84ae8322a104fc27766ad09428754a618ed"},
{file = "cryptography-46.0.5-cp311-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:f145bba11b878005c496e93e257c1e88f154d278d2638e6450d17e0f31e558d2"},
{file = "cryptography-46.0.5-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:e9251e3be159d1020c4030bd2e5f84d6a43fe54b6c19c12f51cde9542a2817b2"},
{file = "cryptography-46.0.5-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:47fb8a66058b80e509c47118ef8a75d14c455e81ac369050f20ba0d23e77fee0"},
{file = "cryptography-46.0.5-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:4c3341037c136030cb46e4b1e17b7418ea4cbd9dd207e4a6f3b2b24e0d4ac731"},
{file = "cryptography-46.0.5-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:890bcb4abd5a2d3f852196437129eb3667d62630333aacc13dfd470fad3aaa82"},
{file = "cryptography-46.0.5-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:80a8d7bfdf38f87ca30a5391c0c9ce4ed2926918e017c29ddf643d0ed2778ea1"},
{file = "cryptography-46.0.5-cp311-abi3-win32.whl", hash = "sha256:60ee7e19e95104d4c03871d7d7dfb3d22ef8a9b9c6778c94e1c8fcc8365afd48"},
{file = "cryptography-46.0.5-cp311-abi3-win_amd64.whl", hash = "sha256:38946c54b16c885c72c4f59846be9743d699eee2b69b6988e0a00a01f46a61a4"},
{file = "cryptography-46.0.5-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:94a76daa32eb78d61339aff7952ea819b1734b46f73646a07decb40e5b3448e2"},
{file = "cryptography-46.0.5-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:5be7bf2fb40769e05739dd0046e7b26f9d4670badc7b032d6ce4db64dddc0678"},
{file = "cryptography-46.0.5-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fe346b143ff9685e40192a4960938545c699054ba11d4f9029f94751e3f71d87"},
{file = "cryptography-46.0.5-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:c69fd885df7d089548a42d5ec05be26050ebcd2283d89b3d30676eb32ff87dee"},
{file = "cryptography-46.0.5-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:8293f3dea7fc929ef7240796ba231413afa7b68ce38fd21da2995549f5961981"},
{file = "cryptography-46.0.5-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:1abfdb89b41c3be0365328a410baa9df3ff8a9110fb75e7b52e66803ddabc9a9"},
{file = "cryptography-46.0.5-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:d66e421495fdb797610a08f43b05269e0a5ea7f5e652a89bfd5a7d3c1dee3648"},
{file = "cryptography-46.0.5-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:4e817a8920bfbcff8940ecfd60f23d01836408242b30f1a708d93198393a80b4"},
{file = "cryptography-46.0.5-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:68f68d13f2e1cb95163fa3b4db4bf9a159a418f5f6e7242564fc75fcae667fd0"},
{file = "cryptography-46.0.5-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:a3d1fae9863299076f05cb8a778c467578262fae09f9dc0ee9b12eb4268ce663"},
{file = "cryptography-46.0.5-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:c4143987a42a2397f2fc3b4d7e3a7d313fbe684f67ff443999e803dd75a76826"},
{file = "cryptography-46.0.5-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:7d731d4b107030987fd61a7f8ab512b25b53cef8f233a97379ede116f30eb67d"},
{file = "cryptography-46.0.5-cp314-cp314t-win32.whl", hash = "sha256:c3bcce8521d785d510b2aad26ae2c966092b7daa8f45dd8f44734a104dc0bc1a"},
{file = "cryptography-46.0.5-cp314-cp314t-win_amd64.whl", hash = "sha256:4d8ae8659ab18c65ced284993c2265910f6c9e650189d4e3f68445ef82a810e4"},
{file = "cryptography-46.0.5-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:4108d4c09fbbf2789d0c926eb4152ae1760d5a2d97612b92d508d96c861e4d31"},
{file = "cryptography-46.0.5-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7d1f30a86d2757199cb2d56e48cce14deddf1f9c95f1ef1b64ee91ea43fe2e18"},
{file = "cryptography-46.0.5-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:039917b0dc418bb9f6edce8a906572d69e74bd330b0b3fea4f79dab7f8ddd235"},
{file = "cryptography-46.0.5-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:ba2a27ff02f48193fc4daeadf8ad2590516fa3d0adeeb34336b96f7fa64c1e3a"},
{file = "cryptography-46.0.5-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:61aa400dce22cb001a98014f647dc21cda08f7915ceb95df0c9eaf84b4b6af76"},
{file = "cryptography-46.0.5-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:3ce58ba46e1bc2aac4f7d9290223cead56743fa6ab94a5d53292ffaac6a91614"},
{file = "cryptography-46.0.5-cp38-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:420d0e909050490d04359e7fdb5ed7e667ca5c3c402b809ae2563d7e66a92229"},
{file = "cryptography-46.0.5-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:582f5fcd2afa31622f317f80426a027f30dc792e9c80ffee87b993200ea115f1"},
{file = "cryptography-46.0.5-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:bfd56bb4b37ed4f330b82402f6f435845a5f5648edf1ad497da51a8452d5d62d"},
{file = "cryptography-46.0.5-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:a3d507bb6a513ca96ba84443226af944b0f7f47dcc9a399d110cd6146481d24c"},
{file = "cryptography-46.0.5-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:9f16fbdf4da055efb21c22d81b89f155f02ba420558db21288b3d0035bafd5f4"},
{file = "cryptography-46.0.5-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:ced80795227d70549a411a4ab66e8ce307899fad2220ce5ab2f296e687eacde9"},
{file = "cryptography-46.0.5-cp38-abi3-win32.whl", hash = "sha256:02f547fce831f5096c9a567fd41bc12ca8f11df260959ecc7c3202555cc47a72"},
{file = "cryptography-46.0.5-cp38-abi3-win_amd64.whl", hash = "sha256:556e106ee01aa13484ce9b0239bca667be5004efb0aabbed28d353df86445595"},
{file = "cryptography-46.0.5-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:3b4995dc971c9fb83c25aa44cf45f02ba86f71ee600d81091c2f0cbae116b06c"},
{file = "cryptography-46.0.5-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:bc84e875994c3b445871ea7181d424588171efec3e185dced958dad9e001950a"},
{file = "cryptography-46.0.5-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:2ae6971afd6246710480e3f15824ed3029a60fc16991db250034efd0b9fb4356"},
{file = "cryptography-46.0.5-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:d861ee9e76ace6cf36a6a89b959ec08e7bc2493ee39d07ffe5acb23ef46d27da"},
{file = "cryptography-46.0.5-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:2b7a67c9cd56372f3249b39699f2ad479f6991e62ea15800973b956f4b73e257"},
{file = "cryptography-46.0.5-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:8456928655f856c6e1533ff59d5be76578a7157224dbd9ce6872f25055ab9ab7"},
{file = "cryptography-46.0.5.tar.gz", hash = "sha256:abace499247268e3757271b2f1e244b36b06f8515cf27c4d49468fc9eb16e93d"},
]

[package.dependencies]
cffi = {version = ">=1.14", markers = "platform_python_implementation != \"PyPy\""}
cffi = {version = ">=2.0.0", markers = "python_full_version >= \"3.9.0\" and platform_python_implementation != \"PyPy\""}

[package.extras]
docs = ["sphinx (>=5.3.0)", "sphinx-inline-tabs ; python_full_version >= \"3.8.0\"", "sphinx-rtd-theme (>=3.0.0) ; python_full_version >= \"3.8.0\""]
docs = ["sphinx (>=5.3.0)", "sphinx-inline-tabs", "sphinx-rtd-theme (>=3.0.0)"]
docstest = ["pyenchant (>=3)", "readme-renderer (>=30.0)", "sphinxcontrib-spelling (>=7.3.1)"]
nox = ["nox (>=2024.4.15)", "nox[uv] (>=2024.3.2) ; python_full_version >= \"3.8.0\""]
pep8test = ["check-sdist ; python_full_version >= \"3.8.0\"", "click (>=8.0.1)", "mypy (>=1.4)", "ruff (>=0.3.6)"]
nox = ["nox[uv] (>=2024.4.15)"]
pep8test = ["check-sdist", "click (>=8.0.1)", "mypy (>=1.14)", "ruff (>=0.11.11)"]
sdist = ["build (>=1.0.0)"]
ssh = ["bcrypt (>=3.1.5)"]
test = ["certifi (>=2024)", "cryptography-vectors (==45.0.5)", "pretend (>=0.7)", "pytest (>=7.4.0)", "pytest-benchmark (>=4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]
test = ["certifi (>=2024)", "cryptography-vectors (==46.0.5)", "pretend (>=0.7)", "pytest (>=7.4.0)", "pytest-benchmark (>=4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]
test-randomorder = ["pytest-randomly"]

[[package]]
@@ -1750,6 +1779,22 @@ files = [
{file = "legacy_cgi-2.6.3.tar.gz", hash = "sha256:4c119d6cb8e9d8b6ad7cc0ddad880552c62df4029622835d06dfd18f438a8154"},
]

[[package]]
name = "libpff-python"
version = "20231205"
description = "Python bindings module for libpff"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "libpff-python-20231205.tar.gz", hash = "sha256:06c218be51321b16dc3b835185ee1cd2fa5c2a1ca856e0390c1d6e4ddf329250"},
{file = "libpff_python-20231205-cp310-cp310-macosx_12_0_x86_64.whl", hash = "sha256:3c296ba12e1ca03571d34e0609331d90cc7a4289605f04a45e29394a5c2973f3"},
{file = "libpff_python-20231205-cp311-cp311-macosx_12_0_x86_64.whl", hash = "sha256:2dfa98dacb5f5a754b5a9f97aba2faf45fe4349632cf2921f4b683140354b037"},
{file = "libpff_python-20231205-cp312-cp312-macosx_12_0_x86_64.whl", hash = "sha256:aee688ca5063abd35a067cf36d949e0b3623ce3a61135e3dad76068c3b647635"},
{file = "libpff_python-20231205-cp38-cp38-macosx_12_0_x86_64.whl", hash = "sha256:b8f001558c78ef0e6e57ecea9a206879b815fbbc1caf91def219af754b0ad2f0"},
{file = "libpff_python-20231205-cp39-cp39-macosx_12_0_x86_64.whl", hash = "sha256:f6bd1fb0386feeaa282c686e16d010cd832117b2524ccbf496e189e3ac4a4351"},
]

[[package]]
name = "license-expression"
version = "30.4.4"
@@ -2329,7 +2374,7 @@ description = "C parser in Python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
markers = "platform_python_implementation != \"PyPy\""
markers = "platform_python_implementation != \"PyPy\" and implementation_name != \"PyPy\""
files = [
{file = "pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc"},
{file = "pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6"},
@@ -3717,4 +3762,4 @@ dev = ["django-extensions", "drf-spectacular-sidecar", "flower", "hypothesis", "
[metadata]
lock-version = "2.1"
python-versions = ">=3.13,<4.0"
content-hash = "0867f092bfcf8edb9a634db6a1f784447dd8bbf903ec5f41af9f3b8e3c91e11a"
content-hash = "6ad84b1173ae0d81ce9e7a44e5a7a0319a5661d5e5783d0cf30484f90da5c917"

@@ -27,7 +27,7 @@ dependencies = [
"boto3==1.40.43",
"botocore==1.40.43",
"celery[redis]==5.5.2",
"cryptography==45.0.5",
"cryptography==46.0.5",
"dj-database-url==2.3.0",
"django==5.1.15",
"django-celery-beat==2.8.0",
@@ -62,6 +62,7 @@ dependencies = [
"redis==5.2.1",
"requests==2.32.5",
"sentry-sdk[django]==2.27.0",
"libpff-python==20231205",
"url-normalize==1.4.3",
"whitenoise==6.8.2",
"prometheus-client==0.22.1",

@@ -9,6 +9,8 @@
"{{count}} hours ago_other": "{{count}} hours ago",
"{{count}} messages_one": "{{count}} message",
"{{count}} messages_other": "{{count}} messages",
"{{count}} messages imported_one": "{{count}} message imported",
"{{count}} messages imported_other": "{{count}} messages imported",
"{{count}} messages have been archived._one": "The message has been archived.",
"{{count}} messages have been archived._other": "{{count}} messages have been archived.",
"{{count}} messages have been deleted._one": "The message has been deleted.",
@@ -204,7 +206,7 @@
"Edit template \"{{template}}\"": "Edit template \"{{template}}\"",
"Edit Widget": "Edit Widget",
"Email address": "Email address",
"EML or MBOX": "EML or MBOX",
"EML, MBOX or PST": "EML, MBOX or PST",
"Enter the email addresses of the recipients separated by commas": "Enter the email addresses of the recipients separated by commas",
"Error while checking DNS records": "Error while checking DNS records",
"Error while loading addresses": "Error while loading addresses",

@@ -14,6 +14,9 @@
"{{count}} messages_one": "{{count}} message",
"{{count}} messages_many": "{{count}} messages",
"{{count}} messages_other": "{{count}} messages",
"{{count}} messages imported_one": "{{count}} message importé",
"{{count}} messages imported_many": "{{count}} messages importés",
"{{count}} messages imported_other": "{{count}} messages importés",
"{{count}} messages have been archived._one": "Le message a été archivé.",
"{{count}} messages have been archived._many": "{{count}} messages ont été archivés.",
"{{count}} messages have been archived._other": "{{count}} messages ont été archivés.",
@@ -230,7 +233,7 @@
"Edit template \"{{template}}\"": "Modifier le modèle \"{{template}}\"",
"Edit Widget": "Modifier le Widget",
"Email address": "Adresse mail",
"EML or MBOX": "EML ou MBOX",
"EML, MBOX or PST": "EML, MBOX ou PST",
"Enter the email addresses of the recipients separated by commas": "Entrez les adresses e-mail des destinataires séparés par des virgules",
"Error while checking DNS records": "Erreur lors de la vérification des enregistrements DNS",
"Error while loading addresses": "Erreur lors du chargement des adresses",

@@ -7,6 +7,8 @@
"{{count}} hours ago_other": "{{count}} uur geleden",
"{{count}} messages_one": "{{count}} bericht",
"{{count}} messages_other": "{{count}} berichten",
"{{count}} messages imported_one": "{{count}} bericht geïmporteerd",
"{{count}} messages imported_other": "{{count}} berichten geïmporteerd",
"{{count}} messages have been archived._one": "Het bericht is gearchiveerd.",
"{{count}} messages have been archived._other": "{{count}} berichten zijn gearchiveerd.",
"{{count}} messages have been deleted._one": "Het bericht is verwijderd.",
@@ -167,7 +169,7 @@
"Edit {{mailbox}} address": "Bewerk {{mailbox}} adres",
"Edit template \"{{template}}\"": "Sjabloon \"{{template}} \" verwijderen",
"Email address": "E-mail adres",
"EML or MBOX": "EML of MBOX",
"EML, MBOX or PST": "EML, MBOX of PST",
"Enter the email addresses of the recipients separated by commas": "Voer de e-mailadressen in van de geadresseerden gescheiden door komma's",
"Error while checking DNS records": "Fout bij het controleren van DNS records",
"Error while loading addresses": "Fout bij het laden van adressen",

@@ -8,6 +8,8 @@
"{{count}} hours ago_one": "{{count}} ч. назад",
"{{count}} hours ago_other": "{{count}} ч. назад",
"{{count}} messages_one": "{{count}} сообщение",
"{{count}} messages imported_one": "{{count}} сообщение импортировано",
"{{count}} messages imported_other": "{{count}} сообщений импортировано",
"{{count}} messages_other": "{{count}} сообщений",
"{{count}} messages have been archived._one": "Сообщение архивировано.",
"{{count}} messages have been archived._other": "Сообщения архивированы ({{count}}).",
@@ -196,7 +198,7 @@
"Edit template \"{{template}}\"": "Изменить шаблон \"{{template}}\"",
"Edit Widget": "Настроить виджет",
"Email address": "Адрес электронной почты",
"EML or MBOX": "EML или MBOX",
"EML, MBOX or PST": "EML, MBOX или PST",
"Enter the email addresses of the recipients separated by commas": "Введите адреса электронной почты получателей, разделённые запятыми",
"Error while checking DNS records": "Ошибка при проверке записей DNS",
"Error while loading addresses": "Ошибка при загрузке адресов",

@@ -8,6 +8,8 @@
"{{count}} hours ago_one": "{{count}} год. тому",
"{{count}} hours ago_other": "{{count}} год. тому",
"{{count}} messages_one": "{{count}} повідомлення",
"{{count}} messages imported_one": "{{count}} повідомлення імпортовано",
"{{count}} messages imported_other": "{{count}} повідомлень імпортовано",
"{{count}} messages_other": "{{count}} повідомлень",
"{{count}} messages have been archived._one": "Повідомлення архівовано.",
"{{count}} messages have been archived._other": "Повідомлення архівовані ({{count}}).",
@@ -196,7 +198,7 @@
"Edit template \"{{template}}\"": "Редагувати шаблон \"{{template}}\"",
"Edit Widget": "Редагувати віджет",
"Email address": "Адреса ел. пошти",
"EML or MBOX": "EML або MBOX",
"EML, MBOX or PST": "EML, MBOX або PST",
"Enter the email addresses of the recipients separated by commas": "Введіть електронні адреси, розділені комами",
"Error while checking DNS records": "Помилка під час перевірки записів DNS",
"Error while loading addresses": "Помилка при завантаженні адрес",

@@ -26,10 +26,10 @@ type SecondParameter<T extends (...args: never) => unknown> = Parameters<T>[1];

/**
 *
Import messages by uploading an EML or MBOX file.
Import messages by uploading an EML, MBOX, or PST file.

The import is processed asynchronously and returns a task ID for tracking.
The file must be a valid EML or MBOX format. The recipient mailbox must exist
The file must be a valid EML, MBOX, or PST format. The recipient mailbox must exist
and the user must have access to it.

 */

@@ -9,6 +9,6 @@
export type ImportFileCreate202 = {
/** Task ID for tracking the import */
task_id?: string;
/** Type of import (eml or mbox) */
/** Type of import (eml, mbox, or pst) */
type?: string;
};

@@ -6,7 +6,7 @@ import { StepForm } from "./step-form";
import { StepLoader } from "./step-loader";
import { StepCompleted } from "./step-completed";
import clsx from "clsx";
import { useEffect, useState } from "react";
import { useEffect, useRef, useState } from "react";
import { TaskImportCacheHelper } from "@/features/utils/task-import-cache";


@@ -31,6 +31,30 @@ export const ModalMessageImporter = () => {
const [step, setStep] = useState<IMPORT_STEP>(taskId ? 'importing' : 'idle');
const [error, setError] = useState<string | null>(null);
const { closeModal } = useModalStore();

// Track Alt key for force-reset on alt+close
const altKeyRef = useRef(false);
useEffect(() => {
const onKey = (e: KeyboardEvent) => { altKeyRef.current = e.altKey; };
const onBlur = () => { altKeyRef.current = false; };
window.addEventListener('keydown', onKey);
window.addEventListener('keyup', onKey);
window.addEventListener('blur', onBlur);
return () => {
window.removeEventListener('keydown', onKey);
window.removeEventListener('keyup', onKey);
window.removeEventListener('blur', onBlur);
};
}, []);

const handleClose = () => {
if (altKeyRef.current && step === 'importing') {
taskImportCacheHelper.remove();
setTaskId(null);
setStep('idle');
}
};

const handleCompletedStepClose = () => {
closeModal(MODAL_MESSAGE_IMPORTER_ID);
}
@@ -96,6 +120,7 @@ export const ModalMessageImporter = () => {
aria-label={t('Import your old messages in {{mailbox}}', { mailbox: selectedMailbox.email })}
modalId={MODAL_MESSAGE_IMPORTER_ID}
size={ModalSize.LARGE}
onClose={handleClose}
confirmFn={step !== 'uploading' ? undefined : handleConfirmCloseModal}
>
<div className="modal-importer">

@@ -276,11 +276,11 @@ export const StepForm = ({ onUploading, onSuccess, onError, error, step }: StepF
<div className="form-field-row archive_file_field">
<RhfFileUploader
name="archive_file"
accept=".eml,.mbox"
accept=".eml,.mbox,.pst"
icon={<span className="material-icons">inventory_2</span>}
fileSelectedIcon={<span className="material-icons">inventory_2</span>}
bigText={t('Drag and drop an archive here')}
text={t('EML or MBOX')}
text={t('EML, MBOX or PST')}
fullWidth
/>
{[BucketUploadState.INITIATING, BucketUploadState.IMPORTING, BucketUploadState.COMPLETING, BucketUploadState.COMPLETED].includes(bucketUploadManager.state) && (

@@ -2,7 +2,7 @@ import { StatusEnum } from "@/features/api/gen";
import ProgressBar from "@/features/ui/components/progress-bar";
import { useImportTaskStatus } from "@/hooks/use-import-task";
import { Spinner } from "@gouvfr-lasuite/ui-kit";
import { useEffect } from "react";
import { useEffect, useRef } from "react";
import { useTranslation } from "react-i18next";

type StepLoaderProps = {
@@ -13,21 +13,40 @@ type StepLoaderProps = {

export type TaskMetadata = {
current_message: number;
total_messages: number;
total_messages: number | null;
failure_count: number;
success_count: number;
message_status: string;
type: "string";
type: string;

}

const renderProgressText = (
t: ReturnType<typeof useTranslation>['t'],
importStatus: NonNullable<ReturnType<typeof useImportTaskStatus>>
) => {
if (importStatus.progress !== null && importStatus.progress > 0) {
return <p>{t('{{progress}}% imported', { progress: importStatus.progress })}</p>;
}
if (importStatus.currentMessage > 0 && !importStatus.hasKnownTotal) {
return <p>{t('{{count}} messages imported', { count: importStatus.currentMessage })}</p>;
}
return null;
};

export const StepLoader = ({ taskId, onComplete, onError }: StepLoaderProps) => {
const { t } = useTranslation();
const importStatus = useImportTaskStatus(taskId)!;

// Use refs to avoid stale closures without requiring stable callback props
const onCompleteRef = useRef(onComplete);
onCompleteRef.current = onComplete;
const onErrorRef = useRef(onError);
onErrorRef.current = onError;

useEffect(() => {
if (importStatus?.state === StatusEnum.SUCCESS) {
onComplete();
onCompleteRef.current();
} else if (importStatus?.state === StatusEnum.FAILURE) {
const error = importStatus?.error || '';
let errorKey = t('An error occurred while importing messages.');
@@ -37,16 +56,16 @@ export const StepLoader = ({ taskId, onComplete, onError }: StepLoaderProps) =>
) {
errorKey = t('Authentication failed. Please check your credentials and ensure you have enabled IMAP connections in your account.');
}
onError(errorKey);
onErrorRef.current(errorKey);
}
}, [importStatus?.state]);
}, [importStatus?.state, t]);

return (
<div className="task-loader">
<Spinner size="lg" />
<div className="task-loader__progress_resume">
<p>{t('Importing...')}</p>
{importStatus.progress > 0&& t('{{progress}}% imported', { progress: importStatus.progress })}
{renderProgressText(t, importStatus)}
</div>
<ProgressBar progress={importStatus.progress} />
{importStatus.state === StatusEnum.PROGRESS && <p>{t('You can close this window and continue using the app.')}</p>}

@@ -106,7 +106,7 @@ const ApplicationMenu = () => {
if (taskStatus) {
if (taskStatus.state === StatusEnum.PROGRESS) {
label = t("Importing messages...");
if (taskStatus.loading) icon = <CircularProgress loading />;
if (taskStatus.loading || taskStatus.progress === null) icon = <CircularProgress loading />;
else icon = <CircularProgress progress={taskStatus.progress} withLabel />;
}
if (taskStatus.state === StatusEnum.SUCCESS) {

@@ -35,6 +35,13 @@
animation: progress-stripes 1s linear infinite;
border-radius: 50vw;
}

&--indeterminate {
.progress-bar__progress {
width: 30% !important;
animation: progress-indeterminate 1.5s ease-in-out infinite, progress-stripes 1s linear infinite;
}
}
}

@keyframes progress-stripes {
@@ -45,3 +52,25 @@
background-position: 40px 0;
}
}

@keyframes progress-indeterminate {
0% {
margin-left: -30%;
}
100% {
margin-left: 100%;
}
}

@media (prefers-reduced-motion: reduce) {
.progress-bar,
.progress-bar__progress {
animation: none;
}
.progress-bar--indeterminate .progress-bar__progress {
animation: none;
width: 100% !important;
margin-left: 0;
opacity: 0.5;
}
}

@@ -1,11 +1,16 @@
type ProgressBarProps = {
progress: number;
progress: number | null;
indeterminate?: boolean;
}

const ProgressBar = ({ progress }: ProgressBarProps) => {
const ProgressBar = ({ progress, indeterminate }: ProgressBarProps) => {
const isIndeterminate = indeterminate || progress === null;
return (
<div className="progress-bar">
<div className="progress-bar__progress" style={{ width: `${progress}%` }} />
<div className={`progress-bar${isIndeterminate ? ' progress-bar--indeterminate' : ''}`}>
<div
className="progress-bar__progress"
style={{ width: isIndeterminate ? '100%' : `${progress}%` }}
/>
</div>
)
}

@@ -23,13 +23,17 @@ export function useImportTaskStatus(
const taskStatus = taskQuery.data?.data.status;
const taskMetadata = taskQuery.data?.data.result as TaskMetadata | undefined;

const hasKnownTotal = taskMetadata?.total_messages != null && taskMetadata.total_messages > 0;
const currentMessage = taskMetadata?.current_message ?? 0;

const progress = useMemo(() => {
if (taskStatus === StatusEnum.SUCCESS) return 100;
if (taskStatus && taskStatus !== StatusEnum.PROGRESS) return 0;
if (!hasKnownTotal) return null;
if (!taskMetadata?.success_count || !taskMetadata.total_messages)
return null;
return (taskMetadata.success_count / taskMetadata.total_messages) * 100;
}, [taskStatus, taskMetadata]);
}, [taskStatus, taskMetadata, hasKnownTotal]);

useEffect(() => {
if (!enabled || taskStatus === StatusEnum.FAILURE || taskStatus === StatusEnum.SUCCESS) {
@@ -41,9 +45,11 @@ export function useImportTaskStatus(

if (!taskId) return null;
return {
progress: Math.ceil(progress || 0),
progress: progress !== null ? Math.ceil(progress) : null,
state: taskQuery.data?.data.status,
loading: taskQuery.isPending || !progress,
loading: taskQuery.isPending || progress === null,
error: taskQuery.data?.data.error,
hasKnownTotal,
currentMessage,
};
}
