# User Film Import, Navbar & Notifications

**Date:** 2026-03-29
**Status:** Approved

## Overview

Add a navbar for authenticated users with a user dropdown (import films, logout) and a notifications dropdown (with an unread-count badge and page-title update). Users can import their Letterboxd CSV to sync films and actors via async processing, with files stored on a remote SeaweedFS instance.

## Data Model

### New Entities

**`UserMovie`** — join table User <-> Movie

- `id` (int, PK)
- `user` (ManyToOne -> User)
- `movie` (ManyToOne -> Movie)
- Unique constraint on `(user, movie)`

**`Import`** — tracks a CSV import job

- `id` (int, PK)
- `user` (ManyToOne -> User)
- `filePath` (string) — path on SeaweedFS
- `status` (string, enum: `pending`, `processing`, `completed`, `failed`)
- `totalBatches` (int, default 0)
- `processedBatches` (int, default 0)
- `totalFilms` (int, default 0)
- `failedFilms` (int, default 0)
- `createdAt` (datetime)
- `completedAt` (datetime, nullable)

**`Notification`** — user notifications

- `id` (int, PK)
- `user` (ManyToOne -> User)
- `message` (string)
- `read` (bool, default false)
- `createdAt` (datetime)

### Modified Entities

**`User`** — add OneToMany relations to `UserMovie`, `Import`, `Notification`.

## File Storage (SeaweedFS)

- **Library:** `league/flysystem-aws-s3-v3` with the Flysystem S3 adapter
- **Endpoint:** `s3.lclr.dev`
- **Bucket:** `ltbxd-actorle`
- **Credentials:** Symfony Secrets (`S3_ACCESS_KEY`, `S3_SECRET_KEY`)
- **File path pattern:** `imports/{userId}/{importId}.csv`
- No local/Docker SeaweedFS — always the remote instance, including in dev.

## Async Processing (Messenger)

### Messages

**`ProcessImportMessage(importId)`**

- Dispatched by the upload controller.
- Single entry point for import processing.

**`ImportFilmsBatchMessage(importId, offset, limit)`**

- Dispatched by `ProcessImportMessageHandler`.
- One per batch of 50 films.

### Handler: `ProcessImportMessageHandler`

1. Fetch the `Import` entity
2.
Download the CSV from SeaweedFS via Flysystem
3. Parse the file: save it to a temp file, run `LtbxdGateway->parseFile()` (which expects a local path), then delete the temp file
4. Compute `totalFilms` and `totalBatches` (batches of 50) and update the Import
5. Dispatch N `ImportFilmsBatchMessage(importId, offset, limit)` messages
6. Set the Import status to `processing`

### Handler: `ImportFilmsBatchMessageHandler`

1. Fetch the Import, download the CSV from SeaweedFS, and read the slice `[offset, offset + limit]`
2. For each film in the slice:
   - Look up by `ltbxdRef` in the DB; if missing, call `TMDBGateway->searchMovie()` and create the Movie
   - Fetch actors via TMDB and create missing Actor/MovieRole entries
   - Create the `UserMovie` link if it doesn't exist
3. Atomically increment `processedBatches` (`UPDATE ... SET processed_batches = processed_batches + 1`) to avoid race conditions between multiple workers
4. If `processedBatches == totalBatches`: set the Import to `completed`, set `completedAt`, and create a Notification ("Import terminé : X/Y films importés")
5. On a per-film error: log and continue, incrementing `failedFilms`

### Error Handling

- `ProcessImportMessageHandler` failure (SeaweedFS down, invalid CSV): set the Import to `failed` and create an error Notification.
- `ImportFilmsBatchMessageHandler` per-film failure: log, skip the film, increment `failedFilms`, continue.
- Messenger retry: default config (3 retries with backoff), then the failure transport.

## Extracted Services

The logic currently embedded in `SyncFilmsCommand` and `SyncActorsCommand` is extracted into reusable services:

- **`FilmImporter`** — given a parsed CSV row, finds or creates a Movie entity via a TMDB lookup.
- **`ActorSyncer`** — given a Movie, fetches the cast from TMDB and creates missing Actor/MovieRole entries.

The existing commands are refactored to use these services.
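As a rough sketch of the batch fan-out described above (assuming the entity and message shapes from this document; `CsvStorage` is a hypothetical Flysystem wrapper, and the repository/bus wiring is illustrative, not the actual implementation):

```php
<?php
// Sketch only — ProcessImportMessageHandler fan-out. Import, ProcessImportMessage,
// ImportFilmsBatchMessage, LtbxdGateway come from this spec; CsvStorage and
// ImportRepository method names are assumptions.

use Symfony\Component\Messenger\Attribute\AsMessageHandler;
use Symfony\Component\Messenger\MessageBusInterface;

#[AsMessageHandler]
final class ProcessImportMessageHandler
{
    private const BATCH_SIZE = 50;

    public function __construct(
        private ImportRepository $imports,
        private CsvStorage $storage,          // Flysystem wrapper (assumed)
        private LtbxdGateway $ltbxd,
        private MessageBusInterface $bus,
    ) {}

    public function __invoke(ProcessImportMessage $message): void
    {
        $import = $this->imports->find($message->importId);

        // parseFile() expects a local path, so spool the SeaweedFS file to a temp file.
        $tmp = tempnam(sys_get_temp_dir(), 'import_');
        file_put_contents($tmp, $this->storage->read($import->getFilePath()));
        try {
            $films = $this->ltbxd->parseFile($tmp);
        } finally {
            unlink($tmp);
        }

        $totalFilms   = count($films);
        $totalBatches = (int) ceil($totalFilms / self::BATCH_SIZE);
        $import->setTotalFilms($totalFilms);
        $import->setTotalBatches($totalBatches);

        // One message per batch of 50; each worker re-reads its own CSV slice.
        for ($i = 0; $i < $totalBatches; $i++) {
            $this->bus->dispatch(new ImportFilmsBatchMessage(
                $message->importId,
                $i * self::BATCH_SIZE,
                self::BATCH_SIZE,
            ));
        }

        $import->setStatus('processing');
        $this->imports->save($import);
    }
}
```

Re-reading the CSV per batch trades some I/O for stateless messages: a batch message only carries `(importId, offset, limit)`, so any worker can process it after a retry.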
## API Endpoints

### `POST /api/imports` (authenticated)

- **Input:** CSV file as multipart form data
- **Validation:** `.csv` extension, max 5 MB
- **Action:** upload to SeaweedFS, create an Import entity (status `pending`), dispatch `ProcessImportMessage`
- **Response:** `201 { id, status: "pending" }`

### `GET /api/notifications` (authenticated)

- **Response:** `200 { unreadCount: N, notifications: [{ id, message, read, createdAt }] }`
- Sorted by `createdAt` descending, limited to the 20 most recent

### `POST /api/notifications/read` (authenticated)

- **Action:** mark all notifications for the authenticated user as read
- **Response:** `204`

## Frontend

### Navbar (Twig + Stimulus)

- Added to the main layout (`base.html.twig` or a partial), visible only when authenticated.
- Right side: notification icon (bell) + user icon.

### User Dropdown (`dropdown_controller` Stimulus)

- Click the user icon -> toggle the dropdown menu
- Entries: "Importer ses films", "Se déconnecter"
- Click outside -> close

### Notifications Dropdown (`notifications_controller` Stimulus)

- Click the bell icon -> dropdown listing recent notifications
- Poll `GET /api/notifications` every 30 s for notifications + unread count
- On dropdown open: call `POST /api/notifications/read` to mark all as read
- Badge (red, unread count) updates on each poll
- `document.title` updates: `(N) Actorle` if unread > 0, `Actorle` otherwise

### Import Modal (`import_modal_controller` Stimulus)

- Click "Importer ses films" -> show the modal (HTML in the DOM, toggle `hidden`)
- File input (accept `.csv`)
- "Importer" button -> `POST /api/imports` via fetch (multipart)
- On success: show an "Import lancé !" message and close the modal
- Client-side validation: `.csv` extension only

## Out of Scope

- Modifying game logic based on the user's imported films (future: game config page)
- Mercure/WebSockets for real-time notifications (polling is sufficient)
- Docker SeaweedFS for local dev
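For reference, the `POST /api/imports` flow described above could be sketched as follows (a sketch, not the actual controller: `CsvStorage`, `ImportRepository::createPending()`, and the `file` field name are assumptions; only the validation rules, path pattern, and response shape come from this spec):

```php
<?php
// Sketch of the CSV upload endpoint. Route attribute namespace assumes a recent
// Symfony version; helper names are illustrative.

use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\Messenger\MessageBusInterface;
use Symfony\Component\Routing\Attribute\Route;

final class ImportController extends AbstractController
{
    #[Route('/api/imports', methods: ['POST'])]
    public function upload(
        Request $request,
        CsvStorage $storage,            // Flysystem wrapper (assumed)
        ImportRepository $imports,
        MessageBusInterface $bus,
    ): JsonResponse {
        $file = $request->files->get('file');

        // Validation: .csv extension, max 5 MB.
        if ($file === null
            || strtolower($file->getClientOriginalExtension()) !== 'csv'
            || $file->getSize() > 5 * 1024 * 1024) {
            return new JsonResponse(['error' => 'Invalid file'], 400);
        }

        $import = $imports->createPending($this->getUser()); // status: pending

        // File path pattern: imports/{userId}/{importId}.csv on SeaweedFS.
        $path = sprintf('imports/%d/%d.csv', $import->getUser()->getId(), $import->getId());
        $storage->write($path, $file->getContent());
        $import->setFilePath($path);
        $imports->save($import);

        $bus->dispatch(new ProcessImportMessage($import->getId()));

        return new JsonResponse(['id' => $import->getId(), 'status' => 'pending'], 201);
    }
}
```

Creating the Import entity before writing the file lets the storage path embed the import id, so a stray upload can always be traced back to its job row.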