diff --git a/.gitignore b/.gitignore index 098699d..0876d78 100644 --- a/.gitignore +++ b/.gitignore @@ -1,3 +1,4 @@ +.env *.env agent.md docker-compose.yml @@ -20,7 +21,14 @@ dist/ .vscode/ .DS_Store -# Local data/dbs +# Local data/dbs (do not commit local state) +data/ +/data/ data/*.db data/*.sqlite data/*.sqlite3 +data/chunks/ +state.db + +# Logs +*.log diff --git a/README.md b/README.md index 5d52b35..fb6ea14 100644 --- a/README.md +++ b/README.md @@ -7,23 +7,16 @@ Admin users log in to create public invite links; invite links are always public ## Features -- **Invite links (public)** — create upload links you can share with anyone -- **One‑time link claim** — first browser session claims a one‑time link; it can upload multiple files, others are blocked -- **Optional public uploader** — disabled by default; can be enabled via `.env` -- **Queue with progress** via WebSocket (success / duplicate / error) -- **Duplicate prevention** (local SHA‑1 cache + optional Immich bulk‑check) -- **Original dates preserved** (EXIF → `fileCreatedAt` / `fileModifiedAt`) -- **Mobile‑friendly** -- **.env‑only config** (clean deploys) + Docker/Compose -- **Privacy‑first**: never lists server media; UI only shows the current session -- **Dark mode** — detects system preference; manual toggle persists across pages -- **Albums** — add uploads to a configured album (creates if needed) -- **Copy + QR** — copy invite link and display QR for easy sharing - - **Chunked uploads (optional)** — split large files to bypass proxy limits; configurable size - - **Invite passwords (optional)** — protect invite links with a password prompt - - **Device‑flexible HMI** — responsive layout, mobile‑safe picker, sticky mobile bar - - **Retry failed uploads** — one‑click retry for any errored item - - **Invites without album** — create links that don’t add uploads to any album +- **Invite Links:** public-by-URL links for uploads; one-time or multi-use +- **Manage Links:** search/sort, enable/disable, 
delete, edit name/expiry +- **Row Actions:** icon-only actions with tooltips (Open, Copy, Details, QR, Save) +- **Passwords (optional):** protect invites with a password gate +- **Albums (optional):** upload into a specific album (auto-create supported) +- **Duplicate Prevention:** local SHA‑1 cache (+ optional Immich bulk-check) +- **Progress Queue:** WebSocket updates; retry failed items +- **Chunked Uploads (optional):** large-file support with configurable chunk size +- **Privacy-first:** never lists server media; session-local uploads only +- **Mobile + Dark Mode:** responsive UI, safe-area padding, persistent theme --- @@ -105,27 +98,17 @@ docker compose up -d ``` --- -## New Features +## What's New -### 🔐 Login + Menu -- Login with your Immich credentials to access the menu. -- The menu lets you list/create albums and create invite links. -- The menu is always behind login; logout clears the session. +### v0.5.0 – Manage Links overhaul +- In-panel bulk actions footer (Delete/Enable/Disable stay inside the box) +- Per-row icon actions with tooltips; Save button lights up only on changes +- Per-row QR modal; Details modal close fixed and reliable +- Auto-refresh after creating a link; new row is highlighted and scrolled into view +- Expiry save fix: stores end-of-day to avoid off-by-one date issues -### 🔗 Invite Links -- Links are always public by URL (no login required to use). -- You can make links one‑time (claimed by the first browser session) or indefinite / limited uses. -- Set link expiry (e.g., 1, 2, 7 days). Expired links are inactive. -- Copy link and view a QR code for easy sharing. - -### 🔑 Invite Passwords (New) -- When creating an invite, you can optionally set a password. -- Recipients must enter the password before they can upload through the link. -- The app stores only a salted hash server‑side; sessions that pass the check are marked authorized. 
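The salted-hash storage described above matches the PBKDF2 helper that appears later in the app.py hunks (`pbkdf2_sha256$iterations$salt$hash`). A self-contained sketch of that scheme follows; note the `verify_password` counterpart is hypothetical — the diff only shows the hashing side:

```python
import binascii
import hashlib
import hmac
import os

def hash_password(pw: str) -> str:
    # PBKDF2-HMAC-SHA256 with a random 16-byte salt, in the
    # "pbkdf2_sha256$iterations$salt$hash" format used in app/app.py.
    salt = os.urandom(16)
    iterations = 200_000
    dk = hashlib.pbkdf2_hmac("sha256", pw.encode("utf-8"), salt, iterations)
    return (f"pbkdf2_sha256${iterations}"
            f"${binascii.hexlify(salt).decode()}${binascii.hexlify(dk).decode()}")

def verify_password(pw: str, stored: str) -> bool:
    # Hypothetical verifier (not in the diff): recompute with the stored
    # salt and iteration count, then compare in constant time.
    algo, iters, salt_hex, hash_hex = stored.split("$")
    if algo != "pbkdf2_sha256":
        return False
    dk = hashlib.pbkdf2_hmac("sha256", pw.encode("utf-8"),
                             binascii.unhexlify(salt_hex), int(iters))
    return hmac.compare_digest(binascii.hexlify(dk).decode(), hash_hex)
```

Only the hash ever reaches the database, so a leaked `state.db` does not expose invite passwords.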
- -### 🧩 Chunked Uploads (New) -- Opt‑in support for splitting large files into chunks to bypass proxy limits (e.g., Cloudflare 100MB). -- Enable with `CHUNKED_UPLOADS_ENABLED=true`; tune `CHUNK_SIZE_MB` (default 95MB). +Roadmap highlight +- We’d like to add a per-user UI and remove reliance on a fixed API key by allowing users to authenticate and provide their own Immich API tokens. This is not in scope for the initial versions but aligns with future direction. - The frontend automatically switches to chunked mode only for files larger than the configured chunk size. ### 📱 Device‑Flexible HMI (New) @@ -147,22 +130,10 @@ docker compose up -d - Support for invites with no album association. ### 🌙 Dark Mode -- Automatically detects system dark/light preference on first visit -- Manual toggle button in the header (sun/moon icon) -- Preference saved in browser localStorage -- Smooth color transitions for better UX -- All UI elements properly themed for both modes +- Automatic or manual toggle; persisted preference ### 📁 Album Integration -- Configure `IMMICH_ALBUM_NAME` environment variable to auto-add uploads to a specific album -- Album is automatically created if it doesn't exist -- Efficient caching of album ID to minimize API calls -- Visual feedback showing which album uploads are being added to -- Works seamlessly with existing duplicate detection - -### 🐛 Bug Fixes -- Fixed WebSocket disconnection error that occurred when clients closed connections -- Improved error handling for edge cases +- Auto-create + assign album if configured; optional invites without album --- diff --git a/app/app.py b/app/app.py index 0fa6a89..7615609 100644 --- a/app/app.py +++ b/app/app.py @@ -184,6 +184,28 @@ def sha1_hex(file_bytes: bytes) -> str: h.update(file_bytes) return h.hexdigest() +def sanitize_filename(name: Optional[str]) -> str: + """Return a minimally sanitized filename that preserves the original name. 
+ + - Removes control characters (\x00-\x1F, \x7F) + - Replaces path separators ('/' and '\\') with underscore + - Falls back to 'file' if empty + Other Unicode characters and spaces are preserved. + """ + if not name: + return "file" + cleaned_chars = [] + for ch in str(name): + o = ord(ch) + if o < 32 or o == 127: + continue + if ch in ('/', '\\'): + cleaned_chars.append('_') + else: + cleaned_chars.append(ch) + cleaned = ''.join(cleaned_chars).strip() + return cleaned or "file" + def read_exif_datetimes(file_bytes: bytes): """ Extract EXIF DateTimeOriginal / ModifyDate values when possible. @@ -429,6 +451,7 @@ async def api_upload( session_id: str = Form(...), last_modified: Optional[int] = Form(None), invite_token: Optional[str] = Form(None), + fingerprint: Optional[str] = Form(None), ): """Receive a file, check duplicates, forward to Immich; stream progress via WS.""" raw = await file.read() @@ -458,15 +481,17 @@ async def api_upload( await send_progress(session_id, item_id, "duplicate", 100, "Duplicate (server)", asset_id) return JSONResponse({"status": "duplicate", "id": asset_id}, status_code=200) + safe_name = sanitize_filename(file.filename) def gen_encoder() -> MultipartEncoder: return MultipartEncoder(fields={ - "assetData": (file.filename, io.BytesIO(raw), file.content_type or "application/octet-stream"), + "assetData": (safe_name, io.BytesIO(raw), file.content_type or "application/octet-stream"), "deviceAssetId": device_asset_id, "deviceId": f"python-{session_id}", "fileCreatedAt": created_iso, "fileModifiedAt": modified_iso, "isFavorite": "false", - "filename": file.filename, + "filename": safe_name, + "originalFileName": safe_name, }) encoder = gen_encoder() @@ -478,7 +503,7 @@ async def api_upload( try: conn = sqlite3.connect(SETTINGS.state_db) cur = conn.cursor() - cur.execute("SELECT token, album_id, album_name, max_uses, used_count, expires_at, COALESCE(claimed,0), claimed_by_session, password_hash FROM invites WHERE token = ?", (invite_token,)) 
+ cur.execute("SELECT token, album_id, album_name, max_uses, used_count, expires_at, COALESCE(claimed,0), claimed_by_session, password_hash, COALESCE(disabled,0) FROM invites WHERE token = ?", (invite_token,)) row = cur.fetchone() conn.close() except Exception as e: @@ -487,7 +512,14 @@ async def api_upload( if not row: await send_progress(session_id, item_id, "error", 100, "Invalid invite token") return JSONResponse({"error": "invalid_invite"}, status_code=403) - _, album_id, album_name, max_uses, used_count, expires_at, claimed, claimed_by_session, password_hash = row + _, album_id, album_name, max_uses, used_count, expires_at, claimed, claimed_by_session, password_hash, disabled = row + # Admin deactivation check + try: + if int(disabled) == 1: + await send_progress(session_id, item_id, "error", 100, "Invite disabled") + return JSONResponse({"error": "invite_disabled"}, status_code=403) + except Exception: + pass # If invite requires password, ensure this session is authorized if password_hash: try: @@ -611,6 +643,40 @@ async def api_upload( conn2.close() except Exception as e: logger.exception("Failed to increment invite usage: %s", e) + # Log uploader identity and file metadata + try: + connlg = sqlite3.connect(SETTINGS.state_db) + curlg = connlg.cursor() + curlg.execute( + """ + CREATE TABLE IF NOT EXISTS upload_events ( + id INTEGER PRIMARY KEY AUTOINCREMENT, + token TEXT, + uploaded_at TEXT DEFAULT CURRENT_TIMESTAMP, + ip TEXT, + user_agent TEXT, + fingerprint TEXT, + filename TEXT, + size INTEGER, + checksum TEXT, + immich_asset_id TEXT + ); + """ + ) + ip = None + try: + ip = (request.client.host if request and request.client else None) or request.headers.get('x-forwarded-for') + except Exception: + ip = None + ua = request.headers.get('user-agent', '') if request else '' + curlg.execute( + "INSERT INTO upload_events (token, ip, user_agent, fingerprint, filename, size, checksum, immich_asset_id) VALUES (?,?,?,?,?,?,?,?)", + (invite_token or '', ip, ua, 
fingerprint or '', file.filename, size, checksum, asset_id or None) + ) + connlg.commit() + connlg.close() + except Exception: + pass return JSONResponse({"id": asset_id, "status": status}, status_code=200) else: try: @@ -712,6 +778,7 @@ async def api_upload_chunk_complete(request: Request) -> JSONResponse: name = (data or {}).get("name") or "upload.bin" last_modified = (data or {}).get("last_modified") invite_token = (data or {}).get("invite_token") + fingerprint = (data or {}).get("fingerprint") content_type = (data or {}).get("content_type") or "application/octet-stream" if not item_id or not session_id: return JSONResponse({"error": "missing_ids"}, status_code=400) @@ -726,6 +793,14 @@ async def api_upload_chunk_complete(request: Request) -> JSONResponse: total_chunks = int(meta.get("total_chunks") or (data or {}).get("total_chunks") or 0) if total_chunks <= 0: return JSONResponse({"error": "missing_total"}, status_code=400) + # Prefer the name captured at init if request did not include it + if not name: + try: + name = meta.get("name") or name + except Exception: + pass + if not name: + name = "upload.bin" # Assemble parts = [] try: @@ -786,15 +861,17 @@ async def api_upload_chunk_complete(request: Request) -> JSONResponse: await send_progress(session_id_local, item_id_local, "duplicate", 100, "Duplicate (server)", asset_id) return JSONResponse({"status": "duplicate", "id": asset_id}, status_code=200) + safe_name2 = sanitize_filename(file_like_name) def gen_encoder2() -> MultipartEncoder: return MultipartEncoder(fields={ - "assetData": (file_like_name, io.BytesIO(raw), content_type or "application/octet-stream"), + "assetData": (safe_name2, io.BytesIO(raw), content_type or "application/octet-stream"), "deviceAssetId": device_asset_id, "deviceId": f"python-{session_id_local}", "fileCreatedAt": created_iso, "fileModifiedAt": modified_iso, "isFavorite": "false", - "filename": file_like_name, + "filename": safe_name2, + "originalFileName": safe_name2, }) # Invite 
validation/gating mirrors api_upload @@ -804,7 +881,7 @@ async def api_upload_chunk_complete(request: Request) -> JSONResponse: try: conn = sqlite3.connect(SETTINGS.state_db) cur = conn.cursor() - cur.execute("SELECT token, album_id, album_name, max_uses, used_count, expires_at, COALESCE(claimed,0), claimed_by_session, password_hash FROM invites WHERE token = ?", (invite_token,)) + cur.execute("SELECT token, album_id, album_name, max_uses, used_count, expires_at, COALESCE(claimed,0), claimed_by_session, password_hash, COALESCE(disabled,0) FROM invites WHERE token = ?", (invite_token,)) row = cur.fetchone() conn.close() except Exception as e: @@ -813,7 +890,14 @@ async def api_upload_chunk_complete(request: Request) -> JSONResponse: if not row: await send_progress(session_id_local, item_id_local, "error", 100, "Invalid invite token") return JSONResponse({"error": "invalid_invite"}, status_code=403) - _, album_id, album_name, max_uses, used_count, expires_at, claimed, claimed_by_session, password_hash = row + _, album_id, album_name, max_uses, used_count, expires_at, claimed, claimed_by_session, password_hash, disabled = row + # Admin deactivation check + try: + if int(disabled) == 1: + await send_progress(session_id_local, item_id_local, "error", 100, "Invite disabled") + return JSONResponse({"error": "invite_disabled"}, status_code=403) + except Exception: + pass if password_hash: try: ia = request.session.get("inviteAuth") or {} @@ -923,6 +1007,40 @@ async def api_upload_chunk_complete(request: Request) -> JSONResponse: conn2.close() except Exception as e: logger.exception("Failed to increment invite usage: %s", e) + # Log uploader identity and file metadata + try: + connlg = sqlite3.connect(SETTINGS.state_db) + curlg = connlg.cursor() + curlg.execute( + """ + CREATE TABLE IF NOT EXISTS upload_events ( + id INTEGER PRIMARY KEY AUTOINCREMENT, + token TEXT, + uploaded_at TEXT DEFAULT CURRENT_TIMESTAMP, + ip TEXT, + user_agent TEXT, + fingerprint TEXT, + filename 
TEXT, + size INTEGER, + checksum TEXT, + immich_asset_id TEXT + ); + """ + ) + ip = None + try: + ip = (request.client.host if request and request.client else None) or request.headers.get('x-forwarded-for') + except Exception: + ip = None + ua = request.headers.get('user-agent', '') if request else '' + curlg.execute( + "INSERT INTO upload_events (token, ip, user_agent, fingerprint, filename, size, checksum, immich_asset_id) VALUES (?,?,?,?,?,?,?,?)", + (invite_token or '', ip, ua, fingerprint or '', file_like_name, file_size, checksum, asset_id or None) + ) + connlg.commit() + connlg.close() + except Exception: + pass return JSONResponse({"id": asset_id, "status": status}, status_code=200) else: try: @@ -1061,6 +1179,27 @@ def ensure_invites_table() -> None: cur.execute("ALTER TABLE invites ADD COLUMN password_hash TEXT") except Exception: pass + # Ownership and management fields (best-effort migrations) + try: + cur.execute("ALTER TABLE invites ADD COLUMN owner_user_id TEXT") + except Exception: + pass + try: + cur.execute("ALTER TABLE invites ADD COLUMN owner_email TEXT") + except Exception: + pass + try: + cur.execute("ALTER TABLE invites ADD COLUMN owner_name TEXT") + except Exception: + pass + try: + cur.execute("ALTER TABLE invites ADD COLUMN name TEXT") + except Exception: + pass + try: + cur.execute("ALTER TABLE invites ADD COLUMN disabled INTEGER DEFAULT 0") + except Exception: + pass conn.commit() conn.close() except Exception as e: @@ -1125,18 +1264,26 @@ async def api_invites_create(request: Request) -> JSONResponse: except Exception: return "" pw_hash = hash_password(invite_password or "") if (invite_password and str(invite_password).strip()) else None + # Owner info from session + owner_user_id = str(request.session.get("userId") or "") + owner_email = str(request.session.get("userEmail") or "") + owner_name = str(request.session.get("name") or "") + # Friendly name: default to album + creation timestamp if not provided in future updates + # Here we 
set a default immediately + now_tag = datetime.utcnow().strftime("%Y%m%d-%H%M") + default_link_name = f"{album_name or 'NoAlbum'}-{now_tag}" try: conn = sqlite3.connect(SETTINGS.state_db) cur = conn.cursor() if pw_hash: cur.execute( - "INSERT INTO invites (token, album_id, album_name, max_uses, expires_at, password_hash) VALUES (?,?,?,?,?,?)", - (token, resolved_album_id, album_name, max_uses, expires_at, pw_hash) + "INSERT INTO invites (token, album_id, album_name, max_uses, expires_at, password_hash, owner_user_id, owner_email, owner_name, name) VALUES (?,?,?,?,?,?,?,?,?,?)", + (token, resolved_album_id, album_name, max_uses, expires_at, pw_hash, owner_user_id, owner_email, owner_name, default_link_name) ) else: cur.execute( - "INSERT INTO invites (token, album_id, album_name, max_uses, expires_at) VALUES (?,?,?,?,?)", - (token, resolved_album_id, album_name, max_uses, expires_at) + "INSERT INTO invites (token, album_id, album_name, max_uses, expires_at, owner_user_id, owner_email, owner_name, name) VALUES (?,?,?,?,?,?,?,?,?)", + (token, resolved_album_id, album_name, max_uses, expires_at, owner_user_id, owner_email, owner_name, default_link_name) ) conn.commit() conn.close() @@ -1157,9 +1304,301 @@ async def api_invites_create(request: Request) -> JSONResponse: "albumId": resolved_album_id, "albumName": album_name, "maxUses": max_uses, - "expiresAt": expires_at + "expiresAt": expires_at, + "name": default_link_name }) +@app.get("/api/invites") +async def api_invites_list(request: Request) -> JSONResponse: + """List invites owned by the logged-in user, with optional q/sort filters.""" + if not request.session.get("accessToken"): + return JSONResponse({"error": "unauthorized"}, status_code=401) + owner_user_id = str(request.session.get("userId") or "") + q = (request.query_params.get("q") or "").strip() + sort = (request.query_params.get("sort") or "-created").strip() + # Map sort tokens to SQL + sort_sql = "created_at DESC" + if sort in ("created", "+created"): + 
sort_sql = "created_at ASC" + elif sort in ("-created",): + sort_sql = "created_at DESC" + elif sort in ("expires", "+expires"): + sort_sql = "expires_at ASC" + elif sort in ("-expires",): + sort_sql = "expires_at DESC" + elif sort in ("name", "+name"): + sort_sql = "name ASC" + elif sort in ("-name",): + sort_sql = "name DESC" + try: + conn = sqlite3.connect(SETTINGS.state_db) + cur = conn.cursor() + if q: + like = f"%{q}%" + cur.execute( + """ + SELECT token, name, album_id, album_name, max_uses, used_count, expires_at, COALESCE(claimed,0), COALESCE(disabled,0), created_at + FROM invites + WHERE owner_user_id = ? AND ( + COALESCE(name,'') LIKE ? OR COALESCE(album_name,'') LIKE ? OR token LIKE ? + ) + ORDER BY """ + sort_sql, + (owner_user_id, like, like, like) + ) + else: + cur.execute( + f"SELECT token, name, album_id, album_name, max_uses, used_count, expires_at, COALESCE(claimed,0), COALESCE(disabled,0), created_at FROM invites WHERE owner_user_id = ? ORDER BY {sort_sql}", + (owner_user_id,) + ) + rows = cur.fetchall() + conn.close() + except Exception as e: + logger.exception("List invites failed: %s", e) + return JSONResponse({"error": "db_error"}, status_code=500) + items = [] + now = datetime.utcnow() + for (token, name, album_id, album_name, max_uses, used_count, expires_at, claimed, disabled, created_at) in rows: + try: + max_uses_int = int(max_uses) if max_uses is not None else -1 + except Exception: + max_uses_int = -1 + remaining = None + try: + if max_uses_int >= 0: + remaining = int(max_uses_int) - int(used_count or 0) + except Exception: + remaining = None + expired = False + if expires_at: + try: + expired = now > datetime.fromisoformat(expires_at) + except Exception: + expired = False + inactive_reason = None + active = True + if (max_uses_int == 1 and claimed) or (remaining is not None and remaining <= 0): + active = False + inactive_reason = "claimed" if max_uses_int == 1 else "exhausted" + if expired: + active = False + inactive_reason = 
inactive_reason or "expired" + try: + if int(disabled) == 1: + active = False + inactive_reason = "disabled" + except Exception: + pass + items.append({ + "token": token, + "name": name, + "albumId": album_id, + "albumName": album_name, + "maxUses": max_uses, + "used": used_count or 0, + "remaining": remaining, + "expiresAt": expires_at, + "active": active, + "inactiveReason": inactive_reason, + "createdAt": created_at, + }) + return JSONResponse({"items": items}) + +@app.patch("/api/invite/{token}") +async def api_invite_update(token: str, request: Request) -> JSONResponse: + """Update invite fields: name, disabled, maxUses, expiresAt/expiresDays, password, resetUsage.""" + if not request.session.get("accessToken"): + return JSONResponse({"error": "unauthorized"}, status_code=401) + try: + body = await request.json() + except Exception: + body = {} + owner_user_id = str(request.session.get("userId") or "") + # Build dynamic update + fields = [] + params = [] + # Name + if "name" in (body or {}): + fields.append("name = ?") + params.append(str((body or {}).get("name") or "").strip()) + # Disabled toggle + if "disabled" in (body or {}): + try: + disabled = 1 if (bool((body or {}).get("disabled")) is True) else 0 + except Exception: + disabled = 0 + fields.append("disabled = ?") + params.append(disabled) + # Max uses + if "maxUses" in (body or {}): + try: + mu = int((body or {}).get("maxUses")) + except Exception: + mu = 1 + fields.append("max_uses = ?") + params.append(mu) + # Expiration + if "expiresAt" in (body or {}) or "expiresDays" in (body or {}): + expires_at = None + if (body or {}).get("expiresAt"): + try: + # trust provided ISO string + expires_at = str((body or {}).get("expiresAt")) + except Exception: + expires_at = None + else: + try: + days = int((body or {}).get("expiresDays")) + from datetime import timedelta + expires_at = (datetime.utcnow() + timedelta(days=days)).replace(microsecond=0).isoformat() + except Exception: + expires_at = None + 
fields.append("expires_at = ?") + params.append(expires_at) + # Password + if "password" in (body or {}): + pw = str((body or {}).get("password") or "").strip() + if pw: + # Reuse hasher from above + def _hash_pw(pw: str) -> str: + import os as _os + import binascii as _binascii + salt = _os.urandom(16) + iterations = 200_000 + dk = hashlib.pbkdf2_hmac('sha256', pw.encode('utf-8'), salt, iterations) + return f"pbkdf2_sha256${iterations}${_binascii.hexlify(salt).decode()}${_binascii.hexlify(dk).decode()}" + fields.append("password_hash = ?") + params.append(_hash_pw(pw)) + else: + fields.append("password_hash = NULL") + # Reset usage + reset_usage = bool((body or {}).get("resetUsage")) + try: + if fields: + conn = sqlite3.connect(SETTINGS.state_db) + cur = conn.cursor() + cur.execute( + f"UPDATE invites SET {', '.join(fields)} WHERE token = ? AND owner_user_id = ?", + (*params, token, owner_user_id) + ) + if reset_usage: + cur.execute("UPDATE invites SET used_count = 0, claimed = 0, claimed_at = NULL, claimed_by_session = NULL WHERE token = ? AND owner_user_id = ?", (token, owner_user_id)) + conn.commit() + updated = conn.total_changes + conn.close() + else: + updated = 0 + except Exception as e: + logger.exception("Invite update failed: %s", e) + return JSONResponse({"error": "db_error"}, status_code=500) + if updated == 0: + return JSONResponse({"ok": False, "updated": 0}, status_code=404) + return JSONResponse({"ok": True, "updated": updated}) + +@app.post("/api/invites/bulk") +async def api_invites_bulk(request: Request) -> JSONResponse: + """Bulk enable/disable invites owned by current user. 
Body: {tokens:[], action:'disable'|'enable'}""" + if not request.session.get("accessToken"): + return JSONResponse({"error": "unauthorized"}, status_code=401) + try: + body = await request.json() + except Exception: + body = {} + tokens = list((body or {}).get("tokens") or []) + action = str((body or {}).get("action") or "disable").lower().strip() + if not tokens: + return JSONResponse({"error": "missing_tokens"}, status_code=400) + val = 1 if action == "disable" else 0 + owner_user_id = str(request.session.get("userId") or "") + try: + conn = sqlite3.connect(SETTINGS.state_db) + cur = conn.cursor() + # Build query with correct number of placeholders + placeholders = ",".join(["?"] * len(tokens)) + cur.execute( + f"UPDATE invites SET disabled = ? WHERE owner_user_id = ? AND token IN ({placeholders})", + (val, owner_user_id, *tokens) + ) + conn.commit() + changed = conn.total_changes + conn.close() + except Exception as e: + logger.exception("Bulk update failed: %s", e) + return JSONResponse({"error": "db_error"}, status_code=500) + return JSONResponse({"ok": True, "updated": changed}) + +@app.post("/api/invites/delete") +async def api_invites_delete(request: Request) -> JSONResponse: + """Hard delete invites owned by the current user and their upload logs. + + Body: { tokens: ["...", ...] 
} + """ + if not request.session.get("accessToken"): + return JSONResponse({"error": "unauthorized"}, status_code=401) + try: + body = await request.json() + except Exception: + body = {} + tokens = list((body or {}).get("tokens") or []) + if not tokens: + return JSONResponse({"error": "missing_tokens"}, status_code=400) + owner_user_id = str(request.session.get("userId") or "") + try: + conn = sqlite3.connect(SETTINGS.state_db) + cur = conn.cursor() + placeholders = ",".join(["?"] * len(tokens)) + # Delete upload events first to avoid orphan rows + cur.execute( + f"DELETE FROM upload_events WHERE token IN ({placeholders})", + (*tokens,) + ) + # Delete invites scoped to owner + cur.execute( + f"DELETE FROM invites WHERE owner_user_id = ? AND token IN ({placeholders})", + (owner_user_id, *tokens) + ) + conn.commit() + changed = conn.total_changes + conn.close() + except Exception as e: + logger.exception("Bulk delete failed: %s", e) + return JSONResponse({"error": "db_error"}, status_code=500) + return JSONResponse({"ok": True, "deleted": changed}) + +@app.get("/api/invite/{token}/uploads") +async def api_invite_uploads(token: str, request: Request) -> JSONResponse: + """Return upload events for a given token (owner-only).""" + if not request.session.get("accessToken"): + return JSONResponse({"error": "unauthorized"}, status_code=401) + owner_user_id = str(request.session.get("userId") or "") + try: + conn = sqlite3.connect(SETTINGS.state_db) + cur = conn.cursor() + # Verify ownership + cur.execute("SELECT 1 FROM invites WHERE token = ? AND owner_user_id = ?", (token, owner_user_id)) + row = cur.fetchone() + if not row: + conn.close() + return JSONResponse({"error": "forbidden"}, status_code=403) + cur.execute("SELECT uploaded_at, ip, user_agent, fingerprint, filename, size, checksum, immich_asset_id FROM upload_events WHERE token = ? 
ORDER BY uploaded_at DESC LIMIT 500", (token,)) + rows = cur.fetchall() + conn.close() + except Exception as e: + logger.exception("Fetch uploads failed: %s", e) + return JSONResponse({"error": "db_error"}, status_code=500) + items = [] + for uploaded_at, ip, ua, fp, filename, size, checksum, asset_id in rows: + items.append({ + "uploadedAt": uploaded_at, + "ip": ip, + "userAgent": ua, + "fingerprint": fp, + "filename": filename, + "size": size, + "checksum": checksum, + "assetId": asset_id, + }) + return JSONResponse({"items": items}) + @app.get("/invite/{token}", response_class=HTMLResponse) async def invite_page(token: str, request: Request) -> HTMLResponse: # If public invites disabled and no user session, require login @@ -1172,7 +1611,7 @@ async def api_invite_info(token: str, request: Request) -> JSONResponse: try: conn = sqlite3.connect(SETTINGS.state_db) cur = conn.cursor() - cur.execute("SELECT token, album_id, album_name, max_uses, used_count, expires_at, COALESCE(claimed,0), claimed_at, password_hash FROM invites WHERE token = ?", (token,)) + cur.execute("SELECT token, album_id, album_name, max_uses, used_count, expires_at, COALESCE(claimed,0), claimed_at, password_hash, COALESCE(disabled,0), name FROM invites WHERE token = ?", (token,)) row = cur.fetchone() conn.close() except Exception as e: @@ -1180,7 +1619,7 @@ async def api_invite_info(token: str, request: Request) -> JSONResponse: return JSONResponse({"error": "db_error"}, status_code=500) if not row: return JSONResponse({"error": "not_found"}, status_code=404) - _, album_id, album_name, max_uses, used_count, expires_at, claimed, claimed_at, password_hash = row + _, album_id, album_name, max_uses, used_count, expires_at, claimed, claimed_at, password_hash, disabled, link_name = row # compute remaining remaining = None try: @@ -1200,12 +1639,23 @@ async def api_invite_info(token: str, request: Request) -> JSONResponse: except Exception: expired = False deactivated = False + reason = None if 
one_time and claimed: deactivated = True + reason = "claimed" elif remaining is not None and remaining <= 0: deactivated = True + reason = "exhausted" if expired: deactivated = True + reason = reason or "expired" + # Admin disabled flag + try: + if int(disabled) == 1: + deactivated = True + reason = "disabled" + except Exception: + pass active = not deactivated # Password requirement + authorization state password_required = bool(password_hash) @@ -1219,6 +1669,7 @@ async def api_invite_info(token: str, request: Request) -> JSONResponse: "token": token, "albumId": album_id, "albumName": album_name, + "name": link_name, "maxUses": max_uses, "used": used_count or 0, "remaining": remaining, @@ -1228,6 +1679,7 @@ async def api_invite_info(token: str, request: Request) -> JSONResponse: "claimedAt": claimed_at, "expired": expired, "active": active, + "inactiveReason": (None if active else (reason or "inactive")), "passwordRequired": password_required, "authorized": authorized, }) diff --git a/frontend/app.js b/frontend/app.js index c481c57..b7f7d05 100644 --- a/frontend/app.js +++ b/frontend/app.js @@ -1,5 +1,28 @@ // Frontend logic (mobile-safe picker; no settings UI) const sessionId = (crypto && crypto.randomUUID) ? crypto.randomUUID() : (Math.random().toString(36).slice(2)); +// Simple device fingerprint: stable per-browser using stored id + UA/screen/timezone +function getDeviceId(){ + try{ + let id = localStorage.getItem('immich_drop_device_id'); + if (!id) { id = (crypto && crypto.randomUUID) ? 
crypto.randomUUID() : (Math.random().toString(36).slice(2)); localStorage.setItem('immich_drop_device_id', id); } + return id; + }catch{ return 'anon'; } +} +function computeFingerprint(){ + try{ + const id = getDeviceId(); + const ua = navigator.userAgent || ''; + const lang = navigator.language || ''; + const plat = navigator.platform || ''; + const tz = Intl.DateTimeFormat().resolvedOptions().timeZone || ''; + const scr = (screen && (screen.width+'x'+screen.height+'x'+screen.colorDepth)) || ''; + const raw = [id, ua, lang, plat, tz, scr].join('|'); + // tiny hash + let h = 0; for (let i=0;i({})); if(!res.ok && next.status!=='error'){ @@ -225,7 +249,8 @@ async function uploadChunked(next){ size: next.file.size, last_modified: next.file.lastModified || '', invite_token: INVITE_TOKEN || '', - content_type: next.file.type || 'application/octet-stream' + content_type: next.file.type || 'application/octet-stream', + fingerprint: FINGERPRINT }) }); } catch {} // upload parts @@ -240,6 +265,7 @@ async function uploadChunked(next){ fd.append('chunk_index', String(i)); fd.append('total_chunks', String(total)); if (INVITE_TOKEN) fd.append('invite_token', INVITE_TOKEN); + fd.append('fingerprint', FINGERPRINT); fd.append('chunk', blob, `${next.file.name}.part${i}`); const r = await fetch('/api/upload/chunk', { method:'POST', body: fd }); if (!r.ok) { @@ -260,6 +286,7 @@ async function uploadChunked(next){ last_modified: next.file.lastModified || '', invite_token: INVITE_TOKEN || '', content_type: next.file.type || 'application/octet-stream', + fingerprint: FINGERPRINT, total_chunks: total }) }); const body = await rc.json().catch(()=>({})); diff --git a/frontend/invite.html b/frontend/invite.html index 4135a59..4f514cb 100644 --- a/frontend/invite.html +++ b/frontend/invite.html @@ -129,7 +129,7 @@ document.getElementById('liUses').textContent = (typeof j.remaining==='number') ? String(j.remaining) : '—'; document.getElementById('liExpires').textContent = j.expiresAt ? 
 fmt(j.expiresAt) : 'No expiry';
     document.getElementById('liClaimed').textContent = j.claimed ? 'Yes' : 'No';
-    document.getElementById('liStatus').textContent = j.active ? 'Active' : 'Inactive';
+    document.getElementById('liStatus').textContent = j.active ? 'Active' : ('Inactive' + (j.inactiveReason ? (' ('+j.inactiveReason+')') : ''));
     const dz = document.getElementById('dropzone');
     const fi = document.getElementById('fileInput');
     const itemsEl = document.getElementById('items');
@@ -169,7 +169,7 @@
       // Disable dropzone
       dz.classList.add('opacity-50');
       fi.disabled = true;
-      itemsEl.innerHTML = '<div>This link is not active.</div>';
+      itemsEl.innerHTML = `<div>This link is not active${j.inactiveReason?` (${j.inactiveReason})`:''}.</div>`;
     }
   } catch {}
 })();
diff --git a/frontend/menu.html b/frontend/menu.html
index 1055974..9415f8b 100644
--- a/frontend/menu.html
+++ b/frontend/menu.html
@@ -77,6 +77,47 @@
+      <div>
+        <div>
+          <h2>Manage Links</h2>
+        </div>
+        <div>
+          <input id="searchQ" type="search" placeholder="Search">
+          <select id="sortSel"></select>
+          <button id="btnRefresh">Refresh</button>
+        </div>
+        <div>
+          <table>
+            <thead>
+              <tr>
+                <th><input id="chkAll" type="checkbox"></th>
+                <th>Name</th>
+                <th>Status</th>
+                <th>Uses</th>
+                <th>Expires</th>
+                <th>Album</th>
+                <th>Actions</th>
+              </tr>
+            </thead>
+            <tbody id="invitesTBody"></tbody>
+          </table>
+        </div>
+        <div>
+          <button id="btnDeleteSel">Delete</button>
+          <button id="btnEnableSel">Enable</button>
+          <button id="btnDisableSel">Disable</button>
+        </div>
+      </div>
 If album listing or creation is forbidden by your token, specify a fixed album in the .env file as IMMICH_ALBUM_NAME.
@@ -163,6 +204,9 @@
       linkOut.value = link;
       // Build QR via backend PNG generator (no external libs)
       qrImg.src = `/api/qr?text=${encodeURIComponent(link)}`;
+      // Also refresh the managed links list and highlight the new entry
+      if (j && j.token) { try { LAST_CREATED_TOKEN = j.token; } catch {} }
+      try { await loadInvites(); } catch {}
     }catch(err){ showResult('err', String(err)); }
   };
@@ -182,6 +226,235 @@
   // header.js wires theme + ping and shows banner consistently
   loadAlbums();
+
+  // --- Manage UI logic ---
+  const searchQ = document.getElementById('searchQ');
+  const sortSel = document.getElementById('sortSel');
+  const btnRefresh = document.getElementById('btnRefresh');
+  const invitesTBody = document.getElementById('invitesTBody');
+  const chkAll = document.getElementById('chkAll');
+  const btnDisableSel = document.getElementById('btnDisableSel');
+  const btnEnableSel = document.getElementById('btnEnableSel');
+  const btnDeleteSel = document.getElementById('btnDeleteSel');
+  let INVITES = [];
+  let LAST_CREATED_TOKEN = null;
+
+  async function loadInvites(){
+    const params = new URLSearchParams();
+    const q = (searchQ.value||'').trim(); if (q) params.set('q', q);
+    const sort = (sortSel.value||'').trim(); if (sort) params.set('sort', sort);
+    try{
+      const r = await fetch('/api/invites?'+params.toString());
+      const j = await r.json();
+      INVITES = (j && j.items) ?
+        j.items : [];
+    }catch{ INVITES = []; }
+    renderInvites();
+  }
+  function fmtDayMonth(iso){ try{ const d = new Date(iso); return d.toLocaleDateString(undefined,{ day:'2-digit', month:'short' }); }catch{return '—';} }
+  function statusBadge(row){
+    // Red reasons override disabled (yellow)
+    const inactive = String(row.inactiveReason||'');
+    if (/expired|claimed|exhausted/i.test(inactive)){
+      const label = inactive.charAt(0).toUpperCase()+inactive.slice(1);
+      return `<span>${label}</span>`;
+    }
+    if (row.active){
+      return `<span>Active</span>`;
+    }
+    if (/disabled/i.test(inactive)){
+      return `<span>Disabled</span>`;
+    }
+    return `<span>${row.active?'Active':'Inactive'}</span>`;
+  }
+  function renderInvites(){
+    invitesTBody.innerHTML = INVITES.map(row => {
+      const status = statusBadge(row);
+      const uses = (typeof row.remaining==='number') ? `${row.used||0}/${(row.maxUses<0)?'∞':row.maxUses}` : `${row.used||0}/${(row.maxUses<0)?'∞':row.maxUses??'?'}`;
+      const exp = row.expiresAt ? `${fmtDayMonth(row.expiresAt)}` : '—';
+      const url = location.origin + '/invite/' + row.token;
+      return `
+        <tr data-token="${row.token}">
+          <td><input type="checkbox" class="chkRow" data-token="${row.token}"></td>
+          <td><input class="inName" data-token="${row.token}" value="${row.name||''}"></td>
+          <td>${status}</td>
+          <td>${uses}</td>
+          <td><input type="date" class="inExpires" data-token="${row.token}"> ${exp}</td>
+          <td>${row.albumName || '—'}</td>
+          <td>
+            <a href="${url}" target="_blank">Open</a>
+            <button class="btnCopyLink" data-url="${url}">Copy</button>
+            <button class="btnDetails" data-token="${row.token}">Details</button>
+            <button class="btnQR" data-url="${url}">QR</button>
+            <button class="btnSave" data-token="${row.token}">Save</button>
+          </td>
+        </tr>
+      `;
+    }).join('');
+    // wire row actions
+    // helper to compute change state and toggle Save
+    function updateSaveState(token){
+      const inName = invitesTBody.querySelector(`.inName[data-token="${token}"]`);
+      const inExp = invitesTBody.querySelector(`.inExpires[data-token="${token}"]`);
+      const btn = invitesTBody.querySelector(`.btnSave[data-token="${token}"]`);
+      if (!inName || !inExp || !btn) return;
+      const origName = inName.getAttribute('data-original') || '';
+      const origExp = inExp.getAttribute('data-original') || '';
+      const curName = (inName.value||'').trim();
+      const curExp = inExp.value || '';
+      const changed = (curName !== origName) || (curExp !== origExp);
+      btn.disabled = !changed;
+      btn.classList.toggle('opacity-50', !changed);
+      btn.classList.toggle('cursor-not-allowed', !changed);
+    }
+    // set original values as data attributes and wire change listeners
+    INVITES.forEach(row => {
+      const token = row.token;
+      const inName = invitesTBody.querySelector(`.inName[data-token="${token}"]`);
+      const inExp = invitesTBody.querySelector(`.inExpires[data-token="${token}"]`);
+      if (inName) inName.setAttribute('data-original', (row.name||'').trim());
+      const dStr = row.expiresAt?
+        new Date(row.expiresAt).toISOString().slice(0,10) : '';
+      if (inExp) inExp.setAttribute('data-original', dStr);
+      if (inName) inName.addEventListener('input', ()=>updateSaveState(token));
+      if (inExp) inExp.addEventListener('change', ()=>updateSaveState(token));
+      updateSaveState(token);
+    });
+
+    invitesTBody.querySelectorAll('.btnSave').forEach(btn => btn.onclick = async ()=>{
+      const token = btn.getAttribute('data-token');
+      if (btn.disabled) return;
+      const name = invitesTBody.querySelector(`.inName[data-token="${token}"]`).value.trim();
+      const expVal = invitesTBody.querySelector(`.inExpires[data-token="${token}"]`).value;
+      const payload = { name };
+      if (expVal) {
+        // Set to end-of-day local to avoid off-by-one day on save
+        const dt = new Date(expVal);
+        dt.setHours(23,59,59,999);
+        payload.expiresAt = dt.toISOString().slice(0,19);
+      } else { payload.expiresAt = null; }
+      try{
+        const r = await fetch(`/api/invite/${token}`, { method:'PATCH', headers:{'Content-Type':'application/json','Accept':'application/json'}, body: JSON.stringify(payload) });
+        if (!r.ok) throw new Error('Update failed');
+        await loadInvites();
+      }catch(e){ showResult('err', String(e.message||e)); }
+    });
+    invitesTBody.querySelectorAll('.btnDetails').forEach(btn => btn.onclick = async ()=>{
+      const token = btn.getAttribute('data-token');
+      try{
+        const r = await fetch(`/api/invite/${token}/uploads`);
+        const j = await r.json();
+        const items = (j && j.items) ? j.items : [];
+        const html = `
+          <div>
+            <div>
+              <div>
+                <h3>Uploads</h3>
+                <button class="dlgClose">×</button>
+              </div>
+              <div>
+                ${items.length? `<table><thead><tr><th>When</th><th>IP</th><th>Filename</th><th>Size</th><th>Fingerprint</th></tr></thead><tbody>` + items.map(it=>`<tr><td>${new Date(it.uploadedAt).toLocaleString()}</td><td>${it.ip||''}</td><td>${it.filename||''}</td><td>${(it.size||0).toLocaleString()}</td><td>${(it.fingerprint||'').slice(0,16)}</td></tr>`).join('') + `</tbody></table>` : '<div>No uploads yet.</div>'}
+              </div>
+            </div>
+          </div>
+        `;
+        const wrap = document.createElement('div');
+        wrap.innerHTML = html;
+        const dlg = wrap.firstElementChild;
+        document.body.appendChild(dlg);
+        dlg.querySelectorAll('.dlgClose').forEach(b=> b.onclick = ()=>{ try{ dlg.remove(); }catch{} });
+      }catch(e){ showResult('err', 'Failed to load uploads'); }
+    });
+    invitesTBody.querySelectorAll('.btnQR').forEach(btn => btn.onclick = ()=>{
+      const url = btn.getAttribute('data-url');
+      const html = `
+        <div>
+          <div>
+            <div>
+              <h3>QR Code</h3>
+              <button class="dlgClose">×</button>
+            </div>
+            <div>
+              <img src="/api/qr?text=${encodeURIComponent(url)}" alt="QR">
+              <div>${url}</div>
+            </div>
+          </div>
+        </div>
+      `;
+      const wrap = document.createElement('div');
+      wrap.innerHTML = html;
+      const dlg = wrap.firstElementChild;
+      document.body.appendChild(dlg);
+      dlg.querySelectorAll('.dlgClose').forEach(b=> b.onclick = ()=>{ try{ dlg.remove(); }catch{} });
+    });
+    invitesTBody.querySelectorAll('.btnCopyLink').forEach(btn => btn.onclick = ()=>{
+      const url = btn.getAttribute('data-url');
+      const flash = ()=>{ btn.setAttribute('data-old', btn.innerHTML); btn.innerHTML='Copied'; setTimeout(()=>{ const old=btn.getAttribute('data-old'); if(old) btn.innerHTML=old; }, 1000); };
+      if (navigator.clipboard && navigator.clipboard.writeText) { navigator.clipboard.writeText(url).then(flash).catch(()=>{}); }
+    });
+    if (chkAll) chkAll.checked = false;
+    // Highlight and scroll to newly created invite if present
+    if (LAST_CREATED_TOKEN) {
+      const tr = invitesTBody.querySelector(`tr[data-token="${LAST_CREATED_TOKEN}"]`);
+      if (tr) {
+        tr.classList.add('bg-yellow-50','dark:bg-yellow-900/30');
+        try { tr.scrollIntoView({ behavior:'smooth', block:'center' }); } catch {}
+        setTimeout(()=>{ try{ tr.classList.remove('bg-yellow-50','dark:bg-yellow-900/30'); }catch{} }, 1600);
+      }
+      LAST_CREATED_TOKEN = null;
+    }
+  }
+  btnRefresh.onclick = loadInvites;
+  searchQ.oninput = ()=>{ clearTimeout(searchQ._t); searchQ._t = setTimeout(loadInvites, 300); };
+  sortSel.onchange = loadInvites;
+  chkAll.onchange = ()=>{ invitesTBody.querySelectorAll('.chkRow').forEach(c=>{ c.checked = chkAll.checked; }); };
+  btnDisableSel.onclick = async ()=>{
+    const toks = Array.from(invitesTBody.querySelectorAll('.chkRow:checked')).map(x=>x.getAttribute('data-token'));
+    if (!toks.length) return;
+    try{
+      const r = await fetch('/api/invites/bulk', { method:'POST', headers:{'Content-Type':'application/json','Accept':'application/json'}, body: JSON.stringify({ tokens: toks, action:'disable' }) });
+      if (!r.ok) throw new Error('Bulk disable failed');
+      await loadInvites();
+    }catch(e){ showResult('err', String(e.message||e)); }
+ }; + btnEnableSel.onclick = async ()=>{ + const toks = Array.from(invitesTBody.querySelectorAll('.chkRow:checked')).map(x=>x.getAttribute('data-token')); + if (!toks.length) return; + try{ + const r = await fetch('/api/invites/bulk', { method:'POST', headers:{'Content-Type':'application/json','Accept':'application/json'}, body: JSON.stringify({ tokens: toks, action:'enable' }) }); + if (!r.ok) throw new Error('Bulk enable failed'); + await loadInvites(); + }catch(e){ showResult('err', String(e.message||e)); } + }; + btnDeleteSel.onclick = async ()=>{ + const toks = Array.from(invitesTBody.querySelectorAll('.chkRow:checked')).map(x=>x.getAttribute('data-token')); + if (!toks.length) return; + if (!confirm('Are you sure? This cannot be undone.')) return; + try{ + const r = await fetch('/api/invites/delete', { method:'POST', headers:{'Content-Type':'application/json','Accept':'application/json'}, body: JSON.stringify({ tokens: toks }) }); + if (!r.ok) throw new Error('Delete failed'); + await loadInvites(); + }catch(e){ showResult('err', String(e.message||e)); } + }; + + // Initial load + loadInvites();