Compare commits

...

4 Commits

Author SHA1 Message Date
262cffe4a6 Translate all user-facing output to English
- Scripts: start-webdav.cmd, stop-webdav.cmd (echo messages, REM comments)
- Server: server.js (console.log, HTTP error messages)
- Token tools: token-test.js, token-refresh.js
- Other: auth-poc.js, debug-name-decrypt.js, internxt-client.js, upload.js
- Docs: README, .env.example, docs/*.md

Made-with: Cursor
2026-02-28 16:37:28 +01:00
19dd30e0fb Make logging configurable via WEBDAV_LOG; remove DEBUG
- WEBDAV_LOG=debug|error|off controls file logging
- Replace DEBUG with WEBDAV_LOG in server.js
- .env.example: document WEBDAV_LOG
- Docs: troubleshooting with WEBDAV_LOG=debug

Made-with: Cursor
2026-02-28 16:28:09 +01:00
b463579896 restic/rclone: PROPFIND for files, MKCOL fix, logging, cache
- PROPFIND on file paths (rclone verification after PUT)
- MKCOL: 'already exists' -> 201 instead of 500
- resolveResource: name.bin fallback for files without an extension
- recentFileCache for newly created files (API delay)
- Logging: webdav-debug.log, webdav-errors.log, REQ/RES
- start-webdav.cmd: log output to file, PORT from .env
- Troubleshooting docs for restic 500 errors

Made-with: Cursor
2026-02-28 16:11:22 +01:00
bbf3b899f7 Restic compatibility, POST, recursive MKCOL, findstr /C: fix
- Server: POST for uploads, recursive MKCOL/PUT, ensureFolderExists
- PUT: missing parent folders are created
- Scripts: findstr /C: for literal search (dot conflict fixed)
- Docs: Restic + rclone note

Made-with: Cursor
2026-02-28 15:18:57 +01:00
17 changed files with 537 additions and 358 deletions

View File

@@ -1,22 +1,25 @@
# Internxt API (Production - from internxt/cli .env.template)
DRIVE_API_URL=https://gateway.internxt.com/drive
# Bridge/Network for file up/download (optional, default: gateway.internxt.com/network)
# BRIDGE_URL=https://gateway.internxt.com/network
# Crypto secret - CLI: 6KYQBP847D4ATSFA
# drive-web uses REACT_APP_CRYPTO_SECRET (may differ - extract from drive.internxt.com JS)
CRYPTO_SECRET=6KYQBP847D4ATSFA
# For name decryption (CRYPTO_SECRET2). If unset, CRYPTO_SECRET is used.
# CRYPTO_SECRET2=6KYQBP847D4ATSFA
# DEBUG=1 # Salt decryption test; stack trace on PUT errors
# WEBDAV_LOG=debug # REQ/RES, PUT steps, errors → logs/webdav-debug.log, webdav-errors.log
# WEBDAV_LOG=error # Errors only → logs/webdav-errors.log
# WEBDAV_LOG=off # No file logging (default)
# Browser tokens (for token-test.js and WebDAV) from drive.internxt.com localStorage
# INXT_TOKEN= # xNewToken
# INXT_MNEMONIC= # xMnemonic (for file decryption)
# WebDAV credentials (for Duplicati, Explorer, etc.)
# If unset: any Basic Auth credentials accepted.
# If set: only these credentials accepted.
# WEBDAV_USER=
# WEBDAV_PASS=
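The `WEBDAV_LOG` levels above might be consumed in server.js roughly like this; a minimal sketch, the function and variable names are hypothetical and not taken from the actual source:

```javascript
// Illustrative sketch of the WEBDAV_LOG gate described above.
// Names are hypothetical, not taken from server.js.
function shouldLog(level, mode = process.env.WEBDAV_LOG || 'off') {
  if (mode === 'debug') return true;             // REQ/RES, PUT steps, errors
  if (mode === 'error') return level === 'error'; // errors only
  return false;                                   // 'off': no file logging (default)
}
```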

View File

@@ -1,34 +1,34 @@
# Internxt WebDAV Wrapper
WebDAV access to Internxt Drive for account tiers without native CLI or Rclone access.
## Background
Internxt blocks CLI and Rclone access for certain account types (e.g. Free, Partner).
**Solution:** The web UI (drive.internxt.com) works because it uses `clientName: "drive-web"`. This wrapper mimics that auth and provides a WebDAV server.
## Quick Start
```bash
npm install
cp .env.example .env
# .env: Add INXT_TOKEN, INXT_MNEMONIC, CRYPTO_SECRET (see docs/browser-token-auth.md)
npm start
```
Server runs at `http://127.0.0.1:3005`.
## Docker
```bash
# Build image
docker build -t internxt-webdav .
# Start container (env vars from .env)
docker run -d --name internxt-webdav -p 3005:3005 --env-file .env internxt-webdav
# Or pass individual variables
docker run -d -p 3005:3005 \
-e INXT_TOKEN="..." \
-e INXT_MNEMONIC="..." \
@@ -36,38 +36,39 @@ docker run -d -p 3005:3005 \
internxt-webdav
```
WebDAV available at `http://localhost:3005`.
## WebDAV Features
- **PROPFIND** List directory
- **MKCOL** Create folder
- **DELETE** Delete files/folders
- **MOVE** Move/rename
- **GET** Download files
- **PUT** Upload files
## Clients
- **Duplicati** Backup destination (Pre: `scripts/start-webdav.cmd`, Post: `scripts/stop-webdav.cmd`)
- **rclone** `rclone config` → WebDAV, URL `http://127.0.0.1:3005`
- **restic** via rclone `restic -r rclone:internxt-webdav:restic init`
- **Windows Explorer** Map network drive
## Documentation
| File | Description |
|------|-------------|
| [docs/browser-token-auth.md](docs/browser-token-auth.md) | Extract tokens from browser, WebDAV credentials |
| [docs/webdav-architektur.md](docs/webdav-architektur.md) | Architecture overview |
| [docs/wsl-setup.md](docs/wsl-setup.md) | WSL setup (login with keys) |
| [docs/auth-analysis.md](docs/auth-analysis.md) | Web vs CLI auth analysis |
| [docs/crypto-secret-extract.md](docs/crypto-secret-extract.md) | Extract CRYPTO_SECRET from drive.internxt.com |
## Scripts
| Command | Description |
|---------|-------------|
| `npm start` | Start WebDAV server |
| `npm run token-test` | Verify token |
| `npm run token-refresh` | Open browser, login → tokens extracted automatically |
| `npm run debug-names` | Test name decryption |

View File

@@ -1,17 +1,17 @@
# Internxt Auth Analysis: Web vs CLI vs Rclone
## Core Finding: Client Identification Determines Access
The backend server blocks certain account tiers based on **client identification**:
| Client | clientName | Login Method | Endpoint | Status for restricted tiers |
|--------|------------|--------------|----------|-----------------------------|
| **drive-web** | `drive-web` | `login()` | `/auth/login` | ✅ Allowed |
| **drive-desktop** | `drive-desktop` | `login()` | `/auth/login` | ✅ Allowed |
| **internxt-cli** | `internxt-cli` | `loginAccess()` | `/auth/login/access` | ❌ Blocked |
| **rclone** | (rclone-adapter) | loginAccess-like | `/auth/login/access` | ❌ Blocked |
## Sources
### drive-web ([auth.service.ts](drive-web/src/services/auth.service.ts))
@@ -24,13 +24,13 @@ const getAuthClient = (authType: 'web' | 'desktop') => {
return AUTH_CLIENT[authType];
};
// Login with authClient.login() - NOT loginAccess()
return authClient.login(loginDetails, cryptoProvider)
```
- **createAuthClient()**: `clientName: packageJson.name` = `"drive-web"`
- **createDesktopAuthClient()**: `clientName: "drive-desktop"`
- **Method**: `login()` (not `loginAccess`)
### CLI ([auth.service.ts](https://github.com/internxt/cli/blob/main/src/services/auth.service.ts))
@@ -39,8 +39,8 @@ const authClient = SdkManager.instance.getAuth();
const data = await authClient.loginAccess(loginDetails, CryptoService.cryptoProvider);
```
- **getAppDetails()**: `clientName: packageJson.clientName` = `"internxt-cli"` (from [package.json](https://github.com/internxt/cli/blob/main/package.json))
- **Method**: `loginAccess()` (not `login`)
### SDK Factory ([drive-web](drive-web/src/app/core/factory/sdk/index.ts))
@@ -61,38 +61,38 @@ private static getDesktopAppDetails(): AppDetails {
}
```
## Solution for WebDAV Wrapper
**Strategy:** Configure the auth client to identify as `drive-web` and use `login()` instead of `loginAccess()`.
1. Use **@internxt/sdk** with `Auth.client(apiUrl, appDetails, apiSecurity)`
2. Set **appDetails**: `{ clientName: "drive-web", clientVersion: "1.0" }`
3. Call **login()** (not `loginAccess()`)
4. Implement CryptoProvider like in drive-web (passToHash, decryptText, getKeys, parseAndDecryptUserKeys)
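The four steps above can be sketched as follows; a sketch against the SDK surface described in this analysis (`Auth.client(apiUrl, appDetails, apiSecurity)` and `login(loginDetails, cryptoProvider)`). The `loginDetails` shape is simplified and the `cryptoProvider` is left abstract:

```javascript
// Sketch of the strategy above: identify as drive-web and call login(),
// not loginAccess(). `Auth` stands for @internxt/sdk's Auth class; the
// loginDetails shape is simplified for illustration.
const appDetails = { clientName: 'drive-web', clientVersion: '1.0' }; // step 2

async function loginAsDriveWeb(Auth, apiUrl, loginDetails, cryptoProvider) {
  const authClient = Auth.client(apiUrl, appDetails, {}); // step 1
  return authClient.login(loginDetails, cryptoProvider);  // step 3, not loginAccess()
}
```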
## Dependencies for WebDAV Wrapper
- `@internxt/sdk` (version 1.13.x or compatible; drive-web uses 1.13.2)
- `@internxt/lib` (for aes, Crypto)
- Crypto logic from drive-web: `app/crypto/services/keys.service`, `app/crypto/services/utils`
- Keys format: ECC + Kyber (post-quantum)
## Current Status (as of analysis)
- **CRYPTO_SECRET**: Correct (salt decryption OK with `6KYQBP847D4ATSFA`)
- **loginWithoutKeys**: Still returns "Wrong login credentials"; the backend may reject this flow for certain account types (e.g. mailbox.org partner)
- **login() with keys**: Kyber-WASM fails under Windows (`@dashlane/pqc-kem-kyber512-node`)
## Next Steps
1. **Test approach B**: Browser-based token extraction; log in via the web UI, read the session token from localStorage/DevTools, and use it in the wrapper
2. **login() under Linux**: Kyber package may work under Linux
3. **Internxt support**: Ask whether partner accounts (mailbox.org) use different auth flows
## CRYPTO_SECRET and API URL
From [internxt/cli .env.template](https://github.com/internxt/cli/blob/main/.env.template):
- **DRIVE_API_URL**: `https://gateway.internxt.com/drive`
- **APP_CRYPTO_SECRET**: `6KYQBP847D4ATSFA`
The PoC uses these values as fallback.

View File

@@ -1,44 +1,44 @@
# Browser Token Authentication (Approach B)
Since API login is blocked for your account type, you can log in via the browser and use the session data for the WebDAV wrapper.
## Flow
1. Log in at https://drive.internxt.com
2. Extract token and mnemonic from the browser
3. Add to `.env`
4. Start WebDAV server
## Extracting Tokens
### Step 1: Show all stored keys
Be logged in at **https://drive.internxt.com**. DevTools (F12) → **Console**:
```javascript
// Show all localStorage keys
Object.keys(localStorage).filter(k => k.includes('x') || k.includes('token') || k.includes('Token')).forEach(k => console.log(k));
```
This shows which keys exist (e.g. `xNewToken`, `xMnemonic`, `xUser`).
### Step 2: Read token and mnemonic
```javascript
// Display token and mnemonic
console.log('Token:', localStorage.getItem('xNewToken') || localStorage.getItem('xToken') || '(not found)');
console.log('Mnemonic:', localStorage.getItem('xMnemonic') || '(not found)');
```
### Step 3: If nothing is found
- **Check Application tab:** DevTools → **Application** (or **Storage**) → **Local Storage** → select **https://drive.internxt.com**. Inspect all entries there.
- **Correct URL:** You must be on `https://drive.internxt.com` (not internxt.com) and **logged in**; after login you land on `/drive` or `/app`.
- **Session vs Local:** Some values may be in `sessionStorage`. Test with:
```javascript
console.log('sessionStorage:', Object.keys(sessionStorage));
```
- **Show all keys:** For debugging, list all keys with values:
```javascript
for (let i = 0; i < localStorage.length; i++) {
const k = localStorage.key(i);
@@ -46,91 +46,106 @@ console.log('Mnemonic:', localStorage.getItem('xMnemonic') || '(nicht gefunden)'
}
```
## Add to .env
```
INXT_TOKEN=eyJhbGciOiJIUzI1NiIs...
INXT_MNEMONIC=word1 word2 word3 ...
# Name decryption: CRYPTO_SECRET or CRYPTO_SECRET2 (CLI default: 6KYQBP847D4ATSFA)
CRYPTO_SECRET=6KYQBP847D4ATSFA
# Optional: Enforce WebDAV credentials (otherwise any credentials accepted)
# WEBDAV_USER=backup
# WEBDAV_PASS=secret
```
## Duplicati Pre/Post Scripts (optional)
If the WebDAV server does not run permanently, Duplicati can start it before backup and stop it after:
| Script | Duplicati setting | Path |
|--------|-------------------|------|
| Start | Run before backup | `scripts\start-webdav.cmd` |
| Stop | Run after backup | `scripts\stop-webdav.cmd` |
Under **Settings → Advanced → Scripts**, enter the full path for each, e.g.:
```
C:\Path\to\internxt-webdav\scripts\start-webdav.cmd
C:\Path\to\internxt-webdav\scripts\stop-webdav.cmd
```
Optional port as argument (default: 3005):
```
C:\Path\to\internxt-webdav\scripts\start-webdav.cmd 8080
C:\Path\to\internxt-webdav\scripts\stop-webdav.cmd 8080
```
The server starts in the background and is ready after ~5 seconds.
## Restic + rclone
```bash
restic -r rclone:internxt-webdav:repo-name init
```
The server creates missing folders recursively (MKCOL). On 500 errors, check the server log for `PUT Fehler:` entries and renew the token with `npm run token-refresh`.
### Restic "object not found" / 500
1. **Check port:** rclone URL must match server port exactly. If console shows e.g. `http://127.0.0.1:3010`, set `url = http://127.0.0.1:3010` in rclone.
2. **Single server only:** Stop `npm start` (Ctrl+C), then use only `scripts\start-webdav.cmd` otherwise an old process may respond.
3. **rclone config:** `rclone config` → Remote `internxt-webdav` → `url` = `http://127.0.0.1:PORT` (PORT from server startup).
4. **Logs:** Set `WEBDAV_LOG=debug` in `.env`, restart server, then check `logs\webdav-errors.log` and `logs\webdav-debug.log`.
## WebDAV Credentials (for Duplicati, Explorer)
The server expects **Basic Auth**. Without `WEBDAV_USER`/`WEBDAV_PASS` in `.env`, it accepts **any** credentials; you can use e.g. username `backup` and password `secret` in Duplicati. With `WEBDAV_USER` and `WEBDAV_PASS` set, only those credentials are accepted.
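The acceptance rule above can be sketched as follows; `checkAuth` is a hypothetical helper name, not the actual server.js code:

```javascript
// Sketch of the Basic Auth rule described above: with no configured
// credentials any pair is accepted; with both set, only the exact pair.
// checkAuth is a hypothetical name, not taken from server.js.
function checkAuth(user, pass, env = process.env) {
  if (!env.WEBDAV_USER && !env.WEBDAV_PASS) return true; // accept anything
  return user === env.WEBDAV_USER && pass === env.WEBDAV_PASS;
}
```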
## Start WebDAV Server
```bash
npm start
```
Server runs at `http://127.0.0.1:3005`. Phase 14 active: PROPFIND, MKCOL, DELETE, MOVE, GET, PUT. INXT_MNEMONIC required for GET and PUT.
### PowerShell Copy-Item: "Null character in path"
Windows/.NET sometimes adds null bytes to WebDAV paths. **Workaround:**
```powershell
# Option 1: Direct HTTP (bypasses WebDAV bugs, use UUID from dir i:)
Invoke-WebRequest -Uri "http://127.0.0.1:3005/_.69942103-e16f-4714-89bb-9f9f7d3b1bd5" -OutFile test.md
# Upload via PUT (PowerShell)
Invoke-WebRequest -Uri "http://127.0.0.1:3005/my-file.txt" -Method PUT -Body "Content" -ContentType "application/octet-stream"
# Option 2: Robocopy (copy all files from root)
robocopy "i:\" "." /NFL /NDL
# Option 3: copy the file via drag & drop in Windows Explorer
# Windows Explorer: Map network drive → http://127.0.0.1:3005
```
## Renew Token (on 401 / expired)
Tokens expire after some time (typically hours). On 401 errors or "Unauthorized":
### Option A: Automatic (Chromium)
```bash
npm run token-refresh
```
Opens a browser with drive.internxt.com. Log in; the tokens are extracted and `.env` is updated automatically. Then restart the server.
### Option B: Manual
1. Open **[https://drive.internxt.com](https://drive.internxt.com)** and log in again
2. Read token and mnemonic from Console as in Step 2 above
3. Update `.env` with new values
4. Restart WebDAV server
## Notes
- **Bridge API:** Download uses Internxt Bridge with `x-api-version: 2` and headers `internxt-version`/`internxt-client`. Without these, Bridge returns 400.
- **Security:** Mnemonic and token are highly sensitive. Do not commit to Git, keep `.env` in `.gitignore`.
- **Personal only:** Tokens are bound to your session. This approach does not work for other users.
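The header set mentioned in the Bridge API note can be sketched as below; the `internxt-client`/`internxt-version` values are placeholders (the note only states that the headers must be present, not their values):

```javascript
// Header set the Bridge reportedly requires (per the Bridge API note above).
// The internxt-client/internxt-version values are placeholders.
function bridgeHeaders(basicAuthToken) {
  return {
    Authorization: `Basic ${basicAuthToken}`,
    'x-api-version': '2',           // missing → Bridge answers 400
    'internxt-client': 'drive-web', // placeholder value
    'internxt-version': '1.0',      // placeholder value
  };
}
```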

View File

@@ -1,35 +1,35 @@
# Extract CRYPTO_SECRET from drive.internxt.com
If login fails with "Wrong login credentials", `CRYPTO_SECRET` is likely incorrect. drive-web uses `REACT_APP_CRYPTO_SECRET`, which may differ from the CLI value (`6KYQBP847D4ATSFA`).
## Method 1: DEBUG mode (verify salt decryption)
```bash
DEBUG=1 npm run auth-test
```
- **"Salt decryption OK"** → CRYPTO_SECRET is correct, problem is elsewhere (password, API)
- **"Salt decryption failed"** → CRYPTO_SECRET is wrong
## Method 2: Search for secret in browser
1. Open https://drive.internxt.com
2. DevTools (F12) → **Sources**
3. **Ctrl+Shift+F** (search in all files)
4. Search for:
- `6KYQBP847D4ATSFA`; if found, the same value as the CLI is used
- `REACT_APP_CRYPTO_SECRET` or `CRYPTO_SECRET`
- Hex strings (e.g. 16 chars like `a1b2c3d4e5f6...`)
5. Add found value to `.env`:
```
CRYPTO_SECRET=found_value
```
## Method 3: Build drive-web locally (with known secret)
If you have access to drive-web and know the correct secret:
1. Create `.env` in `drive-web` with `REACT_APP_CRYPTO_SECRET=...`
2. Run `yarn build`
3. Search build artifacts for the embedded value

View File

@@ -1,8 +1,8 @@
# WebDAV Server Architecture
## Recommended Approach: Adapter (not Proxy)
The WebDAV server is an **adapter**: it implements the WebDAV protocol and translates requests into Internxt Drive API + Network calls. It does **not** forward to another WebDAV server.
```mermaid
flowchart LR
@@ -13,7 +13,7 @@ flowchart LR
subgraph Adapter [WebDAV Adapter]
WebDAV[WebDAV Server]
Mapping[Path to UUID]
Crypto[Encrypt/Decrypt]
end
@@ -29,46 +29,46 @@ flowchart LR
Crypto --> Network
```
## Data Flow
| WebDAV Request | Internxt Operation |
|----------------|-------------------|
| PROPFIND (directory listing) | Storage.getFolderContentByUuid |
| GET (read file) | File metadata → Network.download → Decrypt |
| PUT (write file) | Encrypt → Network.upload → createFileEntry |
| MKCOL (create folder) | Storage.createFolderByUuid |
| DELETE | Trash or permanent delete |
| MOVE | Storage.moveFileByUuid / moveFolderByUuid |
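The table above can be expressed as a small dispatch table; a sketch with the operation names as listed, not the actual server.js wiring:

```javascript
// WebDAV method → Internxt operation, mirroring the data-flow table above.
const OPERATIONS = {
  PROPFIND: 'Storage.getFolderContentByUuid',
  GET: 'Network.download',             // plus metadata lookup and decryption
  PUT: 'Network.upload',               // plus encryption and createFileEntry
  MKCOL: 'Storage.createFolderByUuid',
  DELETE: 'trash',                     // trash or permanent delete
  MOVE: 'Storage.moveFileByUuid',      // or moveFolderByUuid for folders
};

function operationFor(method) {
  return OPERATIONS[method] || null;   // unsupported WebDAV methods
}
```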
## Complexity
- **Simple:** PROPFIND, MKCOL (Drive API only, no encryption)
- **Medium:** DELETE, MOVE (Drive API)
- **Complex:** GET, PUT (Bridge/Network + mnemonic encryption)
drive-web uses `Network.client` (Bridge) and `NetworkFacade` for up/download. Bridge credentials come from the user session.
## Implementation Order
1. **Phase 1:** PROPFIND (list directory) ✅ implemented
2. **Phase 2:** MKCOL, DELETE, MOVE ✅ implemented
3. **Phase 3:** GET (download): Bridge + decryption ✅ implemented
4. **Phase 4:** PUT (upload): encryption + Bridge ✅ implemented
### Name Decryption
Internxt uses zero-knowledge encryption. The API returns encrypted names (`name`). When `plain_name` is missing, the server can decrypt names with `CRYPTO_SECRET2` (or `CRYPTO_SECRET`), analogous to the drive-web `aes.decrypt` logic with `secret2-parentId`/`secret2-folderId` keys. Without a configured secret, the raw (encrypted) names are used.
## Token vs Bridge Credentials
- **Drive API:** Uses `xNewToken` (Authorization: Bearer)
- **Network/Bridge:** Requires `bridgeUser` + `userId` (from user credentials) and mnemonic for encryption
**Bridge credentials from refreshUser:** The `/users/refresh` endpoint returns `user` (UserResponseDto) with:
- `bridgeUser`: the user's email
- `userId`: hashed with SHA-256 and used as the Bridge password
- `bucket`: bucket ID for uploads
- `mnemonic`: used for file encryption
- `rootFolderId`: root folder UUID
Thus the browser token (via refreshUser) provides all data needed for Drive API and Bridge.

View File

@@ -1,36 +1,36 @@
# Development under WSL (login() with Keys)
Under Windows, the Kyber WASM module fails. Under WSL (Ubuntu/Debian) it usually works.
## Prerequisites
- WSL2 with Ubuntu or Debian
- Node.js 20+ (`node -v`)
## Setup
```bash
# In WSL terminal
cd /mnt/c/Users/mbusc/source/repos/internxt-webdav
# Install dependencies (including Kyber for login with keys)
npm install
# .env with credentials (INXT_EMAIL, INXT_PASSWORD)
# Optional: DEBUG=1 for salt check
```
## Test Auth PoC with login()
First switch the Auth PoC to `login()` (with keys):
```bash
npm run auth-test
```
If the "Wrong login credentials" error persists, the issue is not Kyber-WASM but the backend/account type.
## Project Path
Windows path: `c:\Users\mbusc\source\repos\internxt-webdav`
WSL path: `/mnt/c/Users/mbusc/source/repos/internxt-webdav`


@@ -1,46 +1,49 @@
@echo off
REM Duplicati Pre-Start: Start WebDAV server
REM In Duplicati: Settings -> Advanced -> Scripts -> Run before backup
REM Path: C:\path\to\internxt-webdav\scripts\start-webdav.cmd
REM Optional: Port as argument (e.g. start-webdav.cmd 8080)
cd /d "%~dp0.."
if "%1"=="" (set PORT=3005) else (set PORT=%1)
for /f "tokens=2 delims==" %%a in ('findstr /B "PORT=" .env 2^>nul') do set PORT=%%a
REM Check .env and token
if not exist .env (
echo ERROR: .env missing. Copy from .env.example and add INXT_TOKEN.
exit /b 1
)
findstr /B "INXT_TOKEN=" .env 2>nul | findstr "INXT_TOKEN=." > nul 2>&1
if %errorlevel% neq 0 (
echo ERROR: INXT_TOKEN missing or empty in .env. Token expired? Run npm run token-refresh.
exit /b 1
)
REM Check if server already running (0.0.0.0:0 = Listening)
netstat -an | findstr /C:":%PORT% " | findstr /C:"0.0.0.0:0" > nul 2>&1
if %errorlevel% equ 0 (
echo WebDAV server already running.
exit /b 0
)
if not exist "%~dp0..\logs" mkdir "%~dp0..\logs"
set LOGFILE=%~dp0..\logs\webdav.log
echo [%date% %time%] Starting WebDAV server... >> "%LOGFILE%"
echo Starting WebDAV server... Log: %LOGFILE%
start /B node src/server.js >> "%LOGFILE%" 2>&1
REM Wait and check if server responds (OPTIONS does not require auth)
set RETRIES=0
:wait
timeout /t 2 /nobreak > nul
powershell -NoProfile -Command "try { (Invoke-WebRequest -Uri http://127.0.0.1:%PORT%/ -Method OPTIONS -UseBasicParsing -TimeoutSec 2).StatusCode -eq 200 } catch { exit 1 }" > nul 2>&1
if %errorlevel% equ 0 (
echo WebDAV server started.
exit /b 0
)
set /a RETRIES+=1
if %RETRIES% geq 5 (
echo ERROR: Server not responding. Check token: npm run token-test
exit /b 1
)
goto wait
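The readiness loop above (poll every 2 seconds, give up after 5 tries) maps onto a small Node.js helper; `probe` is a stand-in for the OPTIONS request, which needs no auth. A sketch, not code from the repo:

```javascript
// Sketch: retry a health probe a limited number of times, as the
// start script's :wait loop does with timeout + Invoke-WebRequest.
async function waitForServer(probe, retries = 5, delayMs = 2000) {
  for (let i = 0; i < retries; i++) {
    if (await probe().catch(() => false)) return true; // server answered
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  return false; // give up, like the script's "Server not responding"
}
```

On Node 18+ the probe could be `() => fetch('http://127.0.0.1:3005/', { method: 'OPTIONS' }).then((r) => r.ok)`.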


@@ -1,18 +1,18 @@
@echo off
REM Duplicati Post-Backup: Stop WebDAV server
REM In Duplicati: Settings -> Advanced -> Scripts -> Run after backup
REM Path: C:\path\to\internxt-webdav\scripts\stop-webdav.cmd
REM Optional: Port as argument (e.g. stop-webdav.cmd 8080)
if "%1"=="" (set PORT=3005) else (set PORT=%1)
REM Find and terminate process on port
REM Filter: Port + "0.0.0.0:0" = Listening (language-independent)
for /f "tokens=5" %%a in ('netstat -ano 2^>nul ^| findstr /C:":%PORT% " ^| findstr /C:"0.0.0.0:0"') do (
taskkill /PID %%a /F > nul 2>&1
echo WebDAV server stopped - PID %%a
exit /b 0
)
echo WebDAV server was not running.
exit /b 0


@@ -21,7 +21,7 @@ const password = process.env.INXT_PASSWORD;
const twoFactorCode = process.env.INXT_2FA || '';
if (!email || !password) {
console.error('Error: INXT_EMAIL and INXT_PASSWORD must be set.');
process.exit(1);
}
@@ -70,10 +70,10 @@ const apiSecurity = {
};
async function main() {
console.log('Internxt Auth PoC - Login with clientName "drive-web"');
console.log('API:', DRIVE_API_URL);
console.log('Email:', email);
console.log('2FA:', twoFactorCode ? '***' + twoFactorCode.slice(-2) : '(not set)');
console.log('');
const authClient = Auth.client(DRIVE_API_URL, appDetails, apiSecurity);
@@ -84,9 +84,9 @@ async function main() {
const details = await authClient.securityDetails(email.toLowerCase());
const salt = decryptText(details.encryptedSalt);
const isHex = /^[0-9a-f]+$/i.test(salt);
console.log('DEBUG: Salt decryption OK, format:', isHex ? 'Hex' : 'other');
} catch (e) {
console.error('DEBUG: Salt decryption failed - CRYPTO_SECRET may be wrong:', e.message);
}
}
@@ -100,27 +100,27 @@ async function main() {
cryptoProvider
);
console.log('Login successful!');
console.log('Token:', result.newToken?.substring(0, 20) + '...');
console.log('User:', result.user?.email);
console.log('');
console.log('The WebDAV wrapper can be built with this auth.');
} catch (err) {
console.error('Login failed:', err.message);
if (err.response?.data) {
console.error('Response:', JSON.stringify(err.response.data, null, 2));
}
if (err.message?.includes('cli access not allowed') || err.message?.includes('rclone access not allowed')) {
console.error('');
console.error('Note: This error should NOT occur with clientName "drive-web".');
}
if (err.message?.includes('Wrong login credentials')) {
console.error('');
console.error('Possible causes:');
console.error('1. CRYPTO_SECRET wrong - drive-web uses REACT_APP_CRYPTO_SECRET (possibly different value)');
console.error(' -> Set DEBUG=1 and run again to verify salt decryption');
console.error('2. 2FA code expired (valid 30s) - enter new code');
console.error('3. Password/email incorrect');
}
process.exit(1);
}


@@ -15,13 +15,13 @@ const DRIVE_API_URL = process.env.DRIVE_API_URL || 'https://gateway.internxt.com
const token = process.env.INXT_TOKEN;
if (!token) {
console.error('INXT_TOKEN missing');
process.exit(1);
}
const secret = process.env.CRYPTO_SECRET2 || process.env.CRYPTO_SECRET;
if (!secret) {
console.error('CRYPTO_SECRET or CRYPTO_SECRET2 missing in .env');
process.exit(1);
}
@@ -29,7 +29,7 @@ const appDetails = { clientName: 'drive-web', clientVersion: '1.0' };
const apiSecurity = {
token,
unauthorizedCallback: () => {
throw new Error('Token invalid');
},
};
@@ -43,7 +43,7 @@ async function main() {
const rootUuid = user?.rootFolderUuid || user?.rootFolderId;
if (!rootUuid) {
console.error('Root folder not found');
process.exit(1);
}
@@ -58,15 +58,15 @@ async function main() {
const folders = content?.children || [];
const files = content?.files || [];
console.log('=== Name decryption in', path, '===');
console.log('CRYPTO_SECRET2/CRYPTO_SECRET:', secret ? secret.substring(0, 4) + '***' : '(not set)');
console.log('');
for (const c of folders) {
const plain = getPlainName(c.name, c.plain_name ?? c.plainName, c.parent_id ?? c.parentId, null);
const ok = plain !== c.name && plain.length > 0 && !/^[A-Za-z0-9+/=]{20,}$/.test(plain);
console.log('Folder:', ok ? '✓' : '✗', plain);
console.log(' encrypted:', c.name?.substring(0, 50) + '...');
console.log(' parent_id:', c.parent_id ?? c.parentId);
console.log('');
}
@@ -74,14 +74,14 @@ async function main() {
for (const f of files) {
const plain = getPlainName(f.name, f.plain_name ?? f.plainName, null, f.folder_id ?? f.folderId);
const ok = plain !== f.name && plain.length > 0 && !/^[A-Za-z0-9+/=]{20,}$/.test(plain);
console.log('File:', ok ? '✓' : '✗', plain);
console.log(' encrypted:', f.name?.substring(0, 50) + '...');
console.log(' folder_id:', f.folder_id ?? f.folderId);
console.log('');
}
if (folders.length === 0 && files.length === 0) {
console.log('(Empty folder)');
}
}


@@ -19,7 +19,7 @@ export function createClients(token) {
const apiSecurity = {
token,
unauthorizedCallback: () => {
throw new Error('Token expired or invalid');
},
};
return {


@@ -107,6 +107,7 @@ export async function resolveResource(storage, rootFolderUuid, path) {
const bucket = file.bucket ?? file.bucket_id;
const fileId = file.fileId ?? file.file_id ?? file.networkFileId;
const name = getPlainName(file.name, file.plain_name ?? file.plainName, null, file.folder_id ?? file.folderId);
const size = file.size ?? file.file_size ?? 0;
return {
uuid: file.uuid,
type: 'file',
@@ -114,6 +115,7 @@ export async function resolveResource(storage, rootFolderUuid, path) {
parentUuid: parent.uuid,
bucket,
fileId,
size,
};
}
}
@@ -127,14 +129,21 @@ export async function resolveResource(storage, rootFolderUuid, path) {
return { uuid: folder.uuid, type: 'folder', name, parentUuid: parent.uuid };
}
let file = content?.files?.find((f) => {
const name = getPlainName(f.name, f.plain_name ?? f.plainName, null, f.folder_id ?? f.folderId);
return sanitize(name).toLowerCase() === sanitize(childName).toLowerCase();
});
if (!file && !childName.includes('.')) {
file = content?.files?.find((f) => {
const name = getPlainName(f.name, f.plain_name ?? f.plainName, null, f.folder_id ?? f.folderId);
return sanitize(name).toLowerCase() === sanitize(childName + '.bin').toLowerCase();
});
}
if (file) {
const bucket = file.bucket ?? file.bucket_id;
const fileId = file.fileId ?? file.file_id ?? file.networkFileId;
const name = getPlainName(file.name, file.plain_name ?? file.plainName, null, file.folder_id ?? file.folderId);
const size = file.size ?? file.file_size ?? 0;
return {
uuid: file.uuid,
type: 'file',
@@ -142,6 +151,7 @@ export async function resolveResource(storage, rootFolderUuid, path) {
parentUuid: parent.uuid,
bucket,
fileId,
size,
};
}
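The `name.bin` fallback added in this hunk boils down to a two-step lookup. An isolated sketch, with `sanitize` reduced to the identity function and plain `name` fields for illustration only:

```javascript
// Sketch: match a file by name; if the requested name has no extension,
// retry with ".bin" appended (files may have been stored with a .bin suffix).
const sanitize = (s) => s; // stand-in for the project's sanitizeForPath

function findFileByName(files, childName) {
  const match = (want) =>
    files.find((f) => sanitize(f.name).toLowerCase() === sanitize(want).toLowerCase());
  let file = match(childName);
  if (!file && !childName.includes('.')) file = match(childName + '.bin');
  return file ?? null;
}
```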


@@ -6,6 +6,8 @@
*/
import 'dotenv/config';
import fs from 'fs';
import path from 'path';
import express from 'express';
import { createClients, refreshUser } from './internxt-client.js';
import { pathToSegments, segmentsToPath, listFolder, resolveFolder, resolveResource } from './path-resolver.js';
@@ -18,10 +20,55 @@ const token = process.env.INXT_TOKEN;
const mnemonic = process.env.INXT_MNEMONIC;
if (!token) {
console.error('Error: INXT_TOKEN must be set. See docs/browser-token-auth.md');
process.exit(1);
}
const LOG_DIR = path.join(process.cwd(), 'logs');
/** WEBDAV_LOG: debug | error | off controls file logging (logs/webdav-*.log) */
const WEBDAV_LOG = (process.env.WEBDAV_LOG || '').toLowerCase();
const LOG_DEBUG = WEBDAV_LOG === 'debug' || WEBDAV_LOG === '1';
const LOG_ERROR = LOG_DEBUG || WEBDAV_LOG === 'error';
/** Writes to logs/webdav-debug.log (only with WEBDAV_LOG=debug) */
function logToFile(...args) {
if (!LOG_DEBUG) return;
const msg = args.map((a) => (typeof a === 'object' ? JSON.stringify(a) : String(a))).join(' ') + '\n';
try {
fs.mkdirSync(LOG_DIR, { recursive: true });
fs.appendFileSync(path.join(LOG_DIR, 'webdav-debug.log'), `[${new Date().toISOString()}] ${msg}`);
} catch (_) {}
}
/** Writes errors to logs/webdav-errors.log (with WEBDAV_LOG=debug or error) */
function logError(...args) {
if (!LOG_ERROR) return;
const msg = args.map((a) => (typeof a === 'object' ? JSON.stringify(a) : String(a))).join(' ') + '\n';
try {
fs.mkdirSync(LOG_DIR, { recursive: true });
fs.appendFileSync(path.join(LOG_DIR, 'webdav-errors.log'), `[${new Date().toISOString()}] ${msg}`);
} catch (e) {
console.error('logError failed:', e.message);
}
}
process.on('unhandledRejection', (reason, promise) => {
logError('unhandledRejection', reason);
});
process.on('uncaughtException', (err) => {
logError('uncaughtException', err.message, err.stack);
});
/** Cache for newly created files: rclone verifies via GET right after PUT; the API can lag behind */
const recentFileCache = new Map();
const CACHE_TTL_MS = 60_000;
function cacheRecentFile(pathKey, resource) {
recentFileCache.set(pathKey, resource);
setTimeout(() => recentFileCache.delete(pathKey), CACHE_TTL_MS);
}
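`recentFileCache` above expires entries via `setTimeout`. The same TTL semantics can be written with an injected clock, which makes the 60-second window testable without real waiting (an illustrative variant, not the server's implementation):

```javascript
// Sketch: TTL cache with an injectable clock instead of setTimeout.
// Entries are served for ttlMs, then treated as absent.
function makeTtlCache(ttlMs, now = Date.now) {
  const entries = new Map();
  return {
    set(key, value) { entries.set(key, { value, expires: now() + ttlMs }); },
    get(key) {
      const e = entries.get(key);
      if (!e) return undefined;
      if (now() > e.expires) { entries.delete(key); return undefined; } // expired
      return e.value;
    },
  };
}
```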
const app = express();
// WebDAV credentials: if set, client credentials are checked against them.
@@ -39,7 +86,7 @@ function basicAuth(req, res, next) {
const auth = req.headers?.authorization;
if (!auth || !auth.startsWith('Basic ')) {
res.set('WWW-Authenticate', 'Basic realm="Internxt WebDAV"');
res.status(401).send('Authentication required');
return;
}
@@ -51,13 +98,13 @@ function basicAuth(req, res, next) {
pass = colon >= 0 ? decoded.slice(colon + 1) : '';
} catch (_) {
res.set('WWW-Authenticate', 'Basic realm="Internxt WebDAV"');
res.status(401).send('Invalid credentials');
return;
}
if (authStrict && (user !== webdavUser || pass !== webdavPass)) {
res.set('WWW-Authenticate', 'Basic realm="Internxt WebDAV"');
res.status(401).send('Invalid credentials');
return;
}
@@ -67,8 +114,23 @@ function basicAuth(req, res, next) {
app.use(basicAuth);
app.use((req, res, next) => {
if (req.url && req.url.includes('restic')) {
logToFile('REQ', req.method, req.url);
}
const origSend = res.send;
res.send = function (...args) {
if (req.url && req.url.includes('restic')) {
logToFile('RES', req.method, req.url, 'status:', res.statusCode);
if (res.statusCode >= 400) logError('HTTP', res.statusCode, req.method, req.url);
}
return origSend.apply(this, args);
};
next();
});
// Request-Body: PUT/POST als Raw (Datei-Upload), PROPFIND als Text
app.use(express.raw({ type: (req) => req.method === 'PUT' || req.method === 'POST', limit: '1gb' }));
app.use(express.text({ type: 'application/xml', limit: '1kb' }));
/**
@@ -170,18 +232,37 @@ async function handlePropfind(req, res) {
const baseUrl = `${req.protocol}://${req.get('host')}`;
try {
const { storage } = createClients(token);
const refresh = await refreshUser(token);
const user = refresh.user;
const rootUuid = user?.rootFolderUuid || user?.rootFolderId || user?.root_folder_id;
if (!rootUuid) {
res.status(500).send('Root folder not found');
return;
}
// PROPFIND on a file path (e.g. rclone verification after PUT)
let resource = await resolveResource(storage, rootUuid, path);
if (!resource) resource = recentFileCache.get(path);
if (resource && resource.type === 'file') {
const segments = pathToSegments(path);
const fileName = segments[segments.length - 1] || 'file';
const items = [{
path,
name: resource.name || fileName,
isCollection: false,
updatedAt: new Date().toISOString(),
size: resource.size ?? 0,
}];
const xml = buildPropfindResponse(baseUrl, items).replace(/\0/g, '');
res.set('Content-Type', 'application/xml; charset="utf-8"');
res.status(207).send(xml);
return;
}
const listing = await listFolder(storage, rootUuid, path);
if (!listing) {
res.status(404).send('Not found');
return;
}
@@ -201,7 +282,7 @@ async function handlePropfind(req, res) {
// Children at depth 1
if (depth !== '0') {
for (const f of listing.folders) {
const safeName = sanitizeForPath(f.name) || 'Unnamed';
const childPath = path === '/' ? '/' + safeName : path + '/' + safeName;
items.push({
path: childPath,
@@ -212,7 +293,7 @@ async function handlePropfind(req, res) {
});
}
for (const f of listing.files) {
const rawName = sanitizeForPath(f.name) || 'Unnamed';
const useUuidPath = /[+=]/.test(rawName) || rawName.length > 80;
const pathSegment = useUuidPath ? `_.${f.uuid}` : rawName;
const childPath = path === '/' ? '/' + pathSegment : path + '/' + pathSegment;
@@ -232,12 +313,12 @@ async function handlePropfind(req, res) {
res.set('Content-Type', 'application/xml; charset="utf-8"');
res.status(207).send(xml);
} catch (err) {
console.error('PROPFIND error:', err.message);
if (err.message?.includes('Token') || err.response?.status === 401) {
res.status(401).send('Unauthorized: token expired. Log in again at https://drive.internxt.com');
return;
}
res.status(500).send(err.message || 'Internal error');
}
}
@@ -249,12 +330,42 @@ async function getContext() {
const refresh = await refreshUser(token);
const user = refresh.user;
const rootUuid = user?.rootFolderUuid || user?.rootFolderId || user?.root_folder_id;
if (!rootUuid) throw new Error('Root folder not found');
return { storage, rootUuid };
}
/**
* Ensures that a folder path exists (creates missing parents recursively).
* @returns {Promise<{ uuid: string } | null>} folder or null
*/
async function ensureFolderExists(storage, rootUuid, path) {
const segments = pathToSegments(path);
let currentUuid = rootUuid;
for (const segment of segments) {
const [contentPromise] = storage.getFolderContentByUuid({ folderUuid: currentUuid });
const content = await contentPromise;
const child = content?.children?.find((c) => {
const name = getPlainName(c.name, c.plain_name ?? c.plainName, c.parent_id ?? c.parentId, null);
return sanitizeForPath(name).toLowerCase() === sanitizeForPath(segment).toLowerCase();
});
if (child) {
currentUuid = child.uuid;
} else {
const [createPromise] = storage.createFolderByUuid({
parentFolderUuid: currentUuid,
plainName: segment,
});
const created = await createPromise;
currentUuid = created?.uuid;
if (!currentUuid) return null;
}
}
return { uuid: currentUuid };
}
/**
* MKCOL handler: create folder (recursive: missing parents are created)
*/
async function handleMkcol(req, res) {
let path = req.url || '/';
@@ -263,14 +374,14 @@ async function handleMkcol(req, res) {
} catch (_) {}
if (!path.startsWith('/')) path = '/' + path;
if (path === '/') {
res.status(403).send('Root cannot be created');
return;
}
if (path.endsWith('/')) path = path.slice(0, -1);
const segments = pathToSegments(path);
if (segments.length === 0) {
res.status(403).send('Root already exists');
return;
}
@@ -279,9 +390,12 @@ async function handleMkcol(req, res) {
try {
const { storage, rootUuid } = await getContext();
const parent =
parentPath && parentPath !== '/'
? await ensureFolderExists(storage, rootUuid, parentPath)
: { uuid: rootUuid };
if (!parent) {
res.status(409).send('Parent folder does not exist');
return;
}
@@ -291,7 +405,7 @@ async function handleMkcol(req, res) {
res.status(201).send();
return;
}
res.status(405).send('Resource already exists (not a folder)');
return;
}
@@ -302,12 +416,16 @@ async function handleMkcol(req, res) {
await createPromise;
res.status(201).send();
} catch (err) {
if (err?.message?.toLowerCase().includes('already exists')) {
res.status(201).send();
return;
}
console.error('MKCOL error:', err.message);
if (err.message?.includes('Token') || err.response?.status === 401) {
res.status(401).send('Unauthorized: renew token at https://drive.internxt.com');
return;
}
res.status(500).send(err.message || 'Internal error');
}
}
@@ -322,7 +440,7 @@ async function handleDelete(req, res) {
if (!path.startsWith('/')) path = '/' + path;
if (path.endsWith('/')) path = path.slice(0, -1);
if (path === '/') {
res.status(403).send('Root cannot be deleted');
return;
}
@@ -330,7 +448,7 @@ async function handleDelete(req, res) {
const { storage, rootUuid } = await getContext();
const resource = await resolveResource(storage, rootUuid, path);
if (!resource) {
res.status(404).send('Not found');
return;
}
@@ -341,12 +459,12 @@ async function handleDelete(req, res) {
}
res.status(204).send();
} catch (err) {
console.error('DELETE error:', err.message);
if (err.message?.includes('Token') || err.response?.status === 401) {
res.status(401).send('Unauthorized: renew token at https://drive.internxt.com');
return;
}
res.status(500).send(err.message || 'Internal error');
}
}
@@ -357,7 +475,7 @@ async function handleMove(req, res) {
let path = req.url || '/';
const destinationHeader = req.headers['destination'];
if (!destinationHeader) {
res.status(400).send('Destination header missing');
return;
}
@@ -367,7 +485,7 @@ async function handleMove(req, res) {
if (!path.startsWith('/')) path = '/' + path;
if (path.endsWith('/')) path = path.slice(0, -1);
if (path === '/') {
res.status(403).send('Root cannot be moved');
return;
}
@@ -376,7 +494,7 @@ async function handleMove(req, res) {
const destUrl = new URL(destinationHeader);
destPath = decodeURIComponent(destUrl.pathname || '/');
} catch (_) {
res.status(400).send('Invalid destination URL');
return;
}
if (!destPath.startsWith('/')) destPath = '/' + destPath;
@@ -388,7 +506,7 @@ async function handleMove(req, res) {
const { storage, rootUuid } = await getContext();
const source = await resolveResource(storage, rootUuid, path);
if (!source) {
res.status(404).send('Source not found');
return;
}
@@ -398,14 +516,14 @@ async function handleMove(req, res) {
const destParent = await resolveFolder(storage, rootUuid, destParentPath);
if (!destParent) {
res.status(409).send('Destination folder does not exist');
return;
}
const existingDest = await resolveResource(storage, rootUuid, destPath);
if (existingDest) {
if (!overwrite) {
res.status(412).send('Destination exists, overwrite not allowed');
return;
}
if (existingDest.type === 'folder') {
@@ -427,12 +545,12 @@ async function handleMove(req, res) {
}
res.status(201).send();
} catch (err) {
console.error('MOVE error:', err.message);
if (err.message?.includes('Token') || err.response?.status === 401) {
res.status(401).send('Unauthorized: renew token at https://drive.internxt.com');
return;
}
res.status(500).send(err.message || 'Internal error');
}
}
@@ -440,35 +558,37 @@ async function handleMove(req, res) {
* GET handler: download a file
*/
async function handleGet(req, res) {
let path = getPathFromRequest(req);
if (!path.startsWith('/')) path = '/' + path;
if (path.endsWith('/')) path = path.slice(0, -1);
path = sanitizeForPath(path);
if (path === '/') {
res.status(405).send('Directory cannot be downloaded');
return;
}
if (!mnemonic) {
res.status(500).send('INXT_MNEMONIC required for file decryption');
return;
}
try {
const { storage, rootUuid } = await getContext();
let resource = await resolveResource(storage, rootUuid, path);
if (!resource) {
resource = recentFileCache.get(path);
if (resource) logToFile('GET cache hit', path);
}
if (!resource) {
res.status(404).send('Not found');
return;
}
if (resource.type !== 'file') {
res.status(405).send('Not a file');
return;
}
if (!resource.bucket || !resource.fileId) {
res.status(404).send('File has no content (empty file)');
return;
}
@@ -478,7 +598,7 @@ async function handleGet(req, res) {
const bridgePass = user?.userId;
if (!bridgeUser || !bridgePass) {
res.status(500).send('Bridge credentials missing');
return;
}
@@ -497,12 +617,12 @@ async function handleGet(req, res) {
else res.destroy();
});
} catch (err) {
console.error('GET error:', err.message);
if (err.message?.includes('Token') || err.response?.status === 401) {
res.status(401).send('Unauthorized: renew token at https://drive.internxt.com');
return;
}
if (!res.headersSent) res.status(500).send(err.message || 'Internal error');
}
}
@@ -510,12 +630,10 @@ async function handleGet(req, res) {
* HEAD handler: like GET, but headers only
*/
async function handleHead(req, res) {
let path = req.url || '/';
try {
path = decodeURIComponent(path);
} catch (_) {}
let path = getPathFromRequest(req);
if (!path.startsWith('/')) path = '/' + path;
if (path.endsWith('/')) path = path.slice(0, -1);
path = sanitizeForPath(path);
if (path === '/') {
res.status(405).send();
return;
@@ -523,7 +641,8 @@ async function handleHead(req, res) {
try {
const { storage, rootUuid } = await getContext();
let resource = await resolveResource(storage, rootUuid, path);
if (!resource) resource = recentFileCache.get(path);
if (!resource) {
res.status(404).send();
return;
@@ -541,12 +660,12 @@ async function handleHead(req, res) {
/** Parses a file name into plainName + type (extension) */
function parseFileName(name) {
if (!name || typeof name !== 'string') return { plainName: 'Unnamed', type: '' };
const s = sanitizeForPath(name);
const lastDot = s.lastIndexOf('.');
if (lastDot <= 0) return { plainName: s || 'Unnamed', type: '' };
return {
plainName: s.slice(0, lastDot) || 'Unnamed',
type: s.slice(lastDot + 1).toLowerCase() || '',
};
}
@@ -560,31 +679,40 @@ async function handlePut(req, res) {
if (path.endsWith('/')) path = path.slice(0, -1);
path = sanitizeForPath(path);
if (LOG_DEBUG) {
console.log('PUT', path, 'Content-Length:', req.headers['content-length'], 'Body:', req.body?.length ?? 0);
}
if (path === '/') {
res.status(403).send('Root cannot be overwritten');
return;
}
const buffer = req.body;
if (!Buffer.isBuffer(buffer)) {
res.status(400).send('No file content received');
return;
}
if (!mnemonic) {
res.status(500).send('INXT_MNEMONIC required for file encryption');
return;
}
try {
logToFile('PUT try start', path);
const { storage, rootUuid } = await getContext();
logToFile('PUT getContext OK', path);
const segments = pathToSegments(path);
const parentPath = segmentsToPath(segments.slice(0, -1));
const fileName = segments[segments.length - 1];
let parent = await resolveFolder(storage, rootUuid, parentPath);
if (!parent && parentPath && parentPath !== '/') {
parent = await ensureFolderExists(storage, rootUuid, parentPath);
}
if (!parent) {
res.status(409).send('Destination folder does not exist');
return;
}
@@ -593,7 +721,7 @@ async function handlePut(req, res) {
if (existing.type === 'file') {
await storage.deleteFileByUuid(existing.uuid);
} else {
res.status(409).send('Destination is a folder');
return;
}
}
@@ -608,20 +736,29 @@ async function handlePut(req, res) {
const bucketId = user?.bucket;
if (!bridgeUser || !bridgePass || !bucketId) {
res.status(500).send('Bridge credentials or bucket missing');
return;
}
const { plainName, type } = parseFileName(fileName);
let fileId;
logToFile('PUT Upload start', path);
try {
fileId = await uploadFileBuffer({
bucketId,
bridgeUser,
bridgePass,
mnemonic,
buffer: uploadBuffer,
});
} catch (uploadErr) {
logError('PUT Upload (Bridge) failed', path, uploadErr.message);
throw uploadErr;
}
logToFile('PUT Upload OK', path);
const date = new Date().toISOString();
logToFile('PUT createFileEntry start', path);
const doCreate = async () => {
await storage.createFileEntryByUuid({
@@ -639,7 +776,11 @@ async function handlePut(req, res) {
try {
await doCreate();
logToFile('PUT createFileEntry OK', path);
const fullName = type ? `${plainName}.${type}` : plainName;
cacheRecentFile(path, { type: 'file', bucket: bucketId, fileId, name: fullName, size: buffer.length });
} catch (createErr) {
logError('PUT createFileEntry failed', path, createErr.message);
// "File already exists": delete the file by name and retry
if (createErr?.message?.toLowerCase().includes('already exists')) {
const [contentPromise] = storage.getFolderContentByUuid({ folderUuid: parent.uuid });
@@ -664,19 +805,24 @@ async function handlePut(req, res) {
res.status(201).send();
} catch (err) {
console.error('PUT Fehler:', err.message);
logError('PUT CATCH', path, err?.message ?? String(err), err?.response?.status, err?.response?.data);
const apiErr = err.response?.data ? JSON.stringify(err.response.data) : '';
const status = err.response?.status;
if (LOG_ERROR) logError('Stack:', err.stack);
console.error('PUT error:', path, err.message, status ? `HTTP ${status}` : '', apiErr || '');
if (LOG_DEBUG) console.error(err.stack);
if (err.message?.includes('Token') || err.response?.status === 401) {
res.status(401).send('Nicht autorisiert – Token erneuern: https://drive.internxt.com');
res.status(401).send('Unauthorized – refresh your token at https://drive.internxt.com');
return;
}
if (!res.headersSent) res.status(500).send(err.message || 'Interner Fehler');
if (!res.headersSent) res.status(500).send(err.message || 'Internal error');
}
}
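The PUT path above now creates missing parent folders before uploading (the `ensureFolderExists` call). The helper itself is not shown in this diff; a minimal sketch of the walk it implies, using a hypothetical `storage` stand-in with `listChildren`/`createFolder` instead of the Internxt SDK's `getFolderContentByUuid`/`createFolderByUuid`:

```javascript
// Hedged sketch of recursive folder creation for PUT/MKCOL. `storage` is a
// stand-in interface, not the real SDK client used by server.js.
async function ensureFolderExists(storage, rootUuid, folderPath) {
  const segments = folderPath.split('/').filter(Boolean);
  let current = rootUuid;
  for (const name of segments) {
    const children = await storage.listChildren(current);
    const match = children.find((c) => c.name === name);
    // Reuse an existing folder; otherwise create the missing segment.
    current = match ? match.uuid : (await storage.createFolder(current, name)).uuid;
  }
  return { uuid: current };
}
```

Calling it twice with the same path is idempotent, which is what rclone and restic rely on when they re-issue MKCOL for directories that already exist.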
// WebDAV Endpoints
app.options('*', (req, res) => {
res.set('DAV', '1, 2');
res.set('Allow', 'OPTIONS, PROPFIND, GET, HEAD, PUT, DELETE, MKCOL, MOVE');
res.set('Allow', 'OPTIONS, PROPFIND, GET, HEAD, PUT, POST, DELETE, MKCOL, MOVE');
res.sendStatus(200);
});
@@ -700,9 +846,9 @@ app.use((req, res, next) => {
});
return;
}
if (req.method === 'PUT') {
if (req.method === 'PUT' || req.method === 'POST') {
handlePut(req, res).catch((err) => {
console.error('PUT unhandled:', err);
logError('PUT unhandled', err?.message, err?.stack);
if (!res.headersSent) res.status(500).send(err.message);
});
return;
@@ -733,6 +879,7 @@ app.use((req, res, next) => {
app.listen(PORT, () => {
console.log(`Internxt WebDAV Server http://127.0.0.1:${PORT}`);
console.log('Phase 14: PROPFIND, MKCOL, DELETE, MOVE, GET, PUT aktiv.');
console.log('Verwendung: z.B. Windows Explorer → Netzlaufwerk verbinden');
console.log('Phase 14: PROPFIND, MKCOL, DELETE, MOVE, GET, PUT active.');
console.log(`rclone/restic: URL must be http://127.0.0.1:${PORT} (same port!)`);
console.log('Usage: e.g. Windows Explorer → Map network drive');
});
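The commit log mentions a `recentFileCache` so that PROPFIND can see a file immediately after PUT (`cacheRecentFile` above), bridging the Drive API's listing delay. A hedged sketch of such a TTL cache — the names mirror the diff, but the 30 s TTL and map shape are assumptions, not the server's actual values:

```javascript
// Hedged sketch: remember files we just created so PROPFIND can answer
// before the Drive API lists them. TTL is an assumed value.
const TTL_MS = 30_000;
const recentFiles = new Map(); // path -> { entry, expires }

function cacheRecentFile(path, entry) {
  recentFiles.set(path, { entry, expires: Date.now() + TTL_MS });
}

function getRecentFile(path) {
  const hit = recentFiles.get(path);
  if (!hit) return null;
  if (Date.now() > hit.expires) {
    recentFiles.delete(path); // expired: fall back to the real API listing
    return null;
  }
  return hit.entry;
}
```

A short TTL keeps the cache self-cleaning: once the API catches up, stale entries simply expire instead of needing invalidation.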

View File

@@ -60,7 +60,7 @@ function updateEnv(token, mnemonic) {
}
async function main() {
console.log('Starte Browser – bitte auf', DRIVE_URL, 'einloggen.\n');
console.log('Starting browser – please log in at', DRIVE_URL, '\n');
const browser = await puppeteer.launch({
headless: false,
@@ -76,14 +76,14 @@ async function main() {
const { token, mnemonic } = await getTokens(page);
if (token && mnemonic) {
updateEnv(token, mnemonic);
console.log('\n.env aktualisiert. Server neu starten.\n');
console.log('\n.env updated. Restart the server.\n');
await browser.close();
return;
}
await new Promise((r) => setTimeout(r, POLL_MS));
}
console.log('Timeout – keine Tokens gefunden. Bitte einloggen und erneut ausführen.');
console.log('Timeout – no tokens found. Please log in and run again.');
await browser.close();
process.exit(1);
}
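token-refresh writes the captured values back into `.env` via `updateEnv`, whose body is not shown in this diff. A minimal, hypothetical version of the key-patching step it implies — update a `KEY=value` line in place, appending the key if it is missing:

```javascript
// Hedged sketch of an updateEnv-style helper: not the repo's actual code.
function patchEnv(envText, updates) {
  const lines = envText.split(/\r?\n/);
  for (const [key, value] of Object.entries(updates)) {
    const idx = lines.findIndex((l) => l.startsWith(`${key}=`));
    if (idx >= 0) lines[idx] = `${key}=${value}`; // replace existing line
    else lines.push(`${key}=${value}`);           // or append a new one
  }
  return lines.join('\n');
}
```

Patching line-by-line preserves comments and unrelated keys (like `DRIVE_API_URL`) instead of rewriting the whole file.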

View File

@@ -14,18 +14,18 @@ const DRIVE_API_URL = process.env.DRIVE_API_URL || 'https://gateway.internxt.com
const token = process.env.INXT_TOKEN;
if (!token) {
console.error('Fehler: INXT_TOKEN muss gesetzt sein (aus Browser localStorage)');
console.error('Error: INXT_TOKEN must be set (from browser localStorage)');
process.exit(1);
}
const appDetails = { clientName: 'drive-web', clientVersion: '1.0' };
const apiSecurity = {
token,
unauthorizedCallback: () => console.error('Token abgelaufen oder ungültig'),
unauthorizedCallback: () => console.error('Token expired or invalid'),
};
async function main() {
console.log('Token-Test – Drive API mit Browser-Token');
console.log('Token test – Drive API with browser token');
console.log('');
const usersClient = Users.client(DRIVE_API_URL, appDetails, apiSecurity);
@@ -42,15 +42,15 @@ async function main() {
if (rootUuid) {
const [content] = storageClient.getFolderContentByUuid({ folderUuid: rootUuid });
const folderContent = await content;
console.log('Dateien/Ordner im Root:', folderContent.children?.length ?? 0);
console.log('Files/folders in root:', folderContent.children?.length ?? 0);
}
console.log('');
console.log('Token funktioniert – WebDAV-Server kann gestartet werden.');
console.log('Token works – WebDAV server can be started.');
} catch (err) {
console.error('Fehler:', err.message);
console.error('Error:', err.message);
if (err.response?.status === 401) {
console.error('Token abgelaufen – bitte erneut auf drive.internxt.com einloggen und Token aktualisieren.');
console.error('Token expired – please log in again at drive.internxt.com and update the token.');
}
process.exit(1);
}
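As written, token-test only discovers an expired token through a 401 round-trip. If the token is a standard three-part JWT (an assumption — Internxt's token format is not confirmed by this diff), its `exp` claim could be checked locally to fail faster:

```javascript
// Hedged sketch: seconds until a JWT expires, decoded locally.
// Assumes a standard header.payload.signature JWT; Node 16+ for 'base64url'.
function tokenSecondsLeft(token) {
  const [, payload] = token.split('.');
  if (!payload) throw new Error('Not a JWT');
  const claims = JSON.parse(Buffer.from(payload, 'base64url').toString('utf8'));
  if (typeof claims.exp !== 'number') return Infinity; // no expiry claim
  return claims.exp - Math.floor(Date.now() / 1000);
}
```

A negative result would mean the browser login needs to be repeated before starting the server.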

View File

@@ -38,7 +38,7 @@ async function uploadToUrl(buffer, url, signal) {
body: buffer,
signal,
});
if (!res.ok) throw new Error(`Upload fehlgeschlagen: ${res.status}`);
if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
}
/**
@@ -58,7 +58,7 @@ export async function uploadFileBuffer(params) {
const fileSize = buffer.length;
if (!validateMnemonic(mnemonic)) {
throw new Error('Ungültiges Mnemonic');
throw new Error('Invalid mnemonic');
}
const auth = await getAuth(bridgeUser, bridgePass);
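`uploadToUrl` throws on any non-2xx status, so a transient bridge error fails the whole PUT. A hedged sketch of a retry wrapper that could absorb such failures — purely illustrative, not part of this repo:

```javascript
// Hedged sketch: retry an async operation with exponential backoff.
// Defaults (3 attempts, 500 ms base) are assumptions, not project settings.
async function withRetry(fn, { attempts = 3, baseMs = 500 } = {}) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // Back off: baseMs, 2*baseMs, 4*baseMs, ...
      await new Promise((r) => setTimeout(r, baseMs * 2 ** i));
    }
  }
  throw lastErr;
}
```

Wrapping only the bridge upload (not `createFileEntryByUuid`) would keep the existing "already exists" recovery path untouched.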