Compare commits

...

13 Commits

Author SHA1 Message Date
oib
72f79b1059 Update authentication system, database models, and UI components 2025-08-07 19:39:22 +02:00
oib
d497492186 feat: Overhaul client-side navigation and clean up project
- Implement a unified SPA routing system in nav.js, removing all legacy and conflicting navigation scripts (router.js, inject-nav.js, fix-nav.js).
- Refactor dashboard.js to delegate all navigation handling to the new nav.js module.
- Create new modular JS files (auth.js, personal-player.js, logger.js) to improve code organization.
- Fix all navigation-related bugs, including guest access and broken footer links.
- Clean up the project root by moving development scripts and backups to a dedicated /dev directory.
- Add a .gitignore file to exclude the database, logs, and other transient files from the repository.
2025-07-28 16:42:46 +02:00
oib
88e468b716 feat: migrate UID system from usernames to email addresses
- Database migration: Updated publicstream.uid from usernames to email addresses
  - devuser → oib@bubuit.net
  - oibchello → oib@chello.at
- Updated related tables (UploadLog, UserQuota) to use email-based UIDs
- Fixed backend audio route to map email UIDs to username-based directories
- Updated SSE event payloads to use email for UID and username for display
- Removed redundant display_name field from SSE events
- Fixed frontend rendering conflicts between nav.js and streams-ui.js
- Updated stream player template to display usernames instead of email addresses
- Added cache-busting parameters to force browser refresh
- Created migration script for future reference

Benefits:
- Eliminates UID duplicates and inconsistency
- Provides stable, unique email-based identifiers
- Maintains user-friendly username display
- Follows proper data normalization practices

Fixes: Stream UI now displays usernames (devuser, oibchello) instead of email addresses
2025-07-27 09:47:38 +02:00
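A minimal sketch of the email-UID-to-directory mapping this commit describes, assuming a SQLModel `User` lookup and a `data/<username>` layout; the helper name `resolve_stream_dir` is hypothetical, not the actual route code:

```python
# Hypothetical helper illustrating the mapping described above:
# email-based UID -> the user's username-based stream directory.
from pathlib import Path
from sqlmodel import Session, select

from database import engine
from models import User


def resolve_stream_dir(uid_email: str, base: Path = Path("data")) -> Path:
    """Look up the user by email UID and return the username-based directory."""
    with Session(engine) as session:
        user = session.exec(select(User).where(User.email == uid_email)).first()
        if user is None:
            raise FileNotFoundError(f"No user found for UID {uid_email}")
        return base / user.username  # e.g. oib@chello.at -> data/oibchello
```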
oib
1171510683 Move legacy audio-player.js to dev directory
- audio-player.js was legacy code not used in production
- Actual audio players are in app.js (personal stream) and streams-ui.js (streams page)
- Moving to dev directory to keep production code clean
2025-07-27 09:15:35 +02:00
oib
a9a1c22fee Fix audio player synchronization between streams and personal pages
- Add global audio manager to coordinate playback between different players
- Integrate synchronization into streams-ui.js (streams page player)
- Integrate synchronization into app.js (personal stream player)
- Resolve simultaneous playback issues - only one audio source plays at a time
- Clean transitions when switching between streams and personal audio

Fixes issue where starting audio on one page didn't stop audio on the other page.
2025-07-27 09:13:55 +02:00
oib
fc4a9c926f Fix upload timeout issue: increase Gunicorn worker timeout to 300s
- Increased timeout from 60s to 300s (5 minutes) for large file uploads
- Added max_requests, max_requests_jitter, and worker_connections settings
- Removed limits on request line and field sizes to handle large uploads
- Also updated Nginx configuration with optimized timeout settings for /upload endpoint

This resolves the 502 Bad Gateway errors that were occurring during large file uploads due to worker timeouts.
2025-07-27 09:00:41 +02:00
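The Gunicorn settings named in this commit map onto `gunicorn_config.py` roughly as follows; only the 300 s timeout is stated above, the other values are placeholders:

```python
# Sketch of the gunicorn_config.py settings described in the commit above.
# Only timeout=300 comes from the commit text; the rest are illustrative.
bind = "127.0.0.1:8000"        # assumed bind address
workers = 4                    # assumed worker count
timeout = 300                  # worker timeout raised from 60s to 5 minutes
max_requests = 1000            # recycle workers after this many requests
max_requests_jitter = 50       # stagger restarts so workers don't recycle together
worker_connections = 1000      # connections per worker
limit_request_line = 0         # 0 = no limit on the request line size
limit_request_field_size = 0   # 0 = no limit on header field size
```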
oib
f4f712031e Reorganize project structure
- Move development and test files to dev/ directory
- Update .gitignore to exclude development files
- Update paths in configuration files
- Add new audio-player.js for frontend
2025-07-27 07:54:24 +02:00
oib
f6c501030e RC2 2025-07-21 17:39:09 +02:00
oib
ab9d93d913 RC1 2025-07-20 09:26:07 +02:00
oib
da28b205e5 fix: resolve mobile navigation visibility for authenticated users
- Add fix-nav.js to handle navigation state
- Update mobile.css with more specific selectors
- Modify dashboard.js to ensure proper auth state
- Update index.html to include the new fix script
- Ensure guest navigation stays hidden during client-side navigation
2025-07-20 09:24:51 +02:00
oib
c5412b07ac Migrate from file-based to database-backed stream metadata storage
- Add PublicStream model and migration
- Update list_streams.py and upload.py to use database
- Add import script for data migration
- Remove public_streams.txt (replaced by database)
- Fix quota sync between userquota and publicstream tables
2025-07-19 10:49:16 +02:00
oib
402e920bc6 Fix double audio playback and add UID handling for personal stream
- Fixed double playback issue on stream page by properly scoping event delegation in streams-ui.js
- Added init-personal-stream.js to handle UID for personal stream playback
- Improved error handling and logging for audio playback
- Added proper event propagation control to prevent duplicate event handling
2025-07-18 16:51:39 +02:00
oib
17616ac5b8 feat: Add database migrations and auth system
- Add Alembic for database migrations
- Implement user authentication system
- Update frontend styles and components
- Add new test audio functionality
- Update stream management and UI
2025-07-02 09:37:03 +02:00
80 changed files with 11138 additions and 1794 deletions

.gitignore (vendored)

@@ -1,25 +1,80 @@
-# Bytecode-Dateien
+# Bytecode files
 __pycache__/
 *.py[cod]
-# Virtuelle Umgebungen
+# Virtual environments
 .venv/
 venv/
-# Betriebssystem-Dateien
+# System files
 .DS_Store
 Thumbs.db
-# Logfiles und Dumps
+# Logs and temporary files
 *.log
 *.bak
 *.swp
 *.tmp
-# IDEs und Editoren
+# Node.js dependencies
+node_modules/
+package.json
+package-lock.json
+yarn.lock
+# Development documentation
+PERFORMANCE-TESTING.md
+# Build and distribution
+dist/
+build/
+*.min.js
+*.min.css
+*.map
+# Testing
+coverage/
+*.test.js
+*.spec.js
+.nyc_output/
+# Environment variables
+.env
+.env.*
+!.env.example
+# Debug logs
+npm-debug.log*
+yarn-debug.log*
+yarn-error.log*
+# Local Database
+dicta2stream.db
+# Development directory
+dev/
+# Configuration files
+alembic.ini
+*.ini
+*.conf
+*.config
+*.yaml
+*.yml
+*.toml
+# IDEs and editors
 .vscode/
 .idea/
+*.sublime-workspace
+*.sublime-project
+# Local development
+.cache/
+.temp/
+.tmp/
+# Project specific
 data/*
 !data/.gitignore
@@ -28,3 +83,17 @@ log/*
 streams/*
 !streams/.gitignore
+# Test files
+tests/**/*.js
+!tests/*.test.js
+!tests/*.spec.js
+!tests/README.md
+!tests/profile-auth.js
+# Performance test results
+performance-results/*
+!performance-results/.gitkeep
+# Legacy files
+public_streams.txt

DATABASE.md (new file)

@@ -0,0 +1,93 @@
# Database Setup and Migrations
This document explains how to set up and manage the database for the dicta2stream application.
## Prerequisites
- PostgreSQL database server
- Python 3.8+
- Required Python packages (install with `pip install -r requirements.txt`)
## Initial Setup
1. Create a PostgreSQL database:
```bash
createdb dicta2stream
```
2. Set up the database URL in your environment:
```bash
echo "DATABASE_URL=postgresql://username:password@localhost/dicta2stream" > .env
```
Replace `username` and `password` with your PostgreSQL credentials.
3. Initialize the database:
```bash
python init_db.py
```
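`init_db.py` itself is not part of this diff; a typical SQLModel initializer would do little more than create the tables, roughly:

```python
# Rough guess at what an init_db.py could look like; the project's actual
# script may differ.
from sqlmodel import SQLModel

import models  # noqa: F401  (registers the model tables on the metadata)
from database import engine

if __name__ == "__main__":
    SQLModel.metadata.create_all(engine)
    print("Database tables created.")
```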
## Running Migrations
After making changes to the database models, you can create and apply migrations:
1. Install the required dependencies:
```bash
pip install -r requirements.txt
```
2. Run the migrations:
```bash
python run_migrations.py
```
## Database Models
The application uses the following database models:
### User
- Stores user account information
- Fields: username, email, hashed_password, is_active, created_at, updated_at
### Session
- Manages user sessions
- Fields: id, user_id, token, ip_address, user_agent, created_at, expires_at, last_used_at, is_active
### PublicStream
- Tracks publicly available audio streams
- Fields: uid, filename, size, mtime, created_at, updated_at
### UserQuota
- Tracks user storage quotas
- Fields: uid, storage_bytes, updated_at
### UploadLog
- Logs file uploads
- Fields: id, uid, filename, size, ip_address, user_agent, created_at
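As a rough illustration, two of the models listed above could be declared with SQLModel like this; field types and defaults are assumptions, and `models.py` in the repository remains the authoritative definition:

```python
# Illustrative SQLModel declarations matching the field lists above.
# Types and defaults are assumptions, not the project's actual models.py.
from datetime import datetime
from typing import Optional

from sqlmodel import Field, SQLModel


class UserQuota(SQLModel, table=True):
    uid: str = Field(primary_key=True)           # email-based UID
    storage_bytes: int = 0
    updated_at: datetime = Field(default_factory=datetime.utcnow)


class UploadLog(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    uid: str = Field(index=True)
    filename: Optional[str] = None
    size: int = 0
    ip_address: str = ""
    user_agent: str = ""
    created_at: datetime = Field(default_factory=datetime.utcnow)
```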
## Backing Up the Database
To create a backup of your database:
```bash
pg_dump -U username -d dicta2stream -f backup.sql
```
To restore from a backup:
```bash
psql -U username -d dicta2stream -f backup.sql
```
## Troubleshooting
- If you encounter connection issues, verify that:
  - The PostgreSQL server is running
  - The database URL in your `.env` file is correct
  - The database user has the necessary permissions
- If you need to reset the database:
```bash
dropdb dicta2stream
createdb dicta2stream
python init_db.py
```

account_router.py (new file)

@@ -0,0 +1,136 @@
# account_router.py — Account management endpoints
from fastapi import APIRouter, Request, HTTPException, Depends
from fastapi.responses import JSONResponse
from sqlmodel import Session, select
from models import User, UserQuota, UploadLog, DBSession, PublicStream
from database import get_db
import os
from typing import Dict, Any

router = APIRouter(prefix="/api", tags=["account"])


@router.post("/delete-account")
async def delete_account(data: Dict[str, Any], request: Request):
    try:
        # Get UID from request data
        uid = data.get("uid")
        if not uid:
            # Debug messages disabled
            raise HTTPException(status_code=400, detail="Missing UID")
        ip = request.client.host
        # Debug messages disabled
        # Verify user exists and IP matches
        # Use the database session context manager
        with get_db() as db:
            # Handle both email-based and username-based UIDs for backward compatibility
            user = None
            # First try to find by email (new UID format)
            if '@' in uid:
                user = db.query(User).filter(User.email == uid).first()
                # Debug messages disabled
            # If not found by email, try by username (legacy UID format)
            if not user:
                user = db.query(User).filter(User.username == uid).first()
                # Debug messages disabled
            if not user:
                # Debug messages disabled
                raise HTTPException(status_code=404, detail="User not found")
            # Extract user attributes while the object is still bound to the session
            actual_uid = user.email
            user_ip = user.ip
            username = user.username
            # Debug messages disabled
        if user_ip != ip:
            # Debug messages disabled
            raise HTTPException(status_code=403, detail="Unauthorized: IP address does not match")
        # Use the database session context manager for all database operations
        with get_db() as db:
            try:
                # Delete user's upload logs (use actual_uid which is always the email)
                uploads = db.query(UploadLog).filter(UploadLog.uid == actual_uid).all()
                for upload in uploads:
                    db.delete(upload)
                # Debug messages disabled
                # Delete user's public streams
                streams = db.query(PublicStream).filter(PublicStream.uid == actual_uid).all()
                for stream in streams:
                    db.delete(stream)
                # Debug messages disabled
                # Delete user's quota
                quota = db.get(UserQuota, actual_uid)
                if quota:
                    db.delete(quota)
                # Debug messages disabled
                # Delete user's active sessions (check both email and username as uid)
                sessions_by_email = db.query(DBSession).filter(DBSession.uid == actual_uid).all()
                sessions_by_username = db.query(DBSession).filter(DBSession.uid == username).all()
                all_sessions = list(sessions_by_email) + list(sessions_by_username)
                # Remove duplicates using token (primary key)
                unique_sessions = {session.token: session for session in all_sessions}.values()
                for session in unique_sessions:
                    db.delete(session)
                # Debug messages disabled
                # Delete user account
                user_obj = db.get(User, actual_uid)  # Use actual_uid which is the email
                if user_obj:
                    db.delete(user_obj)
                # Debug messages disabled
                db.commit()
                # Debug messages disabled
            except Exception as e:
                db.rollback()
                # Debug messages disabled
                # Debug messages disabled
                raise HTTPException(status_code=500, detail="Database error during account deletion")
        # Delete user's files
        try:
            # Use the email (actual_uid) for the directory name, which matches how files are stored
            user_dir = os.path.join('data', actual_uid)
            real_user_dir = os.path.realpath(user_dir)
            # Security check to prevent directory traversal
            if not real_user_dir.startswith(os.path.realpath('data')):
                # Debug messages disabled
                raise HTTPException(status_code=400, detail="Invalid user directory")
            if os.path.exists(real_user_dir):
                import shutil
                shutil.rmtree(real_user_dir, ignore_errors=True)
                # Debug messages disabled
            else:
                # Debug messages disabled
                pass
        except Exception as e:
            # Debug messages disabled
            # Continue even if file deletion fails, as the account is already deleted from the DB
            pass
        # Debug messages disabled
        return {"status": "success", "message": "Account and all associated data have been deleted"}
    except HTTPException as he:
        # Debug messages disabled
        raise
    except Exception as e:
        # Debug messages disabled
        raise HTTPException(status_code=500, detail="An unexpected error occurred")
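For reference, the endpoint above can be exercised with a plain HTTP client; the base URL is an assumption, and the request must originate from the IP address stored for the user:

```python
# Example call to the /api/delete-account endpoint defined above.
# Base URL and UID are placeholders.
import requests

resp = requests.post(
    "http://127.0.0.1:8000/api/delete-account",  # assumed dev address
    json={"uid": "user@example.com"},
    timeout=10,
)
print(resp.status_code, resp.json())
```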

alembic.ini (new file)

@@ -0,0 +1,140 @@
# A generic, single database configuration.
[alembic]
# path to migration scripts.
# this is typically a path given in POSIX (e.g. forward slashes)
# format, relative to the token %(here)s which refers to the location of this
# ini file
script_location = %(here)s/dev/alembic
# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s
# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory. for multiple paths, the path separator
# is defined by "path_separator" below.
prepend_sys_path = .
# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python>=3.9 or backports.zoneinfo library and tzdata library.
# Any required deps can installed by adding `alembic[tz]` to the pip requirements
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =
# max length of characters to apply to the "slug" field
# truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; This defaults
# to <script_location>/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "path_separator"
# below.
# version_locations = %(here)s/bar:%(here)s/bat:%(here)s/alembic/versions
# path_separator; This indicates what character is used to split lists of file
# paths, including version_locations and prepend_sys_path within configparser
# files such as alembic.ini.
# The default rendered in new alembic.ini files is "os", which uses os.pathsep
# to provide os-dependent path splitting.
#
# Note that in order to support legacy alembic.ini files, this default does NOT
# take place if path_separator is not present in alembic.ini. If this
# option is omitted entirely, fallback logic is as follows:
#
# 1. Parsing of the version_locations option falls back to using the legacy
# "version_path_separator" key, which if absent then falls back to the legacy
# behavior of splitting on spaces and/or commas.
# 2. Parsing of the prepend_sys_path option falls back to the legacy
# behavior of splitting on spaces, commas, or colons.
#
# Valid values for path_separator are:
#
# path_separator = :
# path_separator = ;
# path_separator = space
# path_separator = newline
#
# Use os.pathsep. Default configuration used for new projects.
path_separator = os
# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8
# database URL. This is consumed by the user-maintained env.py script only.
# other means of configuring database URLs may be customized within the env.py
# file.
sqlalchemy.url = postgresql://postgres:postgres@localhost/dicta2stream
[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples
# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME
# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
# hooks = ruff
# ruff.type = exec
# ruff.executable = %(here)s/.venv/bin/ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME
# Logging configuration. This is also consumed by the user-maintained
# env.py script only.
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARNING
handlers = console
qualname =
[logger_sqlalchemy]
level = WARNING
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S

analyze_db_legacy.py (new file)

@@ -0,0 +1,355 @@
#!/usr/bin/env python3
"""
Database Legacy Data Analysis Script
Analyzes the database for legacy data that doesn't match current authentication implementation
"""
import sys
from datetime import datetime, timedelta
from sqlmodel import Session, select
from database import engine
from models import User, UserQuota, UploadLog, DBSession, PublicStream
import re
def validate_email_format(email):
"""Validate email format using RFC 5322 compliant regex"""
pattern = r'^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$'
return re.match(pattern, email) is not None
def analyze_user_table():
"""Analyze User table for legacy data issues"""
print("\n=== ANALYZING USER TABLE ===")
issues = []
with Session(engine) as session:
users = session.exec(select(User)).all()
print(f"Total users: {len(users)}")
for user in users:
user_issues = []
# Check if email (primary key) is valid email format
if not validate_email_format(user.email):
user_issues.append(f"Invalid email format: {user.email}")
# Check if username is also email format (current requirement)
if not validate_email_format(user.username):
user_issues.append(f"Username not in email format: {user.username}")
# Check if email and username match (should be same after migration)
if user.email != user.username:
user_issues.append(f"Email/username mismatch: email={user.email}, username={user.username}")
# Check for missing or empty display_name
if not user.display_name or user.display_name.strip() == "":
user_issues.append(f"Empty display_name")
# Check for very old tokens (potential security issue)
if user.token_created < datetime.utcnow() - timedelta(days=30):
user_issues.append(f"Very old token (created: {user.token_created})")
# Check for unconfirmed users
if not user.confirmed:
user_issues.append(f"Unconfirmed user")
if user_issues:
issues.append({
'email': user.email,
'username': user.username,
'issues': user_issues
})
print(f"Users with issues: {len(issues)}")
for issue in issues:
print(f" User {issue['email']}:")
for problem in issue['issues']:
print(f" - {problem}")
return issues
def analyze_session_table():
"""Analyze DBSession table for legacy data issues"""
print("\n=== ANALYZING SESSION TABLE ===")
issues = []
with Session(engine) as session:
sessions = session.exec(select(DBSession)).all()
print(f"Total sessions: {len(sessions)}")
active_sessions = [s for s in sessions if s.is_active]
expired_sessions = [s for s in sessions if s.expires_at < datetime.utcnow()]
old_sessions = [s for s in sessions if s.created_at < datetime.utcnow() - timedelta(days=7)]
print(f"Active sessions: {len(active_sessions)}")
print(f"Expired sessions: {len(expired_sessions)}")
print(f"Sessions older than 7 days: {len(old_sessions)}")
for db_session in sessions:
session_issues = []
# Check if user_id is in email format (current requirement)
if not validate_email_format(db_session.user_id):
session_issues.append(f"user_id not in email format: {db_session.user_id}")
# Check for expired but still active sessions
if db_session.is_active and db_session.expires_at < datetime.utcnow():
session_issues.append(f"Expired but still marked active (expires: {db_session.expires_at})")
# Check for very old sessions that should be cleaned up
if db_session.created_at < datetime.utcnow() - timedelta(days=30):
session_issues.append(f"Very old session (created: {db_session.created_at})")
# Check for sessions with 1-hour expiry (old system)
session_duration = db_session.expires_at - db_session.created_at
if session_duration < timedelta(hours=2): # Less than 2 hours indicates old 1-hour sessions
session_issues.append(f"Short session duration: {session_duration} (should be 24h)")
if session_issues:
issues.append({
'token': db_session.token[:10] + '...',
'user_id': db_session.user_id,
'created_at': db_session.created_at,
'expires_at': db_session.expires_at,
'issues': session_issues
})
print(f"Sessions with issues: {len(issues)}")
for issue in issues:
print(f" Session {issue['token']} (user: {issue['user_id']}):")
for problem in issue['issues']:
print(f" - {problem}")
return issues
def analyze_quota_table():
"""Analyze UserQuota table for legacy data issues"""
print("\n=== ANALYZING USER QUOTA TABLE ===")
issues = []
with Session(engine) as session:
quotas = session.exec(select(UserQuota)).all()
print(f"Total quota records: {len(quotas)}")
for quota in quotas:
quota_issues = []
# Check if uid is in email format (current requirement)
if not validate_email_format(quota.uid):
quota_issues.append(f"UID not in email format: {quota.uid}")
# Check for negative storage
if quota.storage_bytes < 0:
quota_issues.append(f"Negative storage: {quota.storage_bytes}")
# Check for excessive storage (over 100MB limit)
if quota.storage_bytes > 100 * 1024 * 1024:
quota_issues.append(f"Storage over 100MB limit: {quota.storage_bytes / (1024*1024):.1f}MB")
if quota_issues:
issues.append({
'uid': quota.uid,
'storage_bytes': quota.storage_bytes,
'issues': quota_issues
})
print(f"Quota records with issues: {len(issues)}")
for issue in issues:
print(f" Quota {issue['uid']} ({issue['storage_bytes']} bytes):")
for problem in issue['issues']:
print(f" - {problem}")
return issues
def analyze_upload_log_table():
"""Analyze UploadLog table for legacy data issues"""
print("\n=== ANALYZING UPLOAD LOG TABLE ===")
issues = []
with Session(engine) as session:
uploads = session.exec(select(UploadLog)).all()
print(f"Total upload records: {len(uploads)}")
for upload in uploads:
upload_issues = []
# Check if uid is in email format (current requirement)
if not validate_email_format(upload.uid):
upload_issues.append(f"UID not in email format: {upload.uid}")
# Check for missing processed_filename
if not upload.processed_filename:
upload_issues.append(f"Missing processed_filename")
# Check for negative file size
if upload.size_bytes < 0:
upload_issues.append(f"Negative file size: {upload.size_bytes}")
# Check for very old uploads
if upload.created_at < datetime.utcnow() - timedelta(days=365):
upload_issues.append(f"Very old upload (created: {upload.created_at})")
if upload_issues:
issues.append({
'id': upload.id,
'uid': upload.uid,
'filename': upload.filename,
'created_at': upload.created_at,
'issues': upload_issues
})
print(f"Upload records with issues: {len(issues)}")
for issue in issues:
print(f" Upload {issue['id']} (user: {issue['uid']}, file: {issue['filename']}):")
for problem in issue['issues']:
print(f" - {problem}")
return issues
def analyze_public_stream_table():
"""Analyze PublicStream table for legacy data issues"""
print("\n=== ANALYZING PUBLIC STREAM TABLE ===")
issues = []
with Session(engine) as session:
streams = session.exec(select(PublicStream)).all()
print(f"Total public stream records: {len(streams)}")
for stream in streams:
stream_issues = []
# Check if uid is in email format (current requirement)
if not validate_email_format(stream.uid):
stream_issues.append(f"UID not in email format: {stream.uid}")
# Check if username is also email format (should match uid)
if stream.username and not validate_email_format(stream.username):
stream_issues.append(f"Username not in email format: {stream.username}")
# Check if uid and username match (should be same after migration)
if stream.username and stream.uid != stream.username:
stream_issues.append(f"UID/username mismatch: uid={stream.uid}, username={stream.username}")
# Check for negative storage
if stream.storage_bytes < 0:
stream_issues.append(f"Negative storage: {stream.storage_bytes}")
# Check for missing display_name
if not stream.display_name or stream.display_name.strip() == "":
stream_issues.append(f"Empty display_name")
if stream_issues:
issues.append({
'uid': stream.uid,
'username': stream.username,
'display_name': stream.display_name,
'issues': stream_issues
})
print(f"Public stream records with issues: {len(issues)}")
for issue in issues:
print(f" Stream {issue['uid']} (username: {issue['username']}):")
for problem in issue['issues']:
print(f" - {problem}")
return issues
def check_referential_integrity():
"""Check for referential integrity issues between tables"""
print("\n=== CHECKING REFERENTIAL INTEGRITY ===")
issues = []
with Session(engine) as session:
# Get all unique UIDs from each table
users = session.exec(select(User.email)).all()
user_usernames = session.exec(select(User.username)).all()
quotas = session.exec(select(UserQuota.uid)).all()
uploads = session.exec(select(UploadLog.uid)).all()
streams = session.exec(select(PublicStream.uid)).all()
sessions = session.exec(select(DBSession.user_id)).all()
user_emails = set(users)
user_usernames_set = set(user_usernames)
quota_uids = set(quotas)
upload_uids = set(uploads)
stream_uids = set(streams)
session_uids = set(sessions)
print(f"Unique user emails: {len(user_emails)}")
print(f"Unique user usernames: {len(user_usernames_set)}")
print(f"Unique quota UIDs: {len(quota_uids)}")
print(f"Unique upload UIDs: {len(upload_uids)}")
print(f"Unique stream UIDs: {len(stream_uids)}")
print(f"Unique session user_ids: {len(session_uids)}")
# Check for orphaned records
orphaned_quotas = quota_uids - user_emails
orphaned_uploads = upload_uids - user_emails
orphaned_streams = stream_uids - user_emails
orphaned_sessions = session_uids - user_usernames_set # Sessions use username as user_id
if orphaned_quotas:
issues.append(f"Orphaned quota records (no matching user): {orphaned_quotas}")
if orphaned_uploads:
issues.append(f"Orphaned upload records (no matching user): {orphaned_uploads}")
if orphaned_streams:
issues.append(f"Orphaned stream records (no matching user): {orphaned_streams}")
if orphaned_sessions:
issues.append(f"Orphaned session records (no matching user): {orphaned_sessions}")
# Check for users without quota records
users_without_quota = user_emails - quota_uids
if users_without_quota:
issues.append(f"Users without quota records: {users_without_quota}")
# Check for users without stream records
users_without_streams = user_emails - stream_uids
if users_without_streams:
issues.append(f"Users without stream records: {users_without_streams}")
print(f"Referential integrity issues: {len(issues)}")
for issue in issues:
print(f" - {issue}")
return issues
def main():
"""Run complete database legacy analysis"""
print("=== DATABASE LEGACY DATA ANALYSIS ===")
print(f"Analysis started at: {datetime.utcnow()}")
all_issues = {}
try:
all_issues['users'] = analyze_user_table()
all_issues['sessions'] = analyze_session_table()
all_issues['quotas'] = analyze_quota_table()
all_issues['uploads'] = analyze_upload_log_table()
all_issues['streams'] = analyze_public_stream_table()
all_issues['integrity'] = check_referential_integrity()
# Summary
print("\n=== SUMMARY ===")
total_issues = sum(len(issues) if isinstance(issues, list) else 1 for issues in all_issues.values())
print(f"Total issues found: {total_issues}")
for table, issues in all_issues.items():
if issues:
count = len(issues) if isinstance(issues, list) else 1
print(f" {table}: {count} issues")
if total_issues == 0:
print("✅ No legacy data issues found! Database is clean.")
else:
print("⚠️ Legacy data issues found. Consider running cleanup scripts.")
except Exception as e:
print(f"❌ Error during analysis: {e}")
return 1
return 0
if __name__ == "__main__":
sys.exit(main())

auth.py (new file)

@@ -0,0 +1,95 @@
"""Authentication middleware and utilities for dicta2stream"""
from fastapi import Request, HTTPException, Depends, status
from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
from sqlmodel import Session, select
from typing import Optional
from models import User, Session as DBSession, verify_session
from database import get_db

security = HTTPBearer()


def get_current_user(
    request: Request,
    credentials: HTTPAuthorizationCredentials = Depends(security)
) -> User:
    """Dependency to get the current authenticated user"""
    token = credentials.credentials
    # Use the database session context manager
    with get_db() as db:
        db_session = verify_session(db, token)
        if not db_session:
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="Invalid or expired session",
                headers={"WWW-Authenticate": "Bearer"},
            )
        # Get the user from the session using query interface
        user = db.query(User).filter(User.email == db_session.uid).first()
        if not user:
            raise HTTPException(
                status_code=status.HTTP_401_UNAUTHORIZED,
                detail="User not found",
                headers={"WWW-Authenticate": "Bearer"},
            )
        # Attach the session to the request state for later use
        request.state.session = db_session
        return user


def get_optional_user(
    request: Request,
    credentials: Optional[HTTPAuthorizationCredentials] = Depends(security, use_cache=False)
) -> Optional[User]:
    """Dependency that returns the current user if authenticated, None otherwise"""
    if not credentials:
        return None
    try:
        # get_current_user now handles its own database session
        return get_current_user(request, credentials)
    except HTTPException:
        return None


def create_session(user: User, request: Request) -> DBSession:
    """Create a new session for the user (valid for 24 hours)"""
    import secrets
    from datetime import datetime, timedelta

    user_agent = request.headers.get("user-agent", "")
    ip_address = request.client.host if request.client else "0.0.0.0"
    # Create session token and set 24-hour expiry
    session_token = secrets.token_urlsafe(32)
    expires_at = datetime.utcnow() + timedelta(hours=24)
    # Create the session object
    session = DBSession(
        token=session_token,
        user_id=user.email,
        ip_address=ip_address,
        user_agent=user_agent,
        expires_at=expires_at,
        is_active=True
    )
    # Use the database session context manager
    with get_db() as db:
        try:
            db.add(session)
            db.commit()
            db.refresh(session)  # Ensure we have the latest data
            return session
        except Exception as e:
            db.rollback()
            # Debug messages disabled
            raise HTTPException(
                status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
                detail="Failed to create session"
            )
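A short usage sketch of the `get_current_user` dependency above; the route path is illustrative, while `/api/me` in `auth_router.py` follows the same pattern:

```python
# Minimal example of protecting a route with get_current_user.
# The /whoami path is illustrative only.
from fastapi import APIRouter, Depends

from auth import get_current_user
from models import User

router = APIRouter(prefix="/api", tags=["example"])


@router.get("/whoami")
async def whoami(current_user: User = Depends(get_current_user)):
    return {"username": current_user.username, "email": current_user.email}
```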

auth_router.py (new file)

@@ -0,0 +1,149 @@
"""Authentication routes for dicta2stream"""
from fastapi import APIRouter, Depends, Request, Response, HTTPException, status
from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
from sqlmodel import Session, select
from datetime import datetime
from models import Session as DBSession, User
from database import get_db
from auth import get_current_user
router = APIRouter(prefix="/api", tags=["auth"])
security = HTTPBearer()
@router.post("/logout")
async def logout(
request: Request,
response: Response,
credentials: HTTPAuthorizationCredentials = Depends(security)
):
"""Log out by invalidating the current session"""
try:
# Get the token from the Authorization header
token = credentials.credentials if credentials else None
if not token:
return {"message": "No session to invalidate"}
# Use the database session context manager
with get_db() as db:
try:
# Find and invalidate the session using query interface
session = db.query(DBSession).filter(
DBSession.token == token,
DBSession.is_active == True # noqa: E712
).first()
if session:
try:
session.is_active = False
db.add(session)
db.commit()
except Exception as e:
db.rollback()
# Debug messages disabled
# Continue with logout even if session update fails
except Exception as e:
# Debug messages disabled
# Continue with logout even if session lookup fails
pass
# Clear the session cookie
response.delete_cookie(
key="sessionid",
httponly=True,
secure=True,
samesite="lax",
path="/"
)
# Clear any other auth-related cookies
for cookie_name in ["uid", "authToken", "username", "token"]:
response.delete_cookie(
key=cookie_name,
path="/",
domain=request.url.hostname,
secure=True,
httponly=True,
samesite="lax"
)
return {"message": "Successfully logged out"}
except HTTPException:
# Re-raise HTTP exceptions
raise
except Exception as e:
# Debug messages disabled
# Don't expose internal errors to the client
return {"message": "Logout processed"}
@router.get("/me")
async def get_current_user_info(
current_user: User = Depends(get_current_user)
):
"""Get current user information"""
return {
"username": current_user.username,
"email": current_user.email,
"created_at": current_user.token_created.isoformat(),
"is_confirmed": current_user.confirmed
}
@router.get("/sessions")
async def list_sessions(
current_user: User = Depends(get_current_user)
):
"""List all active sessions for the current user"""
# Use the database session context manager
with get_db() as db:
sessions = DBSession.get_active_sessions(db, current_user.username)
return [
{
"id": s.id,
"ip_address": s.ip_address,
"user_agent": s.user_agent,
"created_at": s.created_at.isoformat(),
"last_used_at": s.last_used_at.isoformat(),
"expires_at": s.expires_at.isoformat()
}
for s in sessions
]
@router.post("/sessions/{session_id}/revoke")
async def revoke_session(
session_id: int,
current_user: User = Depends(get_current_user)
):
"""Revoke a specific session"""
# Use the database session context manager
with get_db() as db:
session = db.get(DBSession, session_id)
if not session or session.uid != current_user.email:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Session not found"
)
if not session.is_active:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Session is already inactive"
)
try:
session.is_active = False
db.add(session)
db.commit()
return {"message": "Session revoked successfully"}
except Exception as e:
db.rollback()
# Debug messages disabled
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to revoke session"
)

@@ -0,0 +1,31 @@
-- Cleanup script for old format user 'devuser'
-- This user has username-based UID instead of email-based UID
-- Show what will be deleted before deletion
SELECT 'publicstream entries to delete:' as info;
SELECT uid, username, storage_bytes, created_at FROM publicstream WHERE uid = 'devuser';
SELECT 'uploadlog entries to delete:' as info;
SELECT COUNT(*) as count, uid FROM uploadlog WHERE uid = 'devuser' GROUP BY uid;
SELECT 'userquota entries to delete:' as info;
SELECT uid FROM userquota WHERE uid = 'devuser';
-- Delete from all related tables
-- Start with dependent tables first
DELETE FROM uploadlog WHERE uid = 'devuser';
DELETE FROM userquota WHERE uid = 'devuser';
DELETE FROM publicstream WHERE uid = 'devuser';
-- Verify cleanup
SELECT 'Remaining entries for devuser in publicstream:' as info;
SELECT COUNT(*) as count FROM publicstream WHERE uid = 'devuser';
SELECT 'Remaining entries for devuser in uploadlog:' as info;
SELECT COUNT(*) as count FROM uploadlog WHERE uid = 'devuser';
SELECT 'Remaining entries for devuser in userquota:' as info;
SELECT COUNT(*) as count FROM userquota WHERE uid = 'devuser';
SELECT 'Total remaining old format entries in publicstream:' as info;
SELECT COUNT(*) as count FROM publicstream WHERE uid NOT LIKE '%@%' OR uid = username;

@@ -0,0 +1,19 @@
-- Final cleanup of orphaned entries that prevent proper account deletion
-- These entries have username-based UIDs that should have been deleted
-- Show what will be deleted
SELECT 'Orphaned publicstream entries to delete:' as info;
SELECT uid, username FROM publicstream WHERE uid = 'oibchello';
SELECT 'Orphaned userquota entries to delete:' as info;
SELECT uid, storage_bytes FROM userquota WHERE uid = 'oibchello';
-- Delete the orphaned entries
DELETE FROM publicstream WHERE uid = 'oibchello';
DELETE FROM userquota WHERE uid = 'oibchello';
-- Verify cleanup
SELECT 'Remaining entries for oibchello:' as info;
SELECT 'publicstream' as table_name, COUNT(*) as count FROM publicstream WHERE uid = 'oibchello'
UNION ALL
SELECT 'userquota' as table_name, COUNT(*) as count FROM userquota WHERE uid = 'oibchello';

cleanup_legacy_db.sql (new file)

@@ -0,0 +1,169 @@
-- Database Legacy Data Cleanup Script
-- Fixes issues identified in the database analysis
-- Execute these queries step by step to fix legacy data
-- =============================================================================
-- STEP 1: Fix User Table - Update username to match email format
-- =============================================================================
-- Issue: User has username 'oibchello' but email 'oib@chello.at'
-- Fix: Update username to match email (current authentication requirement)
UPDATE "user"
SET username = email,
display_name = CASE
WHEN display_name = '' OR display_name IS NULL
THEN split_part(email, '@', 1) -- Use email prefix as display name
ELSE display_name
END
WHERE email = 'oib@chello.at';
-- Verify the fix
SELECT email, username, display_name, confirmed FROM "user" WHERE email = 'oib@chello.at';
-- =============================================================================
-- STEP 2: Clean Up Expired Sessions
-- =============================================================================
-- Issue: 11 expired sessions still marked as active (security risk)
-- Fix: Mark expired sessions as inactive
UPDATE dbsession
SET is_active = false
WHERE expires_at < NOW() AND is_active = true;
-- Verify expired sessions are now inactive
SELECT COUNT(*) as expired_active_sessions
FROM dbsession
WHERE expires_at < NOW() AND is_active = true;
-- Optional: Delete very old expired sessions (older than 7 days)
DELETE FROM dbsession
WHERE expires_at < NOW() - INTERVAL '7 days';
-- =============================================================================
-- STEP 3: Update Session user_id to Email Format
-- =============================================================================
-- Issue: All sessions use old username format instead of email
-- Fix: Update session user_id to use email format
UPDATE dbsession
SET user_id = 'oib@chello.at'
WHERE user_id = 'oibchello';
-- Verify session user_id updates
SELECT DISTINCT user_id FROM dbsession;
-- =============================================================================
-- STEP 4: Fix PublicStream Username Fields
-- =============================================================================
-- Issue: PublicStream has username/UID mismatches
-- Fix: Update username to match UID (email format)
-- Fix the existing user record
UPDATE publicstream
SET username = uid,
display_name = CASE
WHEN display_name = 'oibchello'
THEN split_part(uid, '@', 1) -- Use email prefix as display name
ELSE display_name
END
WHERE uid = 'oib@chello.at';
-- Verify the fix
SELECT uid, username, display_name FROM publicstream WHERE uid = 'oib@chello.at';
-- =============================================================================
-- STEP 5: Remove Orphaned Records for Deleted User
-- =============================================================================
-- Issue: Records exist for 'oib@bubuit.net' but no user exists
-- Fix: Remove orphaned records
-- Remove orphaned quota record
DELETE FROM userquota WHERE uid = 'oib@bubuit.net';
-- Remove orphaned stream record
DELETE FROM publicstream WHERE uid = 'oib@bubuit.net';
-- Verify orphaned records are removed
SELECT 'userquota' as table_name, COUNT(*) as count FROM userquota WHERE uid = 'oib@bubuit.net'
UNION ALL
SELECT 'publicstream' as table_name, COUNT(*) as count FROM publicstream WHERE uid = 'oib@bubuit.net';
-- =============================================================================
-- VERIFICATION QUERIES
-- =============================================================================
-- Run these to verify all issues are fixed
-- 1. Check user table consistency
SELECT
email,
username,
display_name,
CASE WHEN email = username THEN '✅' ELSE '❌' END as email_username_match,
CASE WHEN display_name != '' THEN '✅' ELSE '❌' END as has_display_name
FROM "user";
-- 2. Check session table health
SELECT
COUNT(*) as total_sessions,
COUNT(CASE WHEN is_active THEN 1 END) as active_sessions,
COUNT(CASE WHEN expires_at < NOW() AND is_active THEN 1 END) as expired_but_active,
COUNT(CASE WHEN expires_at - created_at > INTERVAL '20 hours' THEN 1 END) as long_duration_sessions
FROM dbsession;
-- 3. Check PublicStream consistency
SELECT
uid,
username,
display_name,
CASE WHEN uid = username THEN '✅' ELSE '❌' END as uid_username_match
FROM publicstream;
-- 4. Check referential integrity
SELECT
'Users' as entity,
COUNT(*) as count
FROM "user"
UNION ALL
SELECT
'UserQuota records',
COUNT(*)
FROM userquota
UNION ALL
SELECT
'PublicStream records',
COUNT(*)
FROM publicstream
UNION ALL
SELECT
'Active Sessions',
COUNT(*)
FROM dbsession WHERE is_active = true;
-- 5. Final validation - should return no rows if all issues are fixed
SELECT 'ISSUE: User email/username mismatch' as issue
FROM "user"
WHERE email != username
UNION ALL
SELECT 'ISSUE: Expired active sessions'
FROM dbsession
WHERE expires_at < NOW() AND is_active = true
LIMIT 1
UNION ALL
SELECT 'ISSUE: PublicStream UID/username mismatch'
FROM publicstream
WHERE uid != username
LIMIT 1
UNION ALL
SELECT 'ISSUE: Orphaned quota records'
FROM userquota q
LEFT JOIN "user" u ON q.uid = u.email
WHERE u.email IS NULL
LIMIT 1
UNION ALL
SELECT 'ISSUE: Orphaned stream records'
FROM publicstream p
LEFT JOIN "user" u ON p.uid = u.email
WHERE u.email IS NULL
LIMIT 1;
-- If the final query returns no rows, all legacy issues are fixed! ✅

@@ -0,0 +1,31 @@
-- Cleanup script for old format user 'oibchello'
-- This user has username-based UID instead of email-based UID
-- Show what will be deleted before deletion
SELECT 'publicstream entries to delete:' as info;
SELECT uid, username, storage_bytes, created_at FROM publicstream WHERE uid = 'oibchello';
SELECT 'uploadlog entries to delete:' as info;
SELECT COUNT(*) as count, uid FROM uploadlog WHERE uid = 'oibchello' GROUP BY uid;
SELECT 'userquota entries to delete:' as info;
SELECT uid FROM userquota WHERE uid = 'oibchello';
-- Delete from all related tables
-- Start with dependent tables first
DELETE FROM uploadlog WHERE uid = 'oibchello';
DELETE FROM userquota WHERE uid = 'oibchello';
DELETE FROM publicstream WHERE uid = 'oibchello';
-- Verify cleanup
SELECT 'Remaining entries for oibchello in publicstream:' as info;
SELECT COUNT(*) as count FROM publicstream WHERE uid = 'oibchello';
SELECT 'Remaining entries for oibchello in uploadlog:' as info;
SELECT COUNT(*) as count FROM uploadlog WHERE uid = 'oibchello';
SELECT 'Remaining entries for oibchello in userquota:' as info;
SELECT COUNT(*) as count FROM userquota WHERE uid = 'oibchello';
SELECT 'Total remaining old format entries in publicstream:' as info;
SELECT COUNT(*) as count FROM publicstream WHERE uid NOT LIKE '%@%' OR uid = username;

@@ -0,0 +1,28 @@
-- Cleanup script for old format user entries
-- Removes users with username-based UIDs instead of email-based UIDs
-- Show what will be deleted before deletion
SELECT 'publicstream entries to delete:' as info;
SELECT uid, username, storage_bytes, created_at FROM publicstream WHERE uid IN ('devuser', 'oibchello');
SELECT 'uploadlog entries to delete:' as info;
SELECT COUNT(*) as count, uid FROM uploadlog WHERE uid IN ('devuser', 'oibchello') GROUP BY uid;
SELECT 'userquota entries to delete:' as info;
SELECT uid, quota_bytes, used_bytes FROM userquota WHERE uid IN ('devuser', 'oibchello');
-- Delete from all related tables
-- Start with dependent tables first
DELETE FROM uploadlog WHERE uid IN ('devuser', 'oibchello');
DELETE FROM userquota WHERE uid IN ('devuser', 'oibchello');
DELETE FROM publicstream WHERE uid IN ('devuser', 'oibchello');
-- Verify cleanup
SELECT 'Remaining old format entries in publicstream:' as info;
SELECT COUNT(*) as count FROM publicstream WHERE uid NOT LIKE '%@%' OR uid = username;
SELECT 'Remaining old format entries in uploadlog:' as info;
SELECT COUNT(*) as count FROM uploadlog WHERE uid NOT LIKE '%@%';
SELECT 'Remaining old format entries in userquota:' as info;
SELECT COUNT(*) as count FROM userquota WHERE uid NOT LIKE '%@%';

@@ -0,0 +1,17 @@
-- Cleanup script for orphaned uploadlog entries
-- These entries have username-based UIDs that should have been deleted with the user
-- Show what will be deleted
SELECT 'Orphaned uploadlog entries to delete:' as info;
SELECT uid, filename, processed_filename, created_at FROM uploadlog WHERE uid = 'oibchello';
-- Delete the orphaned entries
DELETE FROM uploadlog WHERE uid = 'oibchello';
-- Verify cleanup
SELECT 'Remaining uploadlog entries for oibchello:' as info;
SELECT COUNT(*) as count FROM uploadlog WHERE uid = 'oibchello';
-- Show all remaining uploadlog entries
SELECT 'All remaining uploadlog entries:' as info;
SELECT uid, filename, created_at FROM uploadlog ORDER BY created_at DESC;

@@ -0,0 +1,6 @@
-- Cleanup remaining orphaned uploadlog entries for devuser
DELETE FROM uploadlog WHERE uid = 'devuser';
-- Verify cleanup
SELECT 'All remaining uploadlog entries after cleanup:' as info;
SELECT uid, filename, created_at FROM uploadlog ORDER BY created_at DESC;

@@ -9,9 +9,50 @@ def concat_opus_files(user_dir: Path, output_file: Path):
     Concatenate all .opus files in user_dir (except stream.opus) in random order into output_file.
     Overwrites output_file if exists. Creates it if missing.
     """
-    files = [f for f in user_dir.glob('*.opus') if f.name != 'stream.opus']
+    # Clean up any existing filelist.txt to prevent issues
+    filelist_path = user_dir / 'filelist.txt'
+    if filelist_path.exists():
+        try:
+            filelist_path.unlink()
+        except Exception as e:
+            print(f"Warning: Could not clean up old filelist.txt: {e}")
+
+    # Get all opus files except stream.opus and remove any duplicates
+    import hashlib
+    file_hashes = set()
+    files = []
+    for f in user_dir.glob('*.opus'):
+        if f.name == 'stream.opus':
+            continue
+        try:
+            # Calculate file hash for duplicate detection
+            hasher = hashlib.md5()
+            with open(f, 'rb') as file:
+                buf = file.read(65536)  # Read in 64kb chunks
+                while len(buf) > 0:
+                    hasher.update(buf)
+                    buf = file.read(65536)
+            file_hash = hasher.hexdigest()
+            # Skip if we've seen this exact file before
+            if file_hash in file_hashes:
+                print(f"Removing duplicate file: {f.name}")
+                f.unlink()
+                continue
+            file_hashes.add(file_hash)
+            files.append(f)
+        except Exception as e:
+            print(f"Error processing {f}: {e}")
     if not files:
-        raise FileNotFoundError(f"No opus files to concatenate in {user_dir}")
+        # If no files, create an empty stream.opus
+        output_file.write_bytes(b'')
+        return output_file
     random.shuffle(files)
     # Create a filelist for ffmpeg concat
@@ -1,11 +1,33 @@
 # database.py — SQLModel engine/session for PostgreSQL
-from sqlmodel import create_engine, Session
+from sqlmodel import create_engine, Session, SQLModel
+from contextlib import contextmanager
 import os
-POSTGRES_URL = os.getenv("DATABASE_URL", "postgresql://d2s:kuTy4ZKs2VcjgDh6@localhost:5432/dictastream")
-engine = create_engine(POSTGRES_URL, echo=False)
+# Debug messages disabled
+POSTGRES_URL = os.getenv("DATABASE_URL", "postgresql://d2s:kuTy4ZKs2VcjgDh6@localhost:5432/dictastream")
+engine = create_engine(POSTGRES_URL, echo=False)  # Disable echo for production
+# SQLAlchemy Base class for models
+Base = SQLModel
+@contextmanager
 def get_db():
-    with Session(engine) as session:
-        yield session
+    """Session management context manager that ensures proper commit/rollback."""
+    session = Session(engine)
+    try:
+        # Debug messages disabled
+        yield session
+        session.commit()
+        # Debug messages disabled
+    except Exception as e:
+        # Debug messages disabled
+        session.rollback()
+        raise
+    finally:
+        # Debug messages disabled
+        session.close()
+# For backward compatibility
+get_db_deprecated = get_db
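Callers consume the reworked `get_db()` as a context manager, the same pattern used in `account_router.py` and `auth.py`; the query below is only an example:

```python
# How the new get_db() context manager is used elsewhere in this change set.
from database import get_db
from models import User

with get_db() as db:
    user = db.query(User).filter(User.email == "user@example.com").first()
    # commit happens automatically on clean exit, rollback on exception
```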

@@ -1,40 +0,0 @@
# dev_user.py — Script to create and confirm a dev user for dicta2stream
import os
from sqlmodel import Session
from database import engine
from models import User, UserQuota
from datetime import datetime
import uuid

USERNAME = os.getenv("DEV_USERNAME", "devuser")
EMAIL = os.getenv("DEV_EMAIL", "devuser@localhost")
IP = os.getenv("DEV_IP", "127.0.0.1")

with Session(engine) as session:
    user = session.get(User, EMAIL)
    if not user:
        token = str(uuid.uuid4())
        user = User(
            email=EMAIL,
            username=USERNAME,
            token=token,
            confirmed=True,
            ip=IP,
            token_created=datetime.utcnow()
        )
        session.add(user)
        print(f"[INFO] Created new dev user: {USERNAME} with email: {EMAIL}")
    else:
        user.confirmed = True
        user.ip = IP
        print(f"[INFO] Existing user found. Marked as confirmed: {USERNAME}")
    quota = session.get(UserQuota, USERNAME)
    if not quota:
        quota = UserQuota(uid=USERNAME, storage_bytes=0)
        session.add(quota)
        print(f"[INFO] Created quota for user: {USERNAME}")
    session.commit()
    print(f"[INFO] Dev user ready: {USERNAME} ({EMAIL}) — confirmed, IP={IP}")
    print(f"[INFO] To use: set localStorage uid and confirmed_uid to '{USERNAME}' in your browser.")

dicta2stream.service (new file)

@@ -0,0 +1,22 @@
[Unit]
Description=Dicta2Stream FastAPI application (Gunicorn)
After=network.target
[Service]
User=oib
Group=www-data
WorkingDirectory=/home/oib/games/dicta2stream
Environment="PATH=/home/oib/games/dicta2stream/venv/bin"
Environment="PYTHONPATH=/home/oib/games/dicta2stream"
ExecStart=/home/oib/games/dicta2stream/venv/bin/gunicorn -c gunicorn_config.py main:app
Restart=always
RestartSec=5
# Security
NoNewPrivileges=true
PrivateTmp=true
ProtectSystem=full
ProtectHome=read-only
[Install]
WantedBy=multi-user.target

@@ -0,0 +1,307 @@
--
-- PostgreSQL database dump
--
-- Dumped from database version 15.13 (Debian 15.13-0+deb12u1)
-- Dumped by pg_dump version 15.13 (Debian 15.13-0+deb12u1)
SET statement_timeout = 0;
SET lock_timeout = 0;
SET idle_in_transaction_session_timeout = 0;
SET client_encoding = 'UTF8';
SET standard_conforming_strings = on;
SELECT pg_catalog.set_config('search_path', '', false);
SET check_function_bodies = false;
SET xmloption = content;
SET client_min_messages = warning;
SET row_security = off;
SET default_tablespace = '';
SET default_table_access_method = heap;
--
-- Name: alembic_version; Type: TABLE; Schema: public; Owner: d2s
--
CREATE TABLE public.alembic_version (
version_num character varying(32) NOT NULL
);
ALTER TABLE public.alembic_version OWNER TO d2s;
--
-- Name: dbsession; Type: TABLE; Schema: public; Owner: d2s
--
CREATE TABLE public.dbsession (
token character varying NOT NULL,
uid character varying NOT NULL,
ip_address character varying NOT NULL,
user_agent character varying NOT NULL,
created_at timestamp without time zone NOT NULL,
expires_at timestamp without time zone NOT NULL,
is_active boolean NOT NULL,
last_activity timestamp without time zone NOT NULL
);
ALTER TABLE public.dbsession OWNER TO d2s;
--
-- Name: publicstream; Type: TABLE; Schema: public; Owner: d2s
--
CREATE TABLE public.publicstream (
uid character varying NOT NULL,
username character varying,
storage_bytes integer NOT NULL,
mtime integer NOT NULL,
last_updated timestamp without time zone,
created_at timestamp without time zone NOT NULL,
updated_at timestamp without time zone NOT NULL
);
ALTER TABLE public.publicstream OWNER TO d2s;
--
-- Name: uploadlog; Type: TABLE; Schema: public; Owner: d2s
--
CREATE TABLE public.uploadlog (
id integer NOT NULL,
uid character varying NOT NULL,
ip character varying NOT NULL,
filename character varying,
processed_filename character varying,
size_bytes integer NOT NULL,
created_at timestamp without time zone NOT NULL
);
ALTER TABLE public.uploadlog OWNER TO d2s;
--
-- Name: uploadlog_id_seq; Type: SEQUENCE; Schema: public; Owner: d2s
--
CREATE SEQUENCE public.uploadlog_id_seq
AS integer
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
ALTER TABLE public.uploadlog_id_seq OWNER TO d2s;
--
-- Name: uploadlog_id_seq; Type: SEQUENCE OWNED BY; Schema: public; Owner: d2s
--
ALTER SEQUENCE public.uploadlog_id_seq OWNED BY public.uploadlog.id;
--
-- Name: user; Type: TABLE; Schema: public; Owner: d2s
--
CREATE TABLE public."user" (
token_created timestamp without time zone NOT NULL,
email character varying NOT NULL,
username character varying NOT NULL,
token character varying NOT NULL,
confirmed boolean NOT NULL,
ip character varying NOT NULL
);
ALTER TABLE public."user" OWNER TO d2s;
--
-- Name: userquota; Type: TABLE; Schema: public; Owner: d2s
--
CREATE TABLE public.userquota (
uid character varying NOT NULL,
storage_bytes integer NOT NULL
);
ALTER TABLE public.userquota OWNER TO d2s;
--
-- Name: uploadlog id; Type: DEFAULT; Schema: public; Owner: d2s
--
ALTER TABLE ONLY public.uploadlog ALTER COLUMN id SET DEFAULT nextval('public.uploadlog_id_seq'::regclass);
--
-- Data for Name: alembic_version; Type: TABLE DATA; Schema: public; Owner: d2s
--
COPY public.alembic_version (version_num) FROM stdin;
\.
--
-- Data for Name: dbsession; Type: TABLE DATA; Schema: public; Owner: d2s
--
COPY public.dbsession (token, uid, ip_address, user_agent, created_at, expires_at, is_active, last_activity) FROM stdin;
6Y3PfCj-Mk3qLRttXCul8GTFZU9XWZtoHjk9I4EqnTE oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 10:32:21.725005 2025-08-07 10:32:21.724909 t 2025-08-06 10:32:21.725012
uGnwnfsAUzbNJZoqYsbT__tVxqfl4NtOD04UKYp8FEY oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 10:35:43.931018 2025-08-07 10:35:43.930918 t 2025-08-06 10:35:43.931023
OmKl-RrM8D4624xmNQigD3tdG4aXq8CzUq7Ch0qEhP4 oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 10:36:02.758938 2025-08-07 10:36:02.758873 t 2025-08-06 10:36:02.758941
gGpgdAbmpwY3a-zY1Ri92l7hUEjg-GyIt1o2kIDwBE8 oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 10:45:59.701084 2025-08-07 10:45:59.70098 t 2025-08-06 10:45:59.701091
GT9OKNxnhThcFXKvMBBVop7kczUH-4fE4bkCcRd17xE oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 10:46:14.181147 2025-08-07 10:46:14.181055 t 2025-08-06 10:46:14.181152
Ok0mwpRLa5Fuimt9eN0l-xUaaCmpipokTkOILSxJNuA oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 10:46:27.910441 2025-08-07 10:46:27.91036 t 2025-08-06 10:46:27.910444
DCTd4zCq_Lp_GxdwI14hFwZiDjfvNVvQrUVznllTdIA oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 10:46:35.928008 2025-08-07 10:46:35.927945 t 2025-08-06 10:46:35.928011
dtv0uti4QUudgMTnS1NRzZ9nD9vhLO1stM5bdXL4I1o oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 10:46:36.104031 2025-08-07 10:46:36.103944 t 2025-08-06 10:46:36.104034
NHZQSW6C2H-5Wq6Un6NqcAmnfSt1PqJeYJnwFKSjAss oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 10:51:33.897379 2025-08-07 10:51:33.897295 t 2025-08-06 10:51:33.897385
yYZeeLyXmwpyr8Uu1szIyyoIpLc7qiWfQwB57f4kqNI oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 10:53:43.711315 2025-08-07 10:53:43.711223 t 2025-08-06 10:53:43.71132
KhH9FO4D15l3-SUUkFHjR5Oj1N6Ld-NLmkzaM1QMhtU oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 10:56:22.050456 2025-08-07 10:56:22.050377 t 2025-08-06 10:56:22.050461
zPQqqHEY4l7ZhLrBPBnvQdsQhQj1_j0n9H6CCnIAME8 oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 11:29:49.412786 2025-08-07 11:29:49.412706 t 2025-08-06 11:29:49.412792
oxYZ9qTaezYliV6UtsI62RpPClj7rIAVXK_1FB3gYMQ oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 11:34:42.099366 2025-08-07 11:34:42.099276 t 2025-08-06 11:34:42.099371
Ml6aHvae2EPXs9SWZX1BI_mNKgasjIVRMWnUSwKwixQ oib@chello.at 127.0.0.1 Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 2025-08-06 11:38:06.002942 2025-08-07 11:38:06.002845 t 2025-08-06 11:38:06.002949
\.
--
-- Data for Name: publicstream; Type: TABLE DATA; Schema: public; Owner: d2s
--
COPY public.publicstream (uid, username, storage_bytes, mtime, last_updated, created_at, updated_at) FROM stdin;
oib@chello.at oibchello 16151127 1754453233 2025-08-06 06:22:53.97839 2025-08-06 06:07:13.525122 2025-08-06 06:07:13.525126
\.
--
-- Data for Name: uploadlog; Type: TABLE DATA; Schema: public; Owner: d2s
--
COPY public.uploadlog (id, uid, ip, filename, processed_filename, size_bytes, created_at) FROM stdin;
111 oib@chello.at 127.0.0.1 Taös - Bobstep [ Dubstep ] [1YGV5cNJrt0].opus 210388e1-2a9b-4b7c-a72f-d4059111ee80.opus 688750 2025-08-06 06:22:53.970258
112 oib@chello.at backfilled 107_5e6c3567-7457-48f4-83fc-f3073f065718.opus 107_5e6c3567-7457-48f4-83fc-f3073f065718.opus 671050 2025-08-06 08:14:43.312825
99 oib@chello.at 127.0.0.1 Pendulum - Set Me On Fire (Rasta Dubstep Rastep Raggastep) [ndShSlWMaeA].opus b0afe675-de49-43eb-ab77-86e592934342.opus 1051596 2025-08-06 06:07:13.504649
100 oib@chello.at 127.0.0.1 Roots Reggae (1976) [Unreleased Album] Judah Khamani - Twelve Gates of Rebirth [94NDoPCjRL0].opus 6e0e4d7c-31a6-4d3b-ad26-1ccb8aeaaf55.opus 4751764 2025-08-06 06:08:00.96213
101 oib@chello.at backfilled 98_15ba146a-8285-4233-9d44-e77e5fc19cd6.opus 98_15ba146a-8285-4233-9d44-e77e5fc19cd6.opus 805775 2025-08-06 08:05:27.805988
102 oib@chello.at backfilled 97_74e975bf-22f8-4b98-8111-dbcd195a62a2.opus 97_74e975bf-22f8-4b98-8111-dbcd195a62a2.opus 775404 2025-08-06 07:57:50.570271
103 oib@chello.at backfilled 99_b0afe675-de49-43eb-ab77-86e592934342.opus 99_b0afe675-de49-43eb-ab77-86e592934342.opus 1051596 2025-08-06 08:07:13.493002
104 oib@chello.at backfilled 100_6e0e4d7c-31a6-4d3b-ad26-1ccb8aeaaf55.opus 100_6e0e4d7c-31a6-4d3b-ad26-1ccb8aeaaf55.opus 4751764 2025-08-06 08:08:00.944561
105 oib@chello.at backfilled stream.opus stream.opus 7384026 2025-08-06 08:08:01.540555
106 oib@chello.at 127.0.0.1 Roots Reggae (1973) [Unreleased Album] Judah Khamani - Scrolls of the Fire Lion🔥 [wZvlYr5Baa8].opus 516c2ea1-6bf3-4461-91c6-e7c47e913743.opus 4760432 2025-08-06 06:14:17.072377
107 oib@chello.at 127.0.0.1 Reggae Shark Dubstep remix [101PfefUH5A].opus 5e6c3567-7457-48f4-83fc-f3073f065718.opus 671050 2025-08-06 06:14:43.326351
108 oib@chello.at 127.0.0.1 SiriuX - RastaFari (Dubstep REMIX) [VVAWgX0IgxY].opus 25aa73c3-2a9c-4659-835d-8280a0381dc4.opus 939266 2025-08-06 06:17:55.519608
109 oib@chello.at 127.0.0.1 I'm Death, Straight Up DEATH WHISTLE (Wubbaduck x Auphinity DUBSTEP REMIX) [BK6_6RB2h64].opus 9c9b6356-d5b7-427f-9179-942593cd97e6.opus 805775 2025-08-06 06:19:41.29278
110 oib@chello.at 127.0.0.1 N.A.S.A. Way Down (feat. RZA, Barbie Hatch, & John Frusciante).mp3 72c4ce3e-c991-4fb4-b5ab-b2f83b6f616d.opus 901315 2025-08-06 06:22:01.727741
113 oib@chello.at backfilled 110_72c4ce3e-c991-4fb4-b5ab-b2f83b6f616d.opus 110_72c4ce3e-c991-4fb4-b5ab-b2f83b6f616d.opus 901315 2025-08-06 08:22:01.71671
114 oib@chello.at backfilled 108_25aa73c3-2a9c-4659-835d-8280a0381dc4.opus 108_25aa73c3-2a9c-4659-835d-8280a0381dc4.opus 939266 2025-08-06 08:17:55.511047
115 oib@chello.at backfilled 106_516c2ea1-6bf3-4461-91c6-e7c47e913743.opus 106_516c2ea1-6bf3-4461-91c6-e7c47e913743.opus 4760432 2025-08-06 08:14:17.057068
116 oib@chello.at backfilled 109_9c9b6356-d5b7-427f-9179-942593cd97e6.opus 109_9c9b6356-d5b7-427f-9179-942593cd97e6.opus 805775 2025-08-06 08:19:41.282058
117 oib@chello.at backfilled 111_210388e1-2a9b-4b7c-a72f-d4059111ee80.opus 111_210388e1-2a9b-4b7c-a72f-d4059111ee80.opus 688750 2025-08-06 08:22:53.960209
\.
--
-- Data for Name: user; Type: TABLE DATA; Schema: public; Owner: d2s
--
COPY public."user" (token_created, email, username, token, confirmed, ip) FROM stdin;
2025-08-06 11:37:50.164201 oib@chello.at oibchello 69aef338-4f18-44b2-96bb-403245901d06 t 127.0.0.1
\.
--
-- Data for Name: userquota; Type: TABLE DATA; Schema: public; Owner: d2s
--
COPY public.userquota (uid, storage_bytes) FROM stdin;
oib@chello.at 16151127
\.
--
-- Name: uploadlog_id_seq; Type: SEQUENCE SET; Schema: public; Owner: d2s
--
SELECT pg_catalog.setval('public.uploadlog_id_seq', 117, true);
--
-- Name: alembic_version alembic_version_pkc; Type: CONSTRAINT; Schema: public; Owner: d2s
--
ALTER TABLE ONLY public.alembic_version
ADD CONSTRAINT alembic_version_pkc PRIMARY KEY (version_num);
--
-- Name: dbsession dbsession_pkey; Type: CONSTRAINT; Schema: public; Owner: d2s
--
ALTER TABLE ONLY public.dbsession
ADD CONSTRAINT dbsession_pkey PRIMARY KEY (token);
--
-- Name: publicstream publicstream_pkey; Type: CONSTRAINT; Schema: public; Owner: d2s
--
ALTER TABLE ONLY public.publicstream
ADD CONSTRAINT publicstream_pkey PRIMARY KEY (uid);
--
-- Name: uploadlog uploadlog_pkey; Type: CONSTRAINT; Schema: public; Owner: d2s
--
ALTER TABLE ONLY public.uploadlog
ADD CONSTRAINT uploadlog_pkey PRIMARY KEY (id);
--
-- Name: user user_pkey; Type: CONSTRAINT; Schema: public; Owner: d2s
--
ALTER TABLE ONLY public."user"
ADD CONSTRAINT user_pkey PRIMARY KEY (email);
--
-- Name: userquota userquota_pkey; Type: CONSTRAINT; Schema: public; Owner: d2s
--
ALTER TABLE ONLY public.userquota
ADD CONSTRAINT userquota_pkey PRIMARY KEY (uid);
--
-- Name: ix_publicstream_username; Type: INDEX; Schema: public; Owner: d2s
--
CREATE INDEX ix_publicstream_username ON public.publicstream USING btree (username);
--
-- Name: ix_user_username; Type: INDEX; Schema: public; Owner: d2s
--
CREATE UNIQUE INDEX ix_user_username ON public."user" USING btree (username);
--
-- Name: dbsession dbsession_user_id_fkey; Type: FK CONSTRAINT; Schema: public; Owner: d2s
--
ALTER TABLE ONLY public.dbsession
ADD CONSTRAINT dbsession_user_id_fkey FOREIGN KEY (uid) REFERENCES public."user"(email);
--
-- PostgreSQL database dump complete
--

docs/auth-consolidation.md Normal file

@ -0,0 +1,131 @@
# Authentication Logic Consolidation
## Overview
The authentication logic has been consolidated from multiple scattered files into a single, centralized `AuthManager` class. This improves maintainability, reduces code duplication, and provides a consistent authentication interface.
## Files Changed
### 1. New Centralized Module
- **`static/auth-manager.js`** - New centralized authentication manager class
### 2. Refactored Files
- **`static/auth.js`** - Simplified to use AuthManager
- **`static/magic-login.js`** - Simplified to use AuthManager
- **`static/cleanup-auth.js`** - Simplified to use AuthManager
## AuthManager Features
### Core Functionality
- **Centralized State Management** - Single source of truth for authentication state
- **Cookie & localStorage Management** - Consistent handling of auth data storage
- **Magic Link Processing** - Handles both URL-based and token-based magic login
- **Authentication Polling** - Periodic state checks with caching and debouncing
- **User Session Management** - Login, logout, and account deletion
### Key Methods
- `initialize()` - Initialize the auth manager and handle magic login
- `setAuthState(email, username, token)` - Set authentication state
- `clearAuthState()` - Clear all authentication data
- `isAuthenticated()` - Check current authentication status
- `getCurrentUser()` - Get current user data
- `logout()` - Perform logout and redirect
- `deleteAccount()` - Handle account deletion
- `cleanupAuthState(email)` - Clean up inconsistent auth state
### Authentication Flow
1. **Magic Login Detection** - Checks URL parameters for login tokens/success
2. **User Info Retrieval** - Fetches email from the `/api/me` endpoint (see the sketch after this list)
3. **State Setting** - Sets email as primary UID, username for display
4. **UI Updates** - Updates body classes and initializes user session
5. **Navigation** - Redirects to user profile page
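
The exact `/api/me` contract is not spelled out in this changeset. The sketch below is a minimal FastAPI-style illustration of the response shape the flow above assumes (an authenticated request answered with the user's email and username); the handler body and the cookie it reads are assumptions, only the two response fields are implied by the flow.

```python
# Hypothetical sketch — not the project's actual /api/me handler.
# Assumed to be mounted under the /api prefix, as main.py does for auth_router.
from fastapi import APIRouter, Request, HTTPException

router = APIRouter()

@router.get("/me")
async def me(request: Request):
    uid = request.cookies.get("uid")  # email-based UID cookie set at login (assumption)
    if not uid:
        raise HTTPException(status_code=401, detail="Not authenticated")
    # A real implementation would look the user up in the database.
    return {"email": uid, "username": uid.split("@")[0]}
```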
## Data Storage Strategy
### localStorage Keys
- `uid` - Primary identifier (email-based)
- `user_email` - Explicit email storage
- `username` - Display name (separate from UID)
- `authToken` - Authentication token
- `isAuthenticated` - Boolean authentication state
- `uid_time` - Session timestamp
### Cookie Strategy
- `uid` - Email-based UID with `SameSite=Lax`
- `authToken` - Auth token with `SameSite=Lax; Secure`
- `isAuthenticated` - Boolean flag with `SameSite=Lax`
## Removed Redundancy
### Eliminated Duplicate Code
- **User info fetching** - Centralized in `fetchUserInfo()`
- **Auth state setting** - Centralized in `setAuthState()`
- **Cookie management** - Centralized in `setAuthState()` and `clearAuthState()`
- **Magic login processing** - Centralized in `processMagicLogin()` and `processTokenLogin()`
### Removed Fields
- `confirmed_uid` - Was duplicate of `uid`, now eliminated
## Backward Compatibility
### Global Functions (Legacy Support)
- `window.getCurrentUser()` - Get current user data
- `window.isAuthenticated()` - Check authentication status
- `window.logout()` - Perform logout
- `window.cleanupAuthState(email)` - Clean up auth state
### Existing Function Exports
- `initMagicLogin()` - Maintained in magic-login.js for compatibility
- `cleanupAuthState()` - Maintained in cleanup-auth.js for compatibility
## Benefits Achieved
### 1. **Maintainability**
- Single source of authentication logic
- Consistent error handling and logging
- Easier to debug and modify
### 2. **Performance**
- Reduced code duplication
- Optimized caching and debouncing
- Fewer redundant API calls
### 3. **Reliability**
- Consistent state management
- Proper cleanup on logout
- Robust error handling
### 4. **Security**
- Consistent cookie security attributes
- Proper state clearing on logout
- Centralized validation
## Migration Notes
### For Developers
- Import `authManager` from `./auth-manager.js` for new code
- Use `authManager.isAuthenticated()` instead of manual checks
- Use `authManager.getCurrentUser()` for user data
- Legacy global functions still work for existing code
### Testing
- Test magic link login (both URL and token-based)
- Test authentication state persistence
- Test logout and account deletion
- Test authentication polling and state changes
## Future Improvements
### Potential Enhancements
1. **Token Refresh** - Automatic token renewal
2. **Session Timeout** - Configurable session expiration
3. **Multi-tab Sync** - Better cross-tab authentication sync
4. **Audit Logging** - Enhanced authentication event logging
5. **Rate Limiting** - Protection against auth abuse
### Configuration Options
Consider adding configuration for:
- Polling intervals
- Cache TTL values
- Debug logging levels
- Cookie security settings

execute_db_cleanup.py Normal file

@ -0,0 +1,221 @@
#!/usr/bin/env python3
"""
Execute Database Legacy Data Cleanup
Fixes issues identified in the database analysis using direct SQL execution
"""
import sys
from sqlmodel import Session, text
from database import engine
def execute_step(session, step_name, query, description):
"""Execute a cleanup step and report results"""
print(f"\n=== {step_name} ===")
print(f"Description: {description}")
print(f"Query: {query}")
try:
result = session.exec(text(query))
if query.strip().upper().startswith('SELECT'):
rows = result.fetchall()
print(f"Result: {len(rows)} rows")
for row in rows:
print(f" {row}")
else:
session.commit()
print(f"✅ Success: {result.rowcount} rows affected")
return True
except Exception as e:
print(f"❌ Error: {e}")
session.rollback()
return False
def main():
"""Execute database cleanup step by step"""
print("=== DATABASE LEGACY DATA CLEANUP ===")
with Session(engine) as session:
success_count = 0
total_steps = 0
# Step 1: Fix User Table - Update username to match email format
total_steps += 1
if execute_step(
session,
"STEP 1: Fix User Table",
"""UPDATE "user"
SET username = email,
display_name = CASE
WHEN display_name = '' OR display_name IS NULL
THEN split_part(email, '@', 1)
ELSE display_name
END
WHERE email = 'oib@chello.at'""",
"Update username to match email format and set display_name"
):
success_count += 1
# Verify Step 1
execute_step(
session,
"VERIFY STEP 1",
"""SELECT email, username, display_name, confirmed
FROM "user" WHERE email = 'oib@chello.at'""",
"Verify user table fix"
)
# Step 2: Clean Up Expired Sessions
total_steps += 1
if execute_step(
session,
"STEP 2: Mark Expired Sessions Inactive",
"""UPDATE dbsession
SET is_active = false
WHERE expires_at < NOW() AND is_active = true""",
"Mark expired sessions as inactive for security"
):
success_count += 1
# Verify Step 2
execute_step(
session,
"VERIFY STEP 2",
"""SELECT COUNT(*) as expired_active_sessions
FROM dbsession
WHERE expires_at < NOW() AND is_active = true""",
"Check for remaining expired active sessions"
)
# Step 3: Update Session user_id to Email Format
total_steps += 1
if execute_step(
session,
"STEP 3: Update Session user_id",
"""UPDATE dbsession
SET user_id = 'oib@chello.at'
WHERE user_id = 'oibchello'""",
"Update session user_id to use email format"
):
success_count += 1
# Verify Step 3
execute_step(
session,
"VERIFY STEP 3",
"""SELECT DISTINCT user_id FROM dbsession""",
"Check session user_id values"
)
# Step 4: Fix PublicStream Username Fields
total_steps += 1
if execute_step(
session,
"STEP 4: Fix PublicStream",
"""UPDATE publicstream
SET username = uid,
display_name = CASE
WHEN display_name = 'oibchello'
THEN split_part(uid, '@', 1)
ELSE display_name
END
WHERE uid = 'oib@chello.at'""",
"Update PublicStream username to match UID"
):
success_count += 1
# Verify Step 4
execute_step(
session,
"VERIFY STEP 4",
"""SELECT uid, username, display_name
FROM publicstream WHERE uid = 'oib@chello.at'""",
"Verify PublicStream fix"
)
# Step 5: Remove Orphaned Records
total_steps += 1
orphan_success = True
# Remove orphaned quota record
if not execute_step(
session,
"STEP 5a: Remove Orphaned Quota",
"""DELETE FROM userquota WHERE uid = 'oib@bubuit.net'""",
"Remove orphaned quota record for deleted user"
):
orphan_success = False
# Remove orphaned stream record
if not execute_step(
session,
"STEP 5b: Remove Orphaned Stream",
"""DELETE FROM publicstream WHERE uid = 'oib@bubuit.net'""",
"Remove orphaned stream record for deleted user"
):
orphan_success = False
if orphan_success:
success_count += 1
# Verify Step 5
execute_step(
session,
"VERIFY STEP 5",
"""SELECT 'userquota' as table_name, COUNT(*) as count
FROM userquota WHERE uid = 'oib@bubuit.net'
UNION ALL
SELECT 'publicstream' as table_name, COUNT(*) as count
FROM publicstream WHERE uid = 'oib@bubuit.net'""",
"Verify orphaned records are removed"
)
# Final Verification
print(f"\n=== FINAL VERIFICATION ===")
# Check for remaining issues
execute_step(
session,
"FINAL CHECK",
"""SELECT 'ISSUE: User email/username mismatch' as issue
FROM "user"
WHERE email != username
UNION ALL
SELECT 'ISSUE: Expired active sessions'
FROM dbsession
WHERE expires_at < NOW() AND is_active = true
LIMIT 1
UNION ALL
SELECT 'ISSUE: PublicStream UID/username mismatch'
FROM publicstream
WHERE uid != username
LIMIT 1
UNION ALL
SELECT 'ISSUE: Orphaned quota records'
FROM userquota q
LEFT JOIN "user" u ON q.uid = u.email
WHERE u.email IS NULL
LIMIT 1
UNION ALL
SELECT 'ISSUE: Orphaned stream records'
FROM publicstream p
LEFT JOIN "user" u ON p.uid = u.email
WHERE u.email IS NULL
LIMIT 1""",
"Check for any remaining legacy issues"
)
# Summary
print(f"\n=== CLEANUP SUMMARY ===")
print(f"Total steps: {total_steps}")
print(f"Successful steps: {success_count}")
print(f"Failed steps: {total_steps - success_count}")
if success_count == total_steps:
print("✅ All legacy database issues have been fixed!")
else:
print("⚠️ Some issues remain. Check the output above for details.")
return 0 if success_count == total_steps else 1
if __name__ == "__main__":
sys.exit(main())

fix_db_constraints.py Normal file

@ -0,0 +1,174 @@
#!/usr/bin/env python3
"""
Fix Database Constraints and Legacy Data
Handles foreign key constraints properly during cleanup
"""
import sys
from sqlmodel import Session, text
from database import engine
def execute_query(session, query, description):
"""Execute a query and report results"""
print(f"\n{description}")
print(f"Query: {query}")
try:
result = session.exec(text(query))
if query.strip().upper().startswith('SELECT'):
rows = result.fetchall()
print(f"Result: {len(rows)} rows")
for row in rows:
print(f" {row}")
else:
session.commit()
print(f"✅ Success: {result.rowcount} rows affected")
return True
except Exception as e:
print(f"❌ Error: {e}")
session.rollback()
return False
def main():
"""Fix database constraints and legacy data"""
print("=== FIXING DATABASE CONSTRAINTS AND LEGACY DATA ===")
with Session(engine) as session:
# Step 1: First, let's temporarily drop the foreign key constraint
print("\n=== STEP 1: Handle Foreign Key Constraint ===")
# Check current constraint
execute_query(
session,
"""SELECT conname, conrelid::regclass, confrelid::regclass
FROM pg_constraint
WHERE conname = 'dbsession_user_id_fkey'""",
"Check existing foreign key constraint"
)
# Drop the constraint temporarily
execute_query(
session,
"""ALTER TABLE dbsession DROP CONSTRAINT IF EXISTS dbsession_user_id_fkey""",
"Drop foreign key constraint temporarily"
)
# Step 2: Update user table
print("\n=== STEP 2: Update User Table ===")
execute_query(
session,
"""UPDATE "user"
SET username = email,
display_name = CASE
WHEN display_name = '' OR display_name IS NULL
THEN split_part(email, '@', 1)
ELSE display_name
END
WHERE email = 'oib@chello.at'""",
"Update user username to match email"
)
# Verify user update
execute_query(
session,
"""SELECT email, username, display_name FROM "user" WHERE email = 'oib@chello.at'""",
"Verify user table update"
)
# Step 3: Update session user_id references
print("\n=== STEP 3: Update Session References ===")
execute_query(
session,
"""UPDATE dbsession
SET user_id = 'oib@chello.at'
WHERE user_id = 'oibchello'""",
"Update session user_id to email format"
)
# Verify session updates
execute_query(
session,
"""SELECT DISTINCT user_id FROM dbsession""",
"Verify session user_id updates"
)
# Step 4: Recreate the foreign key constraint
print("\n=== STEP 4: Recreate Foreign Key Constraint ===")
execute_query(
session,
"""ALTER TABLE dbsession
ADD CONSTRAINT dbsession_user_id_fkey
FOREIGN KEY (user_id) REFERENCES "user"(username)""",
"Recreate foreign key constraint"
)
# Step 5: Final verification - check for remaining issues
print("\n=== STEP 5: Final Verification ===")
# Check user email/username match
execute_query(
session,
"""SELECT email, username,
CASE WHEN email = username THEN '✓ Match' ELSE '✗ Mismatch' END as status
FROM "user""",
"Check user email/username consistency"
)
# Check expired sessions
execute_query(
session,
"""SELECT COUNT(*) as expired_active_sessions
FROM dbsession
WHERE expires_at < NOW() AND is_active = true""",
"Check for expired active sessions"
)
# Check PublicStream consistency
execute_query(
session,
"""SELECT uid, username,
CASE WHEN uid = username THEN '✓ Match' ELSE '✗ Mismatch' END as status
FROM publicstream""",
"Check PublicStream UID/username consistency"
)
# Check for orphaned records
execute_query(
session,
"""SELECT 'userquota' as table_name, COUNT(*) as orphaned_records
FROM userquota q
LEFT JOIN "user" u ON q.uid = u.email
WHERE u.email IS NULL
UNION ALL
SELECT 'publicstream' as table_name, COUNT(*) as orphaned_records
FROM publicstream p
LEFT JOIN "user" u ON p.uid = u.email
WHERE u.email IS NULL""",
"Check for orphaned records"
)
# Summary of current state
print("\n=== DATABASE STATE SUMMARY ===")
execute_query(
session,
"""SELECT
COUNT(DISTINCT u.email) as total_users,
COUNT(DISTINCT q.uid) as quota_records,
COUNT(DISTINCT p.uid) as stream_records,
COUNT(CASE WHEN s.is_active THEN 1 END) as active_sessions,
COUNT(CASE WHEN s.expires_at < NOW() AND s.is_active THEN 1 END) as expired_active_sessions
FROM "user" u
FULL OUTER JOIN userquota q ON u.email = q.uid
FULL OUTER JOIN publicstream p ON u.email = p.uid
FULL OUTER JOIN dbsession s ON u.username = s.user_id""",
"Database state summary"
)
print("\n✅ Database cleanup completed!")
print("All legacy data issues should now be resolved.")
return 0
if __name__ == "__main__":
sys.exit(main())

fix_dbsession_fk.sql Normal file

@ -0,0 +1,13 @@
-- Migration script to update DBSession foreign key to reference user.email
-- Run this when no active sessions exist to avoid deadlocks
BEGIN;
-- Step 1: Drop the existing foreign key constraint if it exists
ALTER TABLE dbsession DROP CONSTRAINT IF EXISTS dbsession_user_id_fkey;
-- Step 2: Add the new foreign key constraint referencing user.email
ALTER TABLE dbsession ADD CONSTRAINT dbsession_uid_fkey
FOREIGN KEY (uid) REFERENCES "user"(email);
COMMIT;
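
A minimal sketch of applying this migration from Python, reusing the SQLModel Session/text pattern of the cleanup scripts above; the helper name, the default path, and the assumption that database.engine is importable are illustrative only:

#!/usr/bin/env python3
"""Apply fix_dbsession_fk.sql through SQLModel (illustrative sketch)."""
from pathlib import Path
from sqlmodel import Session, text
from database import engine  # assumes the project's existing database module

def apply_migration(path: str = "fix_dbsession_fk.sql") -> None:
    sql = Path(path).read_text()  # the file already wraps its statements in BEGIN/COMMIT
    with Session(engine) as session:
        session.exec(text(sql))
        session.commit()

if __name__ == "__main__":
    apply_migration()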


@ -1,10 +1,16 @@
bind = "0.0.0.0:8000" bind = "0.0.0.0:8000"
workers = 2 # Tune based on available CPU cores workers = 2 # Tune based on available CPU cores
worker_class = "uvicorn.workers.UvicornWorker" worker_class = "uvicorn.workers.UvicornWorker"
timeout = 60 timeout = 300 # Increased from 60 to 300 seconds (5 minutes)
keepalive = 30 keepalive = 30
loglevel = "info" loglevel = "info"
accesslog = "-" accesslog = "-"
errorlog = "-" errorlog = "-"
proxy_allow_ips = "*" proxy_allow_ips = "*"
max_requests = 1000
max_requests_jitter = 50
worker_connections = 1000
limit_request_line = 0 # No limit on request line size
limit_request_field_size = 0 # No limit on field size
limit_request_fields = 100 # Limit number of header fields

gunicorn_config.py Normal file

@ -0,0 +1,35 @@
# Gunicorn configuration file
import multiprocessing
import os
# Server socket
bind = "0.0.0.0:8000"
# Worker processes
workers = multiprocessing.cpu_count() * 2 + 1
worker_class = "uvicorn.workers.UvicornWorker"
worker_connections = 1000
max_requests = 1000
max_requests_jitter = 50
timeout = 120
keepalive = 5
# Security
limit_request_line = 4094
limit_request_fields = 50
limit_request_field_size = 8190
# Debugging
debug = os.getenv("DEBUG", "false").lower() == "true"
reload = debug
# Logging
loglevel = "debug" if debug else "info"
accesslog = "-" # Log to stdout
errorlog = "-" # Log to stderr
# Server mechanics
preload_app = True
# Process naming
proc_name = "dicta2stream"


@ -1,64 +1,156 @@
# list_streams.py — FastAPI route to list all public streams (users with stream.opus)
-from fastapi import APIRouter
+from fastapi import APIRouter, Request, Depends
+from fastapi.responses import StreamingResponse, Response
+from sqlalchemy.orm import Session
+from sqlalchemy import select
+from models import PublicStream
+from database import get_db
from pathlib import Path
-from fastapi.responses import StreamingResponse
import asyncio
+import os
+import json
router = APIRouter()
DATA_ROOT = Path("./data")
@router.get("/streams-sse")
-def streams_sse():
-    return list_streams_sse()
-import json
-import datetime
-def list_streams_sse():
-    async def event_generator():
-        txt_path = Path("./public_streams.txt")
-        if not txt_path.exists():
-            print(f"[{datetime.datetime.now()}] [SSE] No public_streams.txt found")
-            yield f"data: {json.dumps({'end': True})}\n\n"
-            return
-        try:
-            with txt_path.open("r") as f:
-                for line in f:
-                    line = line.strip()
-                    if not line:
-                        continue
-                    try:
-                        stream = json.loads(line)
-                        print(f"[{datetime.datetime.now()}] [SSE] Yielding stream: {stream}")
-                        yield f"data: {json.dumps(stream)}\n\n"
-                        await asyncio.sleep(0) # Yield control to event loop
-                    except Exception as e:
-                        print(f"[{datetime.datetime.now()}] [SSE] JSON decode error: {e}")
-                        continue # skip malformed lines
-            print(f"[{datetime.datetime.now()}] [SSE] Yielding end event")
-            yield f"data: {json.dumps({'end': True})}\n\n"
-        except Exception as e:
-            print(f"[{datetime.datetime.now()}] [SSE] Exception: {e}")
-            yield f"data: {json.dumps({'end': True, 'error': True})}\n\n"
-    return StreamingResponse(event_generator(), media_type="text/event-stream")
-def list_streams():
-    txt_path = Path("./public_streams.txt")
-    if not txt_path.exists():
-        return {"streams": []}
-    try:
-        streams = []
-        with txt_path.open("r") as f:
-            for line in f:
-                line = line.strip()
-                if not line:
-                    continue
-                try:
-                    streams.append(json.loads(line))
-                except Exception:
-                    continue # skip malformed lines
-        return {"streams": streams}
-    except Exception:
-        return {"streams": []}
+async def streams_sse(request: Request):
+    # Add CORS headers for SSE
+    origin = request.headers.get('origin', '')
+    allowed_origins = ["https://dicta2stream.net", "http://localhost:8000", "http://127.0.0.1:8000"]
+    # Use the request origin if it's in the allowed list, otherwise use the first allowed origin
+    cors_origin = origin if origin in allowed_origins else allowed_origins[0]
+    headers = {
+        "Content-Type": "text/event-stream",
+        "Cache-Control": "no-cache, no-transform",
+        "Connection": "keep-alive",
+        "Access-Control-Allow-Origin": cors_origin,
+        "Access-Control-Allow-Credentials": "true",
+        "Access-Control-Expose-Headers": "Content-Type",
+        "X-Accel-Buffering": "no" # Disable buffering for nginx
+    }
+    # Handle preflight requests
+    if request.method == "OPTIONS":
+        headers.update({
+            "Access-Control-Allow-Methods": "GET, OPTIONS",
+            "Access-Control-Allow-Headers": request.headers.get("access-control-request-headers", "*"),
+            "Access-Control-Max-Age": "86400" # 24 hours
+        })
+        return Response(status_code=204, headers=headers)
+    async def event_wrapper():
+        # Use the database session context manager
+        with get_db() as db:
+            try:
+                async for event in list_streams_sse(db):
+                    yield event
+            except Exception as e:
+                # Only log errors if DEBUG is enabled
+                # Debug messages disabled
+                yield f"data: {json.dumps({'error': True, 'message': 'An error occurred'})}\n\n"
+    return StreamingResponse(
+        event_wrapper(),
+        media_type="text/event-stream",
+        headers=headers
+    )
+async def list_streams_sse(db):
+    """Stream public streams from the database as Server-Sent Events"""
+    try:
+        # Send initial ping
+        yield ":ping\n\n"
+        # Query all public streams from the database with required fields
+        # Also get all valid users to filter out orphaned streams
+        from models import User
+        # Use the query interface instead of execute
+        all_streams = db.query(PublicStream).order_by(PublicStream.mtime.desc()).all()
+        # Get all valid user UIDs (email and username)
+        all_users = db.query(User).all()
+        valid_uids = set()
+        for user in all_users:
+            valid_uids.add(user.email)
+            valid_uids.add(user.username)
+        # Filter out orphaned streams (streams without corresponding user accounts)
+        streams = []
+        orphaned_count = 0
+        for stream in all_streams:
+            if stream.uid in valid_uids:
+                streams.append(stream)
+            else:
+                orphaned_count += 1
+                print(f"[STREAMS] Filtering out orphaned stream: {stream.uid} (username: {stream.username})")
+        if orphaned_count > 0:
+            print(f"[STREAMS] Filtered out {orphaned_count} orphaned streams from public display")
+        if not streams:
+            print("No public streams found in the database")
+            yield f"data: {json.dumps({'end': True})}\n\n"
+            return
+        # Debug messages disabled
+        # Send each stream as an SSE event
+        for stream in streams:
+            try:
+                # Ensure we have all required fields with fallbacks
+                stream_data = {
+                    'uid': stream.uid or '',
+                    'size': stream.storage_bytes or 0,
+                    'mtime': int(stream.mtime) if stream.mtime is not None else 0,
+                    'username': stream.username or '',
+                    'created_at': stream.created_at.isoformat() if stream.created_at else None,
+                    'updated_at': stream.updated_at.isoformat() if stream.updated_at else None
+                }
+                # Debug messages disabled
+                yield f"data: {json.dumps(stream_data)}\n\n"
+                # Small delay to prevent overwhelming the client
+                await asyncio.sleep(0.1)
+            except Exception as e:
+                print(f"Error processing stream {stream.uid}: {str(e)}")
+                # Debug messages disabled
+                continue
+        # Send end of stream marker
+        # Debug messages disabled
+        yield f"data: {json.dumps({'end': True})}\n\n"
+    except Exception as e:
+        print(f"Error in list_streams_sse: {str(e)}")
+        # Debug messages disabled
+        yield f"data: {json.dumps({'error': True, 'message': str(e)})}\n\n"
+@router.get("/streams")
+def list_streams():
+    """List all public streams from the database"""
+    # Use the database session context manager
+    with get_db() as db:
+        try:
+            # Use the query interface instead of execute
+            streams = db.query(PublicStream).order_by(PublicStream.mtime.desc()).all()
+            return {
+                "streams": [
+                    {
+                        'uid': stream.uid,
+                        'size': stream.size,
+                        'mtime': stream.mtime,
+                        'created_at': stream.created_at.isoformat() if stream.created_at else None,
+                        'updated_at': stream.updated_at.isoformat() if stream.updated_at else None
+                    }
+                    for stream in streams
+                ]
+            }
+        except Exception as e:
+            # Debug messages disabled
+            return {"streams": []}


@ -1,23 +0,0 @@
# list_user_files.py
from fastapi import APIRouter, Depends, HTTPException
from pathlib import Path
from models import User
from database import get_db
router = APIRouter()
@router.get("/user-files/{uid}")
def list_user_files(uid: str, db = Depends(get_db)):
# Check user exists and is confirmed
from sqlmodel import select
user = db.exec(select(User).where((User.username == uid) | (User.email == uid))).first()
if user is not None and not isinstance(user, User) and hasattr(user, "__getitem__"):
user = user[0]
if not user or not user.confirmed:
raise HTTPException(status_code=403, detail="Account not confirmed")
user_dir = Path("data") / uid
if not user_dir.exists() or not user_dir.is_dir():
return {"files": []}
files = [f.name for f in user_dir.iterdir() if f.is_file() and not f.name.startswith(".")]
files.sort()
return {"files": files}

log.py

@ -15,5 +15,6 @@ def log_violation(event: str, ip: str, uid: str, reason: str):
        f.write(log_entry)
    # If DEBUG mode, also print to stdout
    if os.getenv("DEBUG", "0") in ("1", "true", "True"): # Set DEBUG=1 in .env to enable
-        print(f"[DEBUG] {log_entry.strip()}")
+        # Debug messages disabled
+        pass

magic.py

@ -1,34 +1,118 @@
# magic.py — handle magic token login confirmation
-from fastapi import APIRouter, Form, HTTPException, Depends, Request
+from fastapi import APIRouter, Form, HTTPException, Depends, Request, Response
-from fastapi.responses import RedirectResponse
+from fastapi.responses import RedirectResponse, JSONResponse
from sqlmodel import Session, select
from database import get_db
-from models import User
+from models import User, DBSession
from datetime import datetime, timedelta
+import secrets
+import json
router = APIRouter()
@router.post("/magic-login")
-def magic_login(request: Request, db: Session = Depends(get_db), token: str = Form(...)):
-    print(f"[magic-login] Received token: {token}")
-    user = db.exec(select(User).where(User.token == token)).first()
-    print(f"[magic-login] User lookup: {'found' if user else 'not found'}")
-    if not user:
-        print("[magic-login] Invalid or expired token")
-        return RedirectResponse(url="/?error=Invalid%20or%20expired%20token", status_code=302)
-    if datetime.utcnow() - user.token_created > timedelta(minutes=30):
-        print(f"[magic-login] Token expired for user: {user.username}")
-        return RedirectResponse(url="/?error=Token%20expired", status_code=302)
-    if not user.confirmed:
-        user.confirmed = True
-        user.ip = request.client.host
-        db.commit()
-        print(f"[magic-login] User {user.username} confirmed. Redirecting to /?login=success&confirmed_uid={user.username}")
-    else:
-        print(f"[magic-login] Token already used for user: {user.username}, but allowing multi-use login.")
-    return RedirectResponse(url=f"/?login=success&confirmed_uid={user.username}", status_code=302)
+async def magic_login(request: Request, response: Response, token: str = Form(...)):
+    # Debug messages disabled
+    # Use the database session context manager
+    with get_db() as db:
+        try:
+            # Look up user by token
+            user = db.query(User).filter(User.token == token).first()
+            # Debug messages disabled
+            if not user:
+                # Debug messages disabled
+                raise HTTPException(status_code=401, detail="Invalid or expired token")
+            if datetime.utcnow() - user.token_created > timedelta(minutes=30):
+                # Debug messages disabled
+                raise HTTPException(status_code=401, detail="Token expired")
+            # Mark user as confirmed if not already
+            if not user.confirmed:
+                user.confirmed = True
+                user.ip = request.client.host
+                db.add(user)
+                # Debug messages disabled
+            # Create a new session for the user (valid for 24 hours)
+            session_token = secrets.token_urlsafe(32)
+            expires_at = datetime.utcnow() + timedelta(hours=24)
+            # Create new session
+            session = DBSession(
+                token=session_token,
+                uid=user.email or user.username, # Use email as UID
+                ip_address=request.client.host or "",
+                user_agent=request.headers.get("user-agent", ""),
+                expires_at=expires_at,
+                is_active=True
+            )
+            db.add(session)
+            db.commit()
+            # Store user data for use after the session is committed
+            user_email = user.email or user.username
+            username = user.username
+        except Exception as e:
+            db.rollback()
+            # Debug messages disabled
+            # Debug messages disabled
+            raise HTTPException(status_code=500, detail="Database error during login")
+    # Determine if we're running in development (localhost) or production
+    is_localhost = request.url.hostname == "localhost"
+    # Prepare response data
+    response_data = {
+        "success": True,
+        "message": "Login successful",
+        "user": {
+            "email": user_email,
+            "username": username
+        },
+        "token": session_token # Include the token in the JSON response
+    }
+    # Create the response
+    response = JSONResponse(
+        content=response_data,
+        status_code=200
+    )
+    # Set cookies
+    response.set_cookie(
+        key="sessionid",
+        value=session_token,
+        httponly=True,
+        secure=not is_localhost,
+        samesite="lax" if is_localhost else "none",
+        max_age=86400, # 24 hours
+        path="/"
+    )
+    response.set_cookie(
+        key="uid",
+        value=user_email,
+        samesite="lax" if is_localhost else "none",
+        secure=not is_localhost,
+        max_age=86400, # 24 hours
+        path="/"
+    )
+    response.set_cookie(
+        key="authToken",
+        value=session_token,
+        samesite="lax" if is_localhost else "none",
+        secure=not is_localhost,
+        max_age=86400, # 24 hours
+        path="/"
+    )
+    # Debug messages disabled
+    # Debug messages disabled
+    # Debug messages disabled
+    return response
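
A minimal sketch of exercising the reworked endpoint above from a script; the base URL, the token value, and the use of the requests package are assumptions, while the JSON fields and cookie names come from the handler itself:

import requests

def magic_login(token, base_url="http://localhost:8000"):
    # The endpoint accepts the token as form data and returns JSON plus auth cookies.
    resp = requests.post(f"{base_url}/magic-login", data={"token": token})
    resp.raise_for_status()
    body = resp.json()
    print("logged in as", body["user"]["email"])
    # sessionid / uid / authToken cookies mirror the values in the JSON payload.
    print("sessionid cookie:", resp.cookies.get("sessionid"))
    return body["token"]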

main.py

@ -1,6 +1,6 @@
# main.py — FastAPI backend entrypoint for dicta2stream
-from fastapi import FastAPI, Request, Response, status, Form, UploadFile, File, Depends
+from fastapi import FastAPI, Request, Response, status, Form, UploadFile, File, Depends, HTTPException
from fastapi.responses import HTMLResponse, RedirectResponse, StreamingResponse, JSONResponse
from fastapi.staticfiles import StaticFiles
from fastapi.middleware.cors import CORSMiddleware
@ -11,13 +11,14 @@ import traceback
import shutil
import mimetypes
from typing import Optional
-from models import User, UploadLog
+from models import User, UploadLog, UserQuota, get_user_by_uid
from sqlmodel import Session, select, SQLModel
from database import get_db, engine
from log import log_violation
import secrets
import time
import json
+import subprocess
from datetime import datetime
from dotenv import load_dotenv
@ -36,16 +37,36 @@ from fastapi.requests import Request as FastAPIRequest
from fastapi.exception_handlers import RequestValidationError
from fastapi.exceptions import HTTPException as FastAPIHTTPException
-app = FastAPI(debug=debug_mode)
+app = FastAPI(debug=debug_mode, docs_url=None, redoc_url=None, openapi_url=None)
# Override default HTML error handlers to return JSON
from fastapi.exceptions import RequestValidationError, HTTPException as FastAPIHTTPException
from fastapi.responses import JSONResponse
from starlette.exceptions import HTTPException as StarletteHTTPException
@app.exception_handler(StarletteHTTPException)
async def http_exception_handler(request, exc):
return JSONResponse(
status_code=exc.status_code,
content={"detail": exc.detail}
)
# --- CORS Middleware for SSE and API access ---
from fastapi.middleware.cors import CORSMiddleware
from fastapi.middleware.gzip import GZipMiddleware
# Add GZip middleware for compression
app.add_middleware(GZipMiddleware, minimum_size=1000)
# Configure CORS
app.add_middleware(
    CORSMiddleware,
    allow_origins=["https://dicta2stream.net", "http://localhost:8000", "http://127.0.0.1:8000"],
    allow_credentials=True,
-    allow_methods=["*"],
+    allow_methods=["GET", "POST", "OPTIONS"],
    allow_headers=["*"],
+    expose_headers=["Content-Type", "Content-Length", "Cache-Control", "ETag", "Last-Modified"],
+    max_age=3600, # 1 hour
)
from fastapi.staticfiles import StaticFiles
@ -69,9 +90,32 @@ def get_current_user(request: Request, db: Session = Depends(get_db)):
from range_response import range_response
@app.get("/audio/{uid}/{filename}")
-def get_audio(uid: str, filename: str, request: Request, db: Session = Depends(get_db)):
+def get_audio(uid: str, filename: str, request: Request):
    # Allow public access ONLY to stream.opus
-    user_dir = os.path.join("data", uid)
# Use the database session context manager
with get_db() as db:
try:
# Use email-based UID directly for file system access
# If UID contains @, it's an email - use it directly
if '@' in uid:
from models import User
user = db.query(User).filter(User.email == uid).first()
if not user:
raise HTTPException(status_code=404, detail="User not found")
filesystem_uid = uid # Use email directly for directory
else:
# Legacy support for username-based UIDs - convert to email
from models import User
user = db.query(User).filter(User.username == uid).first()
if not user:
raise HTTPException(status_code=404, detail="User not found")
filesystem_uid = user.email # Convert username to email for directory
except Exception as e:
db.rollback()
raise HTTPException(status_code=500, detail=f"Database error: {str(e)}")
user_dir = os.path.join("data", filesystem_uid)
    file_path = os.path.join(user_dir, filename)
    real_user_dir = os.path.realpath(user_dir)
    real_file_path = os.path.realpath(file_path)
@ -93,7 +137,8 @@ def get_audio(uid: str, filename: str, request: Request, db: Session = Depends(g
    return FileResponse(real_file_path, media_type="audio/ogg")
if debug_mode:
-    print("[DEBUG] FastAPI running in debug mode.")
+    # Debug messages disabled
+    pass
# Global error handler to always return JSON
from slowapi.errors import RateLimitExceeded
@ -115,26 +160,146 @@ async def validation_exception_handler(request: FastAPIRequest, exc: RequestVali
async def generic_exception_handler(request: FastAPIRequest, exc: Exception):
    return JSONResponse(status_code=500, content={"detail": str(exc)})
# Debug endpoint to list all routes
@app.get("/debug/routes")
async def list_routes():
routes = []
for route in app.routes:
if hasattr(route, "methods") and hasattr(route, "path"):
routes.append({
"path": route.path,
"methods": list(route.methods) if hasattr(route, "methods") else [],
"name": route.name if hasattr(route, "name") else "",
"endpoint": str(route.endpoint) if hasattr(route, "endpoint") else "",
"router": str(route) # Add router info for debugging
})
# Sort routes by path for easier reading
routes.sort(key=lambda x: x["path"])
# Also print to console for server logs
print("\n=== Registered Routes ===")
for route in routes:
print(f"{', '.join(route['methods']).ljust(20)} {route['path']}")
print("======================\n")
return {"routes": routes}
# include routers from submodules
from register import router as register_router
from magic import router as magic_router
from upload import router as upload_router
from streams import router as streams_router
-from list_user_files import router as list_user_files_router
+from auth_router import router as auth_router
app.include_router(streams_router)
from list_streams import router as list_streams_router
+from account_router import router as account_router
# Include all routers
app.include_router(auth_router, prefix="/api")
app.include_router(account_router)
app.include_router(register_router)
app.include_router(magic_router)
app.include_router(upload_router)
-app.include_router(list_user_files_router)
app.include_router(list_streams_router)
@app.get("/user-files/{uid}")
async def list_user_files(uid: str):
from pathlib import Path
# Get the user's directory and check for files first
user_dir = Path("data") / uid
if not user_dir.exists() or not user_dir.is_dir():
return {"files": []}
# Get all files that actually exist on disk
existing_files = {f.name for f in user_dir.iterdir() if f.is_file()}
# Use the database session context manager for all database operations
with get_db() as db:
# Verify the user exists
user_check = db.query(User).filter((User.username == uid) | (User.email == uid)).first()
if not user_check:
raise HTTPException(status_code=404, detail="User not found")
# Query the UploadLog table for this user
all_upload_logs = db.query(UploadLog).filter(
UploadLog.uid == uid
).order_by(UploadLog.created_at.desc()).all()
# Track processed files to avoid duplicates
processed_files = set()
files_metadata = []
for log in all_upload_logs:
# Skip if no processed filename
if not log.processed_filename:
continue
# Skip if we've already processed this file
if log.processed_filename in processed_files:
continue
# Skip stream.opus from uploads list (it's a special file)
if log.processed_filename == 'stream.opus':
continue
# Skip if file doesn't exist on disk
# Files are stored with the pattern: {upload_id}_{processed_filename}
expected_filename = f"{log.id}_{log.processed_filename}"
if expected_filename not in existing_files:
# Only delete records older than 5 minutes to avoid race conditions
from datetime import datetime, timedelta
cutoff_time = datetime.utcnow() - timedelta(minutes=5)
if log.created_at < cutoff_time:
print(f"[CLEANUP] Removing orphaned DB record (older than 5min): {expected_filename}")
db.delete(log)
continue
# Add to processed files to avoid duplicates
processed_files.add(log.processed_filename)
# Always use the original filename if present
display_name = log.filename if log.filename else log.processed_filename
# Only include files that exist on disk
# Files are stored with the pattern: {upload_id}_{processed_filename}
stored_filename = f"{log.id}_{log.processed_filename}"
file_path = user_dir / stored_filename
if file_path.exists() and file_path.is_file():
try:
# Get the actual file size in case it changed
actual_size = file_path.stat().st_size
files_metadata.append({
"original_name": display_name,
"stored_name": log.processed_filename,
"size": actual_size
})
except OSError:
# If we can't access the file, skip it
continue
# Commit any database changes (deletions of non-existent files)
try:
db.commit()
except Exception as e:
print(f"[ERROR] Failed to commit database changes: {e}")
db.rollback()
return {"files": files_metadata}
# Serve static files
app.mount("/static", StaticFiles(directory="static"), name="static")
# Serve audio files
os.makedirs("data", exist_ok=True) # Ensure the data directory exists
app.mount("/audio", StaticFiles(directory="data"), name="audio")
@app.post("/log-client") @app.post("/log-client")
async def log_client(request: Request): async def log_client(request: Request):
try: try:
@ -190,9 +355,9 @@ def serve_me():
@app.get("/admin/stats") @app.get("/admin/stats")
def admin_stats(request: Request, db: Session = Depends(get_db)): def admin_stats(request: Request, db: Session = Depends(get_db)):
from sqlmodel import select from sqlmodel import select
users = db.exec(select(User)).all() users = db.query(User).all()
users_count = len(users) users_count = len(users)
total_quota = db.exec(select(UserQuota)).all() total_quota = db.query(UserQuota).all()
total_quota_sum = sum(q.storage_bytes for q in total_quota) total_quota_sum = sum(q.storage_bytes for q in total_quota)
violations_log = 0 violations_log = 0
try: try:
@ -224,105 +389,416 @@ def debug(request: Request):
MAX_QUOTA_BYTES = 100 * 1024 * 1024
-@app.post("/delete-account")
-async def delete_account(data: dict, request: Request, db: Session = Depends(get_db)):
+# Delete account endpoint - fallback implementation since account_router.py has loading issues
+@app.post("/api/delete-account")
async def delete_account_fallback(request: Request, db: Session = Depends(get_db)):
try:
# Get request data
data = await request.json()
uid = data.get("uid") uid = data.get("uid")
if not uid: if not uid:
raise HTTPException(status_code=400, detail="Missing UID") raise HTTPException(status_code=400, detail="Missing UID")
ip = request.client.host ip = request.client.host
user = get_user_by_uid(uid) # Debug messages disabled
if not user or user.ip != ip:
# Find user by email or username
user = None
if '@' in uid:
user = db.exec(select(User).where(User.email == uid)).first()
if not user:
user = db.exec(select(User).where(User.username == uid)).first()
# If still not found, check if this UID exists in upload logs and try to find the associated user
if not user:
# Look for upload logs with this UID to find the real user
upload_log = db.exec(select(UploadLog).where(UploadLog.uid == uid)).first()
if upload_log:
# Try to find a user that might be associated with this UID
# Check if there's a user with the same IP or similar identifier
all_users = db.exec(select(User)).all()
for potential_user in all_users:
# Use the first confirmed user as fallback (for orphaned UIDs)
if potential_user.confirmed:
user = potential_user
# Debug messages disabled
break
if not user:
# Debug messages disabled
raise HTTPException(status_code=404, detail="User not found")
if user.ip != ip:
raise HTTPException(status_code=403, detail="Unauthorized: IP address does not match")
# Delete user data from database using the original UID
# The original UID is what's stored in the database records
# Delete upload logs for all possible UIDs (original UID, email, username)
upload_logs_to_delete = []
# Check for upload logs with original UID
upload_logs_original = db.query(UploadLog).filter(UploadLog.uid == uid).all()
if upload_logs_original:
# Debug messages disabled
upload_logs_to_delete.extend(upload_logs_original)
# Check for upload logs with user email
upload_logs_email = db.query(UploadLog).filter(UploadLog.uid == user.email).all()
if upload_logs_email:
# Debug messages disabled
upload_logs_to_delete.extend(upload_logs_email)
# Check for upload logs with username
upload_logs_username = db.query(UploadLog).filter(UploadLog.uid == user.username).all()
if upload_logs_username:
# Debug messages disabled
upload_logs_to_delete.extend(upload_logs_username)
# Delete all found upload log records
for log in upload_logs_to_delete:
try:
db.delete(log)
except Exception as e:
# Debug messages disabled
pass
# Debug messages disabled
# Delete user quota for both the original UID and user email (to cover all cases)
quota_original = db.get(UserQuota, uid)
if quota_original:
# Debug messages disabled
db.delete(quota_original)
quota_email = db.get(UserQuota, user.email)
if quota_email:
# Debug messages disabled
db.delete(quota_email)
# Delete user sessions
sessions = db.query(DBSession).filter(DBSession.user_id == user.username).all()
# Debug messages disabled
for session in sessions:
db.delete(session)
# Delete public stream entries for all possible UIDs
# Use select() instead of get() to find all matching records
public_streams_to_delete = []
# Check for public stream with original UID
public_stream_original = db.query(PublicStream).filter(PublicStream.uid == uid).first()
if public_stream_original:
# Debug messages disabled
public_streams_to_delete.append(public_stream_original)
# Check for public stream with user email
public_stream_email = db.query(PublicStream).filter(PublicStream.uid == user.email).first()
if public_stream_email:
# Debug messages disabled
public_streams_to_delete.append(public_stream_email)
# Check for public stream with username
public_stream_username = db.query(PublicStream).filter(PublicStream.uid == user.username).first()
if public_stream_username:
# Debug messages disabled
public_streams_to_delete.append(public_stream_username)
# Delete all found public stream records
for ps in public_streams_to_delete:
try:
# Debug messages disabled
db.delete(ps)
except Exception as e:
# Debug messages disabled
pass
# Debug messages disabled
# Delete user directory BEFORE deleting user record - check all possible locations
import shutil
# Try to delete directory with UID (email) - current standard
uid_dir = os.path.join('data', uid)
if os.path.exists(uid_dir):
# Debug messages disabled
shutil.rmtree(uid_dir, ignore_errors=True)
# Also try to delete directory with email (in case of different UID formats)
email_dir = os.path.join('data', user.email)
if os.path.exists(email_dir) and email_dir != uid_dir:
# Debug messages disabled
shutil.rmtree(email_dir, ignore_errors=True)
# Also try to delete directory with username (legacy format)
username_dir = os.path.join('data', user.username)
if os.path.exists(username_dir) and username_dir != uid_dir and username_dir != email_dir:
# Debug messages disabled
shutil.rmtree(username_dir, ignore_errors=True)
# Delete user account AFTER directory cleanup
db.delete(user)
db.commit()
# Debug messages disabled
return {"status": "success", "message": "Account deleted successfully"}
except HTTPException:
raise
except Exception as e:
# Debug messages disabled
db.rollback()
raise HTTPException(status_code=500, detail=f"Failed to delete account: {str(e)}")
# Cleanup endpoint for orphaned public streams
@app.post("/api/cleanup-streams")
async def cleanup_orphaned_streams(request: Request, db: Session = Depends(get_db)):
try:
# Get request data
data = await request.json()
admin_secret = data.get("admin_secret")
# Verify admin access
if admin_secret != ADMIN_SECRET:
raise HTTPException(status_code=403, detail="Unauthorized") raise HTTPException(status_code=403, detail="Unauthorized")
# Delete user quota and user using ORM # Find orphaned public streams (streams without corresponding user accounts)
quota = db.get(UserQuota, uid) all_streams = db.query(PublicStream).all()
if quota: all_users = db.query(User).all()
db.delete(quota)
user_obj = db.get(User, user.email)
if user_obj:
db.delete(user_obj)
db.commit()
import shutil # Create sets of valid UIDs from user accounts
user_dir = os.path.join('data', user.username) valid_uids = set()
real_user_dir = os.path.realpath(user_dir) for user in all_users:
if not real_user_dir.startswith(os.path.realpath('data')): valid_uids.add(user.email)
raise HTTPException(status_code=400, detail="Invalid user directory") valid_uids.add(user.username)
if os.path.exists(real_user_dir):
shutil.rmtree(real_user_dir, ignore_errors=True)
return {"message": "User deleted"} orphaned_streams = []
for stream in all_streams:
from fastapi.concurrency import run_in_threadpool if stream.uid not in valid_uids:
# from detect_content_type_whisper_ollama import detect_content_type_whisper_ollama # Broken import: module not found orphaned_streams.append(stream)
content_type = None
if content_type in ["music", "singing"]:
os.remove(raw_path)
log_violation("UPLOAD", ip, uid, f"Rejected content: {content_type}")
return JSONResponse(status_code=403, content={"error": f"{content_type.capitalize()} uploads are not allowed."})
# Delete orphaned streams
deleted_count = 0
for stream in orphaned_streams:
try: try:
subprocess.run([ print(f"[CLEANUP] Deleting orphaned stream: {stream.uid} (username: {stream.username})")
"ffmpeg", "-y", "-i", raw_path, db.delete(stream)
"-ac", "1", "-ar", "48000", deleted_count += 1
"-c:a", "libopus", "-b:a", "60k",
final_path
], check=True)
except subprocess.CalledProcessError as e:
os.remove(raw_path)
log_violation("FFMPEG", ip, uid, f"ffmpeg failed: {e}")
raise HTTPException(status_code=500, detail="Encoding failed")
os.remove(raw_path)
try:
actual_bytes = int(subprocess.check_output(["du", "-sb", user_dir]).split()[0])
q = db.get(UserQuota, uid)
if q:
q.storage_bytes = actual_bytes
db.add(q)
db.commit()
except Exception as e: except Exception as e:
log_violation("QUOTA", ip, uid, f"Quota update failed: {e}") print(f"[CLEANUP] Error deleting stream {stream.uid}: {e}")
return {} db.commit()
print(f"[CLEANUP] Deleted {deleted_count} orphaned public streams")
return {
"status": "success",
"message": f"Deleted {deleted_count} orphaned public streams",
"deleted_streams": [s.uid for s in orphaned_streams]
}
except HTTPException:
raise
except Exception as e:
print(f"[CLEANUP] Error: {str(e)}")
db.rollback()
raise HTTPException(status_code=500, detail=f"Cleanup failed: {str(e)}")
# Original delete account endpoint has been moved to account_router.py
@app.delete("/uploads/{uid}/{filename}") @app.delete("/uploads/{uid}/{filename}")
def delete_file(uid: str, filename: str, request: Request, db: Session = Depends(get_db)): async def delete_file(uid: str, filename: str, request: Request):
"""
Delete a file for a specific user.
Args:
uid: The username of the user (used as UID in routes)
filename: The name of the file to delete
request: The incoming request object
db: Database session
Returns:
Dict with status message
"""
try:
# Get the user by username (which is used as UID in routes)
user = get_user_by_uid(uid) user = get_user_by_uid(uid)
if not user: if not user:
raise HTTPException(status_code=403, detail="Invalid user ID") raise HTTPException(status_code=404, detail="User not found")
# Get client IP and verify it matches the user's IP
ip = request.client.host ip = request.client.host
if user.ip != ip: if user.ip != ip:
raise HTTPException(status_code=403, detail="Device/IP mismatch") raise HTTPException(status_code=403, detail="Device/IP mismatch. Please log in again.")
user_dir = os.path.join('data', user.username) # Set up user directory using email (matching upload logic)
user_dir = os.path.join('data', user.email)
os.makedirs(user_dir, exist_ok=True)
# Decode URL-encoded filename
from urllib.parse import unquote
filename = unquote(filename)
# Debug: Print the user info and filename being used
# Debug messages disabled
# Debug messages disabled
# Debug messages disabled
# Debug messages disabled
if os.path.exists(user_dir):
# Debug messages disabled
pass
# Construct and validate target path
target_path = os.path.join(user_dir, filename) target_path = os.path.join(user_dir, filename)
# Prevent path traversal attacks
real_target_path = os.path.realpath(target_path) real_target_path = os.path.realpath(target_path)
real_user_dir = os.path.realpath(user_dir) real_user_dir = os.path.realpath(user_dir)
# Debug: Print the constructed paths
# Debug messages disabled
# Debug messages disabled
# Debug messages disabled
# Security check: Ensure the target path is inside the user's directory
if not real_target_path.startswith(real_user_dir + os.sep):
# Debug messages disabled
raise HTTPException(status_code=403, detail="Invalid file path")
# Check if file exists
if not os.path.isfile(real_target_path):
# Debug: List files in the directory to help diagnose the issue
try:
# Debug messages disabled
# Debug messages disabled
# Debug messages disabled
if os.path.exists(real_user_dir):
files_in_dir = os.listdir(real_user_dir)
# Debug messages disabled
# Print detailed file info
for f in files_in_dir:
full_path = os.path.join(real_user_dir, f)
try:
# Debug messages disabled
pass
except Exception as e:
# Debug messages disabled
pass
# Debug messages disabled
# Debug messages disabled
# Debug messages disabled
# Try to find a matching file (case-insensitive, partial match)
matching_files = [f for f in files_in_dir if filename.lower() in f.lower()]
if matching_files:
# Debug messages disabled
# Use the first matching file
real_target_path = os.path.join(real_user_dir, matching_files[0])
# Debug messages disabled
# Debug messages disabled
else:
# Debug messages disabled
raise HTTPException(status_code=404, detail=f"File not found: {filename}")
else:
# Debug messages disabled
raise HTTPException(status_code=404, detail=f"User directory not found")
except HTTPException:
raise
except Exception as e:
# Debug messages disabled
raise HTTPException(status_code=404, detail=f"File not found: {filename}")
# Delete both the target file and its UUID-only variant
deleted_files = []
try:
# First delete the requested file (with log ID prefix)
if os.path.exists(real_target_path):
os.remove(real_target_path)
deleted_files.append(filename)
log_violation("DELETE", ip, uid, f"Deleted {filename}")
# Then try to find and delete the UUID-only variant (without log ID prefix)
if '_' in filename: # If filename has a log ID prefix (e.g., "123_uuid.opus")
uuid_part = filename.split('_', 1)[1] # Get the part after the first underscore
uuid_path = os.path.join(user_dir, uuid_part)
if os.path.exists(uuid_path):
os.remove(uuid_path)
deleted_files.append(uuid_part)
log_violation("DELETE", ip, uid, f"Deleted UUID variant: {uuid_part}")
file_deleted = len(deleted_files) > 0
if not file_deleted:
log_violation("DELETE_WARNING", ip, uid, f"No files found to delete for: {filename}")
except Exception as e:
log_violation("DELETE_ERROR", ip, uid, f"Error deleting file {filename}: {str(e)}")
file_deleted = False
# Try to refresh the user's playlist, but don't fail if we can't
try:
subprocess.run(["/root/scripts/refresh_user_playlist.sh", user.username],
check=False, stderr=subprocess.DEVNULL, stdout=subprocess.DEVNULL)
except Exception as e:
log_violation("PLAYLIST_REFRESH_WARNING", ip, uid,
f"Failed to refresh playlist: {str(e)}")
# Clean up the database record for this file
try:
with get_db() as db:
try:
# Find and delete the upload log entry
log_entry = db.query(UploadLog).filter(
UploadLog.uid == uid,
UploadLog.processed_filename == filename
).first()
if log_entry:
db.delete(log_entry)
db.commit()
log_violation("DB_CLEANUP", ip, uid, f"Removed DB record for {filename}")
except Exception as e:
db.rollback()
raise e
except Exception as e:
log_violation("DB_CLEANUP_ERROR", ip, uid, f"Failed to clean up DB record: {str(e)}")
# Regenerate stream.opus after file deletion
try:
from concat_opus import concat_opus_files
from pathlib import Path
user_dir_path = Path(user_dir)
stream_path = user_dir_path / "stream.opus"
concat_opus_files(user_dir_path, stream_path)
log_violation("STREAM_UPDATE", ip, uid, "Regenerated stream.opus after file deletion")
except Exception as e:
log_violation("STREAM_UPDATE_ERROR", ip, uid, f"Failed to regenerate stream.opus: {str(e)}")
# Update user quota in a separate try-except to not fail the entire operation
try:
with get_db() as db:
try:
# Use verify_and_fix_quota to ensure consistency between disk and DB
total_size = verify_and_fix_quota(db, user.username, user_dir)
log_violation("QUOTA_UPDATE", ip, uid,
f"Updated quota: {total_size} bytes")
except Exception as e:
db.rollback()
raise e
except Exception as e:
log_violation("QUOTA_ERROR", ip, uid, f"Quota update failed: {str(e)}")
return {"status": "deleted"} return {"status": "deleted"}
except Exception as e:
# Log the error and re-raise with a user-friendly message
error_detail = str(e)
log_violation("DELETE_ERROR", request.client.host, uid, f"Failed to delete {filename}: {error_detail}")
if not isinstance(e, HTTPException):
raise HTTPException(status_code=500, detail=f"Failed to delete file: {error_detail}")
raise
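For reference, a minimal client-side sketch of the delete contract above; this is an assumption-laden illustration, not code from the diff: the base URL is taken from MAGIC_DOMAIN used elsewhere in this change, the uid path segment is the account identifier accepted by get_user_by_uid, the request must come from the registered IP, and the filename is a hypothetical example.
import requests
from urllib.parse import quote

BASE_URL = "https://dicta2stream.net"  # assumed deployment URL

def delete_upload(uid: str, filename: str) -> dict:
    """Call DELETE /uploads/{uid}/{filename}; the server re-checks IP and path safety."""
    url = f"{BASE_URL}/uploads/{quote(uid)}/{quote(filename)}"
    resp = requests.delete(url, timeout=30)
    resp.raise_for_status()  # surfaces the endpoint's 403/404/500 responses
    return resp.json()       # expected shape: {"status": "deleted"}

# Example (hypothetical filename with the log-ID prefix described above):
# delete_upload("oib@example.net", "123_0f2c8c7e.opus")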
@app.get("/confirm/{uid}") @app.get("/confirm/{uid}")
def confirm_user(uid: str, request: Request): def confirm_user(uid: str, request: Request):
ip = request.client.host ip = request.client.host
@ -331,26 +807,169 @@ def confirm_user(uid: str, request: Request):
raise HTTPException(status_code=403, detail="Unauthorized") raise HTTPException(status_code=403, detail="Unauthorized")
return {"username": user.username, "email": user.email} return {"username": user.username, "email": user.email}
def verify_and_fix_quota(db: Session, uid: str, user_dir: str) -> int:
"""
Verify and fix the user's quota based on the size of stream.opus file.
Returns the size of stream.opus in bytes.
"""
stream_opus_path = os.path.join(user_dir, 'stream.opus')
total_size = 0
# Only consider stream.opus for quota
if os.path.isfile(stream_opus_path):
try:
total_size = os.path.getsize(stream_opus_path)
# Debug messages disabled
except (OSError, FileNotFoundError) as e:
# Debug messages disabled
pass
else:
# Debug messages disabled
pass
# Update quota in database
q = db.get(UserQuota, uid) or UserQuota(uid=uid, storage_bytes=0)
q.storage_bytes = total_size
db.add(q)
# Clean up any database records for files that don't exist
# BUT only for records older than 5 minutes to avoid race conditions with recent uploads
from datetime import datetime, timedelta
cutoff_time = datetime.utcnow() - timedelta(minutes=5)
uploads = db.query(UploadLog).filter(
UploadLog.uid == uid,
UploadLog.created_at < cutoff_time # Only check older records
).all()
for upload in uploads:
if upload.processed_filename: # Only check if processed_filename exists
stored_filename = f"{upload.id}_{upload.processed_filename}"
file_path = os.path.join(user_dir, stored_filename)
if not os.path.isfile(file_path):
# Debug messages disabled
db.delete(upload)
try:
db.commit()
# Debug messages disabled
except Exception as e:
# Debug messages disabled
db.rollback()
raise
return total_size
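A short usage sketch of the reconciliation helper above, assuming the get_db() context manager and the data/<uid> directory layout used elsewhere in this file; the UID value in the trailing comment is a placeholder.
import os

def reconcile_quota_for(uid: str) -> float:
    """Recompute a user's stored quota from stream.opus and return it in MB (sketch)."""
    user_dir = os.path.join("data", uid)
    with get_db() as db:  # session context manager used throughout this module
        total_bytes = verify_and_fix_quota(db, uid, user_dir)
        db.commit()
    return round(total_bytes / (1024 * 1024), 2)

# Example (placeholder UID): reconcile_quota_for("oib@example.net")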
@app.get("/me/{uid}") @app.get("/me/{uid}")
def get_me(uid: str, request: Request, db: Session = Depends(get_db)): def get_me(uid: str, request: Request, response: Response):
ip = request.client.host # Add headers to prevent caching
user = get_user_by_uid(uid) response.headers["Cache-Control"] = "no-cache, no-store, must-revalidate"
if not user or user.ip != ip: response.headers["Pragma"] = "no-cache"
raise HTTPException(status_code=403, detail="Unauthorized access") response.headers["Expires"] = "0"
user_dir = os.path.join('data', user.username) # Debug messages disabled
# Use the database session context manager for all database operations
with get_db() as db:
try:
# Get user info
user = db.query(User).filter((User.username == uid) | (User.email == uid)).first()
if not user:
print(f"[ERROR] User with UID {uid} not found")
raise HTTPException(status_code=404, detail="User not found")
# Only enforce IP check in production
if not debug_mode:
if user.ip != request.client.host:
print(f"[WARNING] IP mismatch for UID {uid}: {request.client.host} != {user.ip}")
# In production, we might want to be more strict
if not debug_mode:
raise HTTPException(status_code=403, detail="IP address mismatch")
# Get user directory
user_dir = os.path.join('data', uid)
os.makedirs(user_dir, exist_ok=True)
# Get all upload logs for this user using the query interface
upload_logs = db.query(UploadLog).filter(
UploadLog.uid == uid
).order_by(UploadLog.created_at.desc()).all()
# Debug messages disabled
# Build file list from database records, checking if files exist on disk
files = []
seen_files = set()  # Track seen files to avoid duplicates
# Debug messages disabled
for i, log in enumerate(upload_logs):
if not log.filename or not log.processed_filename:
# Debug messages disabled
continue
# The actual filename on disk has the log ID prepended
stored_filename = f"{log.id}_{log.processed_filename}"
file_path = os.path.join(user_dir, stored_filename)
# Skip if we've already seen this file
if stored_filename in seen_files:
# Debug messages disabled
continue
seen_files.add(stored_filename)
# Only include the file if it exists on disk and is not stream.opus
if os.path.isfile(file_path) and stored_filename != 'stream.opus':
try:
# Get the actual file size in case it changed
file_size = os.path.getsize(file_path)
file_info = {
"name": stored_filename,
"original_name": log.filename,
"size": file_size
}
files.append(file_info)
# Debug messages disabled
except OSError as e:
print(f"[WARNING] Could not access file {stored_filename}: {e}")
else:
# Debug messages disabled
pass
# Log all files being returned
# Debug messages disabled
# for i, file_info in enumerate(files, 1):
# print(f" {i}. {file_info['name']} (original: {file_info['original_name']}, size: {file_info['size']} bytes)")
# Verify and fix quota based on actual files on disk
total_size = verify_and_fix_quota(db, uid, user_dir)
quota_mb = round(total_size / (1024 * 1024), 2)
max_quota_mb = round(MAX_QUOTA_BYTES / (1024 * 1024), 2)
# Debug messages disabled
response_data = {
"files": files,
"quota": {
"used": quota_mb,
"max": max_quota_mb,
"used_bytes": total_size,
"max_bytes": MAX_QUOTA_BYTES,
"percentage": round((total_size / MAX_QUOTA_BYTES) * 100, 2) if MAX_QUOTA_BYTES > 0 else 0
}
}
# Debug messages disabled
return response_data
except HTTPException:
# Re-raise HTTP exceptions as they are
raise
except Exception as e:
# Log the full traceback for debugging
import traceback
error_trace = traceback.format_exc()
print(f"[ERROR] Error in /me/{uid} endpoint: {str(e)}\n{error_trace}")
# Rollback any database changes in case of error
db.rollback()
# Return a 500 error with a generic message
raise HTTPException(status_code=500, detail="Internal server error")

middleware.py Normal file

@ -0,0 +1,73 @@
"""Custom middleware for the dicta2stream application"""
import time
from fastapi import Request, HTTPException
from starlette.middleware.base import BaseHTTPMiddleware, RequestResponseEndpoint
from starlette.responses import Response
from starlette.types import ASGIApp
class RateLimitMiddleware(BaseHTTPMiddleware):
"""Middleware to implement rate limiting"""
def __init__(self, app: ASGIApp, limit: int = 100, window: int = 60):
super().__init__(app)
self.limit = limit
self.window = window
self.requests = {}
async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -> Response:
# Get client IP
if "x-forwarded-for" in request.headers:
ip = request.headers["x-forwarded-for"].split(",")[0]
else:
ip = request.client.host or "unknown"
# Get current timestamp
current_time = int(time.time())
# Clean up old entries
self.requests = {
k: v
for k, v in self.requests.items()
if current_time - v["timestamp"] < self.window
}
# Check rate limit
if ip in self.requests:
self.requests[ip]["count"] += 1
if self.requests[ip]["count"] > self.limit:
raise HTTPException(
status_code=429,
detail="Too many requests. Please try again later."
)
else:
self.requests[ip] = {"count": 1, "timestamp": current_time}
# Process the request
response = await call_next(request)
return response
class SecurityHeadersMiddleware(BaseHTTPMiddleware):
"""Middleware to add security headers to responses"""
async def dispatch(self, request: Request, call_next):
response = await call_next(request)
# Add security headers
response.headers["X-Content-Type-Options"] = "nosniff"
response.headers["X-Frame-Options"] = "DENY"
response.headers["X-XSS-Protection"] = "1; mode=block"
response.headers["Referrer-Policy"] = "strict-origin-when-cross-origin"
# Content Security Policy
csp_parts = [
"default-src 'self'",
"script-src 'self' 'unsafe-inline'",
"style-src 'self' 'unsafe-inline'",
"img-src 'self' data:",
"media-src 'self' blob: data:",
"connect-src 'self' https: wss:",
"frame-ancestors 'none'"
]
response.headers["Content-Security-Policy"] = "; ".join(csp_parts)
return response
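Neither middleware is wired up in this file itself; below is a minimal sketch of how they might be registered on the FastAPI app. The app object, the registration order, and the limit/window values are assumptions, not taken from this diff.
from fastapi import FastAPI
from middleware import RateLimitMiddleware, SecurityHeadersMiddleware

app = FastAPI()

# Registration order is a design choice here, not mandated by the diff.
app.add_middleware(SecurityHeadersMiddleware)
app.add_middleware(RateLimitMiddleware, limit=100, window=60)  # ~100 requests per 60s per client IP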

migrate_dbsession_fk.sql Normal file

@ -0,0 +1,13 @@
-- Migration script to update DBSession foreign key to reference user.email
-- Run this when no active sessions exist to avoid deadlocks
BEGIN;
-- Step 1: Drop the existing foreign key constraint
ALTER TABLE dbsession DROP CONSTRAINT IF EXISTS dbsession_user_id_fkey;
-- Step 2: Add the new foreign key constraint referencing user.email
ALTER TABLE dbsession ADD CONSTRAINT dbsession_user_id_fkey
FOREIGN KEY (user_id) REFERENCES "user"(email);
COMMIT;
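Since the header comment asks for no active sessions, a cautious way to apply the script is to check the dbsession table first and only then run the file. This is a sketch using psycopg2 (already used by the migration scripts in this change); the DSN is the one those scripts hard-code and the file path is an assumption.
import sys
import psycopg2

DSN = "postgresql://d2s:kuTy4ZKs2VcjgDh6@localhost:5432/dictastream"  # same DSN as the other migration scripts

def apply_fk_migration(sql_path: str = "migrate_dbsession_fk.sql") -> None:
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        # Refuse to run while sessions are still active, as the header comment requires.
        cur.execute("SELECT COUNT(*) FROM dbsession WHERE is_active = true AND expires_at > NOW()")
        (active,) = cur.fetchone()
        if active:
            sys.exit(f"{active} active session(s) found; aborting migration")
        with open(sql_path) as f:
            cur.execute(f.read())  # the file wraps its statements in BEGIN/COMMIT

if __name__ == "__main__":
    apply_fk_migration()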

migrate_uid_to_email.py Normal file

@ -0,0 +1,168 @@
#!/usr/bin/env python3
"""
UID Migration Script - Complete migration from username-based to email-based UIDs
This script completes the UID migration by updating remaining username-based UIDs
in the database to use proper email format.
Based on previous migration history:
- devuser -> oib@bubuit.net (as per migration memory)
- oibchello -> oib@chello.at (already completed)
"""
import psycopg2
import sys
from datetime import datetime
# Database connection string
DATABASE_URL = "postgresql://d2s:kuTy4ZKs2VcjgDh6@localhost:5432/dictastream"
def log_message(message):
"""Log message with timestamp"""
timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
print(f"[{timestamp}] {message}")
def check_current_state(cursor):
"""Check current state of UID migration"""
log_message("Checking current UID state...")
# Check publicstream table
cursor.execute("SELECT uid, username FROM publicstream WHERE uid NOT LIKE '%@%'")
non_email_uids = cursor.fetchall()
if non_email_uids:
log_message(f"Found {len(non_email_uids)} non-email UIDs in publicstream:")
for uid, username in non_email_uids:
log_message(f" - UID: {uid}, Username: {username}")
else:
log_message("All UIDs in publicstream are already in email format")
# Check userquota table
cursor.execute("SELECT uid FROM userquota WHERE uid NOT LIKE '%@%'")
quota_non_email_uids = cursor.fetchall()
if quota_non_email_uids:
log_message(f"Found {len(quota_non_email_uids)} non-email UIDs in userquota:")
for (uid,) in quota_non_email_uids:
log_message(f" - UID: {uid}")
else:
log_message("All UIDs in userquota are already in email format")
return non_email_uids, quota_non_email_uids
def migrate_uids(cursor):
"""Migrate remaining username-based UIDs to email format"""
log_message("Starting UID migration...")
# Migration mapping based on previous migration history
uid_mapping = {
'devuser': 'oib@bubuit.net'
}
migration_count = 0
for old_uid, new_uid in uid_mapping.items():
log_message(f"Migrating UID: {old_uid} -> {new_uid}")
# Update publicstream table
cursor.execute(
"UPDATE publicstream SET uid = %s WHERE uid = %s",
(new_uid, old_uid)
)
publicstream_updated = cursor.rowcount
# Update userquota table
cursor.execute(
"UPDATE userquota SET uid = %s WHERE uid = %s",
(new_uid, old_uid)
)
userquota_updated = cursor.rowcount
# Update uploadlog table (if any records exist)
cursor.execute(
"UPDATE uploadlog SET uid = %s WHERE uid = %s",
(new_uid, old_uid)
)
uploadlog_updated = cursor.rowcount
log_message(f" - Updated {publicstream_updated} records in publicstream")
log_message(f" - Updated {userquota_updated} records in userquota")
log_message(f" - Updated {uploadlog_updated} records in uploadlog")
migration_count += publicstream_updated + userquota_updated + uploadlog_updated
return migration_count
def verify_migration(cursor):
"""Verify migration was successful"""
log_message("Verifying migration...")
# Check for any remaining non-email UIDs
cursor.execute("""
SELECT 'publicstream' as table_name, uid FROM publicstream WHERE uid NOT LIKE '%@%'
UNION ALL
SELECT 'userquota' as table_name, uid FROM userquota WHERE uid NOT LIKE '%@%'
UNION ALL
SELECT 'uploadlog' as table_name, uid FROM uploadlog WHERE uid NOT LIKE '%@%'
""")
remaining_non_email = cursor.fetchall()
if remaining_non_email:
log_message("WARNING: Found remaining non-email UIDs:")
for table_name, uid in remaining_non_email:
log_message(f" - {table_name}: {uid}")
return False
else:
log_message("SUCCESS: All UIDs are now in email format")
return True
def main():
"""Main migration function"""
log_message("Starting UID migration script")
conn = None
cursor = None
try:
# Connect to database
log_message("Connecting to database...")
conn = psycopg2.connect(DATABASE_URL)
cursor = conn.cursor()
# Check current state
non_email_uids, quota_non_email_uids = check_current_state(cursor)
if not non_email_uids and not quota_non_email_uids:
log_message("No migration needed - all UIDs are already in email format")
return
# Perform migration
migration_count = migrate_uids(cursor)
# Commit changes
conn.commit()
log_message(f"Migration committed - {migration_count} records updated")
# Verify migration
if verify_migration(cursor):
log_message("UID migration completed successfully!")
else:
log_message("UID migration completed with warnings - manual review needed")
except psycopg2.Error as e:
log_message(f"Database error: {e}")
if conn:
conn.rollback()
sys.exit(1)
except Exception as e:
log_message(f"Unexpected error: {e}")
if conn:
conn.rollback()
sys.exit(1)
finally:
if cursor:
cursor.close()
if conn:
conn.close()
log_message("Database connection closed")
if __name__ == "__main__":
main()


@ -8,7 +8,7 @@ from database import engine
class User(SQLModel, table=True):
token_created: datetime = Field(default_factory=datetime.utcnow)
email: str = Field(primary_key=True)
username: str = Field(unique=True, index=True)
token: str
confirmed: bool = False
ip: str = Field(default="")
@ -23,13 +23,83 @@ class UploadLog(SQLModel, table=True):
id: Optional[int] = Field(default=None, primary_key=True)
uid: str
ip: str
filename: Optional[str]  # Original filename
processed_filename: Optional[str]  # Processed filename (UUID.opus)
size_bytes: int
created_at: datetime = Field(default_factory=datetime.utcnow)
class DBSession(SQLModel, table=True):
token: str = Field(primary_key=True)
uid: str = Field(foreign_key="user.email") # This references User.email (primary key)
ip_address: str
user_agent: str
created_at: datetime = Field(default_factory=datetime.utcnow)
expires_at: datetime
is_active: bool = True
last_activity: datetime = Field(default_factory=datetime.utcnow)
class PublicStream(SQLModel, table=True):
"""Stores public stream metadata for all users"""
uid: str = Field(primary_key=True)
username: Optional[str] = Field(default=None, index=True)
storage_bytes: int = 0
mtime: int = Field(default_factory=lambda: int(datetime.utcnow().timestamp()))
last_updated: Optional[datetime] = Field(default_factory=datetime.utcnow)
created_at: datetime = Field(default_factory=datetime.utcnow)
updated_at: datetime = Field(default_factory=datetime.utcnow)
def get_user_by_uid(uid: str) -> Optional[User]:
"""
Retrieve a user by their UID (email).
Note: In this application, UIDs are consistently email-based.
The User model uses email as primary key, and all user references
throughout the system use email format.
Args:
uid: The email to look up
Returns:
User object if found, None otherwise
"""
with Session(engine) as session:
# Primary lookup by email (which is what we're using as UID)
statement = select(User).where(User.email == uid)
user = session.exec(statement).first()
# Fallback: try by username for legacy compatibility
if not user and '@' not in uid:
statement = select(User).where(User.username == uid)
user = session.exec(statement).first()
return user
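A brief illustration of the lookup behaviour documented above (email first, legacy username fallback); the email value is a placeholder, while "devuser" is the legacy username referenced by the migration scripts.
# Email-format UID resolves directly against the primary key:
user = get_user_by_uid("oib@example.net")   # placeholder email

# A bare username (no "@") falls back to the username column for legacy callers:
legacy = get_user_by_uid("devuser")

for u in (user, legacy):
    if u is not None:
        print(u.username, u.email)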
def verify_session(db: Session, token: str) -> DBSession:
"""Verify a session token and return the session if valid"""
from datetime import datetime
# Find the session
session = db.query(DBSession).filter(
DBSession.token == token,
DBSession.is_active == True, # noqa: E712
DBSession.expires_at > datetime.utcnow()
).first()
if not session:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Invalid or expired session",
headers={"WWW-Authenticate": "Bearer"},
)
# Update last activity
session.last_activity = datetime.utcnow()
db.add(session)
db.commit()
db.refresh(session)
return session
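verify_session is written as a plain helper; below is a minimal sketch of exposing it as a FastAPI dependency that reads an Authorization: Bearer token. The header handling, the require_session name, and the use of the get_db context manager are assumptions, not part of this diff.
from fastapi import Depends, HTTPException, Request, status
from database import get_db  # session context manager used elsewhere in this project

def require_session(request: Request) -> DBSession:
    """Resolve the current DBSession from an Authorization: Bearer <token> header (sketch)."""
    auth = request.headers.get("authorization", "")
    if not auth.lower().startswith("bearer "):
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Missing bearer token")
    token = auth.split(" ", 1)[1].strip()
    with get_db() as db:
        return verify_session(db, token)

# Usage in an endpoint (sketch):
# @app.get("/protected")
# def protected(session: DBSession = Depends(require_session)):
#     return {"uid": session.uid}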


@ -1 +0,0 @@
{"uid":"devuser","size":22455090,"mtime":1747563720}


@ -7,69 +7,146 @@ from database import get_db
import uuid
import smtplib
from email.message import EmailMessage
from pathlib import Path
import os
router = APIRouter()
MAGIC_FROM = "noreply@dicta2stream.net"
MAGIC_DOMAIN = "https://dicta2stream.net"
DATA_ROOT = Path("./data")
def initialize_user_directory(uid: str):
"""Initialize user directory with a silent stream.opus file"""
try:
user_dir = DATA_ROOT / uid
default_stream_path = DATA_ROOT / "stream.opus"
# Debug messages disabled
# Create the directory if it doesn't exist
user_dir.mkdir(parents=True, exist_ok=True)
# Debug messages disabled
# Create stream.opus by copying the default stream.opus file
user_stream_path = user_dir / "stream.opus"
# Debug messages disabled
if not user_stream_path.exists():
if default_stream_path.exists():
import shutil
shutil.copy2(default_stream_path, user_stream_path)
# Debug messages disabled
else:
print(f"[ERROR] Default stream.opus not found at {default_stream_path}")
# Fallback: create an empty file to prevent errors
with open(user_stream_path, 'wb') as f:
f.write(b'')
return True
except Exception as e:
print(f"Error initializing user directory for {uid}: {str(e)}")
return False
@router.post("/register") @router.post("/register")
def register(request: Request, email: str = Form(...), user: str = Form(...), db: Session = Depends(get_db)): def register(request: Request, email: str = Form(...), user: str = Form(...)):
from sqlalchemy.exc import IntegrityError from sqlalchemy.exc import IntegrityError
# Try to find user by email or username
existing_user = db.get(User, email)
if not existing_user:
# Try by username (since username is not primary key, need to query)
stmt = select(User).where(User.username == user)
existing_user = db.exec(stmt).first()
token = str(uuid.uuid4())
if existing_user:
# Update token, timestamp, and ip, set confirmed False
from datetime import datetime from datetime import datetime
existing_user.token = token
existing_user.token_created = datetime.utcnow() # Use the database session context manager
existing_user.confirmed = False with get_db() as db:
existing_user.ip = request.client.host
db.add(existing_user)
try: try:
# Check if user exists by email
existing_user_by_email = db.get(User, email)
# Check if user exists by username
existing_user_by_username = db.query(User).filter(User.username == user).first()
token = str(uuid.uuid4())
action = None
# Case 1: Email and username match in db - it's a login
if existing_user_by_email and existing_user_by_username and existing_user_by_email.email == existing_user_by_username.email:
# Update token for existing user (login)
existing_user_by_email.token = token
existing_user_by_email.token_created = datetime.utcnow()
existing_user_by_email.confirmed = False
existing_user_by_email.ip = request.client.host
db.add(existing_user_by_email)
db.commit()
action = "login"
# Case 2: Email matches but username does not - only one account per email
elif existing_user_by_email and (not existing_user_by_username or existing_user_by_email.email != existing_user_by_username.email):
raise HTTPException(status_code=409, detail="📧 This email is already registered with a different username.\nOnly one account per email is allowed.")
# Case 3: Email does not match but username is in db - username already taken
elif not existing_user_by_email and existing_user_by_username:
raise HTTPException(status_code=409, detail="👤 This username is already taken.\nPlease choose a different username.")
# Case 4: Neither email nor username exist - create new user
elif not existing_user_by_email and not existing_user_by_username:
# Register new user
new_user = User(email=email, username=user, token=token, confirmed=False, ip=request.client.host)
new_quota = UserQuota(uid=email)  # Use email as UID for quota tracking
db.add(new_user)
db.add(new_quota)
db.commit()
action = "register"
# Initialize user directory after successful registration
if not initialize_user_directory(email):
print(f"[WARNING] Failed to initialize user directory for {email}")
# If we get here, we've either logged in or registered successfully
if action not in ["login", "register"]:
raise HTTPException(status_code=400, detail="Invalid registration request")
# Store the email for use after the session is committed
user_email = email
# Only after successful commit, initialize the user directory
initialize_user_directory(email)
except Exception as e:
db.rollback()
if isinstance(e, IntegrityError):
# Race condition: user created after our check
# Check which constraint was violated to provide specific feedback
error_str = str(e).lower()
if 'username' in error_str or 'user_username_key' in error_str:
raise HTTPException(status_code=409, detail="👤 This username is already taken.\nPlease choose a different username.")
elif 'email' in error_str or 'user_pkey' in error_str:
raise HTTPException(status_code=409, detail="📧 This email is already registered with a different username.\nOnly one account per email is allowed.")
else:
# Generic fallback if we can't determine the specific constraint
raise HTTPException(status_code=409, detail="⚠️ Registration failed due to a conflict.\nPlease try again with different credentials.")
else:
raise HTTPException(status_code=500, detail=f"Database error: {e}")
# Send magic link with appropriate message based on action
msg = EmailMessage()
msg["From"] = MAGIC_FROM
msg["To"] = email
if action == "login":
msg["Subject"] = "Your magic login link"
msg.set_content(
f"Hello {user},\n\nClick to log in to your account:\n{MAGIC_DOMAIN}/?token={token}\n\nThis link is valid for one-time login."
)
response_message = "📧 Check your email for a magic login link!"
else: # registration
msg["Subject"] = "Welcome to dicta2stream - Confirm your account"
msg.set_content(
f"Hello {user},\n\nWelcome to dicta2stream! Click to confirm your new account:\n{MAGIC_DOMAIN}/?token={token}\n\nThis link is valid for one-time confirmation."
)
response_message = "🎉 Account created! Check your email for a magic login link!"
try:
with smtplib.SMTP("localhost") as smtp:
smtp.send_message(msg)
except Exception as e:
raise HTTPException(status_code=500, detail=f"Email failed: {e}")
return {"message": response_message, "action": action}

silent.opus Normal file

Binary file not shown.

simple_db_cleanup.py Normal file

@ -0,0 +1,107 @@
#!/usr/bin/env python3
"""
Simple Database Cleanup Script
Uses the provided connection string to fix legacy data issues
"""
import psycopg2
import sys
# Database connection string provided by user
DATABASE_URL = "postgresql://d2s:kuTy4ZKs2VcjgDh6@localhost:5432/dictastream"
def execute_query(conn, query, description):
"""Execute a query and report results"""
print(f"\n{description}")
print(f"Query: {query}")
print("[DEBUG] Starting query execution...")
try:
print("[DEBUG] Creating cursor...")
with conn.cursor() as cur:
print("[DEBUG] Executing query...")
cur.execute(query)
print("[DEBUG] Query executed successfully")
if query.strip().upper().startswith('SELECT'):
print("[DEBUG] Fetching results...")
rows = cur.fetchall()
print(f"Result: {len(rows)} rows")
for row in rows:
print(f" {row}")
else:
print("[DEBUG] Committing transaction...")
conn.commit()
print(f"✅ Success: {cur.rowcount} rows affected")
print("[DEBUG] Query completed successfully")
return True
except Exception as e:
print(f"❌ Error: {e}")
print(f"[DEBUG] Error type: {type(e).__name__}")
print("[DEBUG] Rolling back transaction...")
conn.rollback()
return False
def main():
"""Execute database cleanup step by step"""
print("=== DATABASE LEGACY DATA CLEANUP ===")
print(f"Attempting to connect to: {DATABASE_URL}")
try:
print("[DEBUG] Creating database connection...")
conn = psycopg2.connect(DATABASE_URL)
print("✅ Connected to database successfully")
print(f"[DEBUG] Connection status: {conn.status}")
print(f"[DEBUG] Database info: {conn.get_dsn_parameters()}")
# Step 1: Check current state
print("\n=== STEP 1: Check Current State ===")
execute_query(conn, 'SELECT email, username, display_name FROM "user"', "Check user table")
execute_query(conn, 'SELECT COUNT(*) as expired_active FROM dbsession WHERE expires_at < NOW() AND is_active = true', "Check expired sessions")
# Step 2: Mark expired sessions as inactive (this was successful before)
print("\n=== STEP 2: Fix Expired Sessions ===")
execute_query(conn, 'UPDATE dbsession SET is_active = false WHERE expires_at < NOW() AND is_active = true', "Mark expired sessions inactive")
# Step 3: Handle foreign key constraint by dropping it temporarily
print("\n=== STEP 3: Handle Foreign Key Constraint ===")
execute_query(conn, 'ALTER TABLE dbsession DROP CONSTRAINT IF EXISTS dbsession_user_id_fkey', "Drop foreign key constraint")
# Step 4: Update user table
print("\n=== STEP 4: Update User Table ===")
execute_query(conn, """UPDATE "user"
SET username = email,
display_name = CASE
WHEN display_name = '' OR display_name IS NULL
THEN split_part(email, '@', 1)
ELSE display_name
END
WHERE email = 'oib@chello.at'""", "Update user username to email")
# Step 5: Update session references
print("\n=== STEP 5: Update Session References ===")
execute_query(conn, "UPDATE dbsession SET user_id = 'oib@chello.at' WHERE user_id = 'oibchello'", "Update session user_id")
# Step 6: Recreate foreign key constraint
print("\n=== STEP 6: Recreate Foreign Key ===")
execute_query(conn, 'ALTER TABLE dbsession ADD CONSTRAINT dbsession_user_id_fkey FOREIGN KEY (user_id) REFERENCES "user"(username)', "Recreate foreign key")
# Step 7: Final verification
print("\n=== STEP 7: Final Verification ===")
execute_query(conn, 'SELECT email, username, display_name FROM "user"', "Verify user table")
execute_query(conn, 'SELECT DISTINCT user_id FROM dbsession', "Verify session user_id")
execute_query(conn, 'SELECT uid, username FROM publicstream', "Check publicstream")
print("\n✅ Database cleanup completed successfully!")
except Exception as e:
print(f"❌ Database connection error: {e}")
return 1
finally:
if 'conn' in locals():
conn.close()
return 0
if __name__ == "__main__":
sys.exit(main())


@ -1,414 +1,65 @@
// app.js - Main application entry point
import { initPersonalPlayer } from './personal-player.js';

/**
 * Initializes the primary navigation and routing system.
 * This function sets up event listeners for navigation links and handles hash-based routing.
 */
function initNavigation() {
  const navLinks = document.querySelectorAll('nav a, .dashboard-nav a, .footer-links a');
  const handleNavClick = (e) => {
    const link = e.target.closest('a');
    if (!link) return;
    const href = link.getAttribute('href');
    const target = link.getAttribute('data-target');
    if (href && (href.startsWith('http') || href.startsWith('mailto:'))) {
      return; // External link
    }
    e.preventDefault();
    e.stopPropagation();
    let sectionId = target || (href ? href.substring(1) : 'welcome-page');
    if (sectionId === 'me' || sectionId === 'account') {
      sectionId = sectionId + '-page';
    }
    window.location.hash = sectionId;
  };
  const handleHashChange = () => {
    let hash = window.location.hash.substring(1);
    if (!hash || !document.getElementById(hash)) {
      hash = 'welcome-page';
    }
    document.querySelectorAll('main > section').forEach(section => {
      section.classList.remove('active');
    });
    const activeSection = document.getElementById(hash);
    if (activeSection) {
      activeSection.classList.add('active');
    }
    navLinks.forEach(link => {
      const linkTarget = link.getAttribute('data-target') || (link.getAttribute('href') ? link.getAttribute('href').substring(1) : '');
      const isActive = (linkTarget === hash) || (linkTarget === 'me' && hash === 'me-page');
      link.classList.toggle('active', isActive);
    });
  };
  document.body.addEventListener('click', handleNavClick);
  window.addEventListener('hashchange', handleHashChange);
  handleHashChange(); // Initial call
}

// Initialize the application when DOM is loaded
document.addEventListener("DOMContentLoaded", () => {
  initNavigation();
  initPersonalPlayer();
});
// Preload audio without playing it
function preloadAudio(src) {
return new Promise((resolve) => {
const audio = new Audio();
audio.preload = 'auto';
audio.crossOrigin = 'anonymous';
audio.src = src;
audio.load();
audio.oncanplaythrough = () => resolve(audio);
});
}
// Load and play a stream
async function loadProfileStream(uid) {
const audio = getOrCreateAudioElement();
if (!audio) return null;
// Always reset current stream and update audio source
currentStreamUid = uid;
audio.pause();
audio.src = '';
// Wait a moment to ensure the previous source is cleared
await new Promise(resolve => setTimeout(resolve, 50));
// Set new source with cache-busting timestamp
audio.src = `/audio/${encodeURIComponent(uid)}/stream.opus?t=${Date.now()}`;
// Try to play immediately
try {
await audio.play();
audioPlaying = true;
} catch (e) {
console.error('Play failed:', e);
audioPlaying = false;
}
// Show stream info
const streamInfo = document.getElementById("stream-info");
if (streamInfo) streamInfo.hidden = false;
// Update button state
updatePlayPauseButton();
return audio;
}
// Load and play a stream
async function loadProfileStream(uid) {
const audio = getOrCreateAudioElement();
if (!audio) return null;
// Hide playlist controls
const mePrevBtn = document.getElementById("me-prev");
if (mePrevBtn) mePrevBtn.style.display = "none";
const meNextBtn = document.getElementById("me-next");
if (meNextBtn) meNextBtn.style.display = "none";
// Handle navigation to "Your Stream"
const mePageLink = document.getElementById("show-me");
if (mePageLink) {
mePageLink.addEventListener("click", async (e) => {
e.preventDefault();
const uid = localStorage.getItem("uid");
if (!uid) return;
// Show loading state
const streamInfo = document.getElementById("stream-info");
if (streamInfo) {
streamInfo.hidden = false;
streamInfo.innerHTML = '<p>Loading stream...</p>';
}
try {
// Load the stream but don't autoplay
await loadProfileStream(uid);
// Update URL without triggering a full page reload
if (window.location.pathname !== '/') {
window.history.pushState({}, '', '/');
}
// Show the me-page section
const mePage = document.getElementById('me-page');
if (mePage) {
document.querySelectorAll('main > section').forEach(s => s.hidden = s.id !== 'me-page');
}
// Clear loading state
const streamInfo = document.getElementById('stream-info');
if (streamInfo) {
streamInfo.innerHTML = '';
}
} catch (error) {
console.error('Error loading stream:', error);
const streamInfo = document.getElementById('stream-info');
if (streamInfo) {
streamInfo.innerHTML = '<p>Error loading stream. Please try again.</p>';
}
}
});
}
// Always reset current stream and update audio source
currentStreamUid = uid;
audio.pause();
audio.src = '';
// Wait a moment to ensure the previous source is cleared
await new Promise(resolve => setTimeout(resolve, 50));
// Set new source with cache-busting timestamp
audio.src = `/audio/${encodeURIComponent(uid)}/stream.opus?t=${Date.now()}`;
// Try to play immediately
try {
await audio.play();
audioPlaying = true;
} catch (e) {
console.error('Play failed:', e);
audioPlaying = false;
}
// Show stream info
const streamInfo = document.getElementById("stream-info");
if (streamInfo) streamInfo.hidden = false;
// Update button state
updatePlayPauseButton();
return audio;
} }
// Export the function for use in other modules
window.loadProfileStream = loadProfileStream;
document.addEventListener("DOMContentLoaded", () => {
// Initialize play/pause button
const playPauseButton = document.getElementById('play-pause');
if (playPauseButton) {
// Set initial state
audioPlaying = false;
updatePlayPauseButton();
// Add event listener
playPauseButton.addEventListener('click', () => {
const audio = getMainAudio();
if (audio) {
if (audio.paused) {
audio.play();
} else {
audio.pause();
}
updatePlayPauseButton();
}
});
}
// Add bot protection for registration form
const registerForm = document.getElementById('register-form');
if (registerForm) {
registerForm.addEventListener('submit', (e) => {
const botTrap = e.target.elements.bot_trap;
if (botTrap && botTrap.value) {
e.preventDefault();
showToast('❌ Bot detected! Please try again.');
return false;
}
return true;
});
}
// Initialize navigation
document.querySelectorAll('#links a[data-target]').forEach(link => {
link.addEventListener('click', (e) => {
e.preventDefault();
const target = link.getAttribute('data-target');
// Only hide other sections when not opening #me-page
if (target !== 'me-page') fadeAllSections();
const section = document.getElementById(target);
if (section) {
section.hidden = false;
section.classList.add("slide-in");
section.scrollIntoView({ behavior: "smooth" });
}
const burger = document.getElementById('burger-toggle');
if (burger && burger.checked) burger.checked = false;
});
});
// Initialize profile player if valid session
setTimeout(runProfilePlayerIfSessionValid, 200);
window.addEventListener('popstate', () => {
setTimeout(runProfilePlayerIfSessionValid, 200);
});
});
// Initialize navigation
document.querySelectorAll('#links a[data-target]').forEach(link => {
link.addEventListener('click', (e) => {
e.preventDefault();
const target = link.getAttribute('data-target');
// Only hide other sections when not opening #me-page
if (target !== 'me-page') fadeAllSections();
const section = document.getElementById(target);
if (section) {
section.hidden = false;
section.classList.add("slide-in");
section.scrollIntoView({ behavior: "smooth" });
}
const burger = document.getElementById('burger-toggle');
if (burger && burger.checked) burger.checked = false;
});
});
// Initialize profile player if valid session
setTimeout(runProfilePlayerIfSessionValid, 200);
window.addEventListener('popstate', () => {
setTimeout(runProfilePlayerIfSessionValid, 200);
});
});

static/audio-player.js Normal file

@ -0,0 +1,636 @@
/**
* Audio Player Module
* A shared audio player implementation based on the working "Your Stream" player
*/
import { globalAudioManager } from './global-audio-manager.js';
export class AudioPlayer {
constructor() {
// Audio state
this.audioElement = null;
this.currentUid = null;
this.isPlaying = false;
this.currentButton = null;
this.audioUrl = '';
this.lastPlayTime = 0;
this.isLoading = false;
this.loadTimeout = null; // For tracking loading timeouts
this.retryCount = 0;
this.maxRetries = 3;
this.retryDelay = 3000; // 3 seconds
this.buffering = false;
this.bufferRetryTimeout = null;
this.lastLoadTime = 0;
this.minLoadInterval = 2000; // 2 seconds between loads
this.pendingLoad = false;
// Create a single audio element that we'll reuse
this.audioElement = new Audio();
this.audioElement.preload = 'none';
this.audioElement.crossOrigin = 'anonymous';
// Bind methods
this.loadAndPlay = this.loadAndPlay.bind(this);
this.stop = this.stop.bind(this);
this.cleanup = this.cleanup.bind(this);
this.handlePlayError = this.handlePlayError.bind(this);
this.handleStalled = this.handleStalled.bind(this);
this.handleWaiting = this.handleWaiting.bind(this);
this.handlePlaying = this.handlePlaying.bind(this);
this.handleEnded = this.handleEnded.bind(this);
// Set up event listeners
this.setupEventListeners();
// Register with global audio manager to handle stop requests from other players
globalAudioManager.addListener('personal', () => {
console.log('[audio-player] Received stop request from global audio manager');
this.stop();
});
}
/**
* Load and play audio for a specific UID
* @param {string} uid - The user ID for the audio stream
* @param {HTMLElement} button - The play/pause button element
*/
/**
* Validates that a UID is in the correct UUID format
* @param {string} uid - The UID to validate
* @returns {boolean} True if valid, false otherwise
*/
isValidUuid(uid) {
// UUID v4 format: xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx
const uuidRegex = /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;
return uuidRegex.test(uid);
}
/**
* Logs an error and updates the button state
* @param {HTMLElement} button - The button to update
* @param {string} message - Error message to log
*/
handleError(button, message) {
console.error(message);
if (button) {
this.updateButtonState(button, 'error');
}
}
async loadAndPlay(uid, button) {
const now = Date.now();
// Prevent rapid successive load attempts
if (this.pendingLoad || (now - this.lastLoadTime < this.minLoadInterval)) {
console.log('[AudioPlayer] Skipping duplicate load request');
return;
}
// Validate UID exists and is in correct format
if (!uid) {
this.handleError(button, 'No UID provided for audio playback');
return;
}
// For logging purposes
const requestId = Math.random().toString(36).substr(2, 8);
console.log(`[AudioPlayer] Load request ${requestId} for UID: ${uid}`);
this.pendingLoad = true;
this.lastLoadTime = now;
// If we're in the middle of loading, check if it's for the same UID
if (this.isLoading) {
// If same UID, ignore duplicate request
if (this.currentUid === uid) {
console.log(`[AudioPlayer] Already loading this UID, ignoring duplicate request: ${uid}`);
this.pendingLoad = false;
return;
}
// If different UID, queue the new request
console.log(`[AudioPlayer] Already loading, queuing request for UID: ${uid}`);
setTimeout(() => {
this.pendingLoad = false;
this.loadAndPlay(uid, button);
}, 500);
return;
}
// If we're in the middle of loading, check if it's for the same UID
if (this.isLoading) {
// If same UID, ignore duplicate request
if (this.currentUid === uid) {
console.log('Already loading this UID, ignoring duplicate request:', uid);
return;
}
// If different UID, queue the new request
console.log('Already loading, queuing request for UID:', uid);
setTimeout(() => this.loadAndPlay(uid, button), 500);
return;
}
// If already playing this stream, just toggle pause/play
if (this.currentUid === uid && this.audioElement) {
try {
if (this.isPlaying) {
console.log('Pausing current playback');
try {
this.audioElement.pause();
this.lastPlayTime = this.audioElement.currentTime;
this.isPlaying = false;
this.updateButtonState(button, 'paused');
} catch (pauseError) {
console.warn('Error pausing audio, continuing with state update:', pauseError);
this.isPlaying = false;
this.updateButtonState(button, 'paused');
}
} else {
console.log('Resuming playback from time:', this.lastPlayTime);
try {
// If we have a last play time, seek to it
if (this.lastPlayTime > 0) {
this.audioElement.currentTime = this.lastPlayTime;
}
await this.audioElement.play();
this.isPlaying = true;
this.updateButtonState(button, 'playing');
} catch (playError) {
console.error('Error resuming playback, reloading source:', playError);
// If resume fails, try reloading the source
this.currentUid = null; // Force reload of the source
return this.loadAndPlay(uid, button);
}
}
return; // Exit after handling pause/resume
} catch (error) {
console.error('Error toggling playback:', error);
this.updateButtonState(button, 'error');
return;
}
}
// If we get here, we're loading a new stream
this.isLoading = true;
this.currentUid = uid;
this.currentButton = button;
this.isPlaying = true;
this.updateButtonState(button, 'loading');
// Notify global audio manager that personal player is starting
globalAudioManager.startPlayback('personal', uid);
try {
// Only clean up if switching streams
if (this.currentUid !== uid) {
this.cleanup();
}
// Store the current button reference
this.currentButton = button;
this.currentUid = uid;
// Create a new audio element if we don't have one
if (!this.audioElement) {
this.audioElement = new Audio();
} else if (this.audioElement.readyState > 0) {
// If we already have a loaded source, just play it
try {
await this.audioElement.play();
this.isPlaying = true;
this.updateButtonState(button, 'playing');
return;
} catch (playError) {
console.warn('Error playing existing source, will reload:', playError);
// Continue to load a new source
}
}
// Clear any existing sources
while (this.audioElement.firstChild) {
this.audioElement.removeChild(this.audioElement.firstChild);
}
// Set the source URL with proper encoding and cache-busting timestamp
// Using the format: /audio/{uid}/stream.opus?t={timestamp}
// Only update timestamp if we're loading a different UID or after a retry
const timestamp = this.retryCount > 0 ? new Date().getTime() : this.lastLoadTime;
this.audioUrl = `/audio/${encodeURIComponent(uid)}/stream.opus?t=${timestamp}`;
console.log(`[AudioPlayer] Loading audio from URL: ${this.audioUrl} (attempt ${this.retryCount + 1}/${this.maxRetries})`);
console.log('Loading audio from URL:', this.audioUrl);
this.audioElement.src = this.audioUrl;
// Load the new source (don't await, let canplay handle it)
try {
this.audioElement.load();
// If load() doesn't throw, we'll wait for canplay event
} catch (e) {
// Ignore abort errors as they're expected during rapid toggling
if (e.name !== 'AbortError') {
console.error('Error loading audio source:', e);
this.isLoading = false;
this.updateButtonState(button, 'error');
}
}
// Reset the current time when loading a new source
this.audioElement.currentTime = 0;
this.lastPlayTime = 0;
// Set up error handling
this.audioElement.onerror = (e) => {
console.error('Audio element error:', e, this.audioElement.error);
this.isLoading = false;
this.updateButtonState(button, 'error');
};
// Handle when audio is ready to play
const onCanPlay = () => {
this.audioElement.removeEventListener('canplay', onCanPlay);
this.isLoading = false;
if (this.lastPlayTime > 0) {
this.audioElement.currentTime = this.lastPlayTime;
}
this.audioElement.play().then(() => {
this.isPlaying = true;
this.updateButtonState(button, 'playing');
}).catch(e => {
console.error('Error playing after load:', e);
this.updateButtonState(button, 'error');
});
};
// Define the error handler
const errorHandler = (e) => {
console.error('Audio element error:', e, this.audioElement.error);
this.isLoading = false;
this.updateButtonState(button, 'error');
};
// Define the play handler
const playHandler = () => {
// Clear any pending timeouts
if (this.loadTimeout) {
clearTimeout(this.loadTimeout);
this.loadTimeout = null;
}
this.audioElement.removeEventListener('canplay', playHandler);
this.isLoading = false;
if (this.lastPlayTime > 0) {
this.audioElement.currentTime = this.lastPlayTime;
}
this.audioElement.play().then(() => {
this.isPlaying = true;
this.updateButtonState(button, 'playing');
}).catch(e => {
console.error('Error playing after load:', e);
this.isPlaying = false;
this.updateButtonState(button, 'error');
});
};
// Add event listeners
this.audioElement.addEventListener('error', errorHandler, { once: true });
this.audioElement.addEventListener('canplay', playHandler, { once: true });
// Load and play the new source
try {
await this.audioElement.load();
// Don't await play() here, let the canplay handler handle it
// Set a timeout to handle cases where canplay doesn't fire
this.loadTimeout = setTimeout(() => {
if (this.isLoading) {
console.warn('Audio loading timed out for UID:', uid);
this.isLoading = false;
this.updateButtonState(button, 'error');
}
}, 10000); // 10 second timeout
} catch (e) {
console.error('Error loading audio:', e);
this.isLoading = false;
this.updateButtonState(button, 'error');
// Clear any pending timeouts
if (this.loadTimeout) {
clearTimeout(this.loadTimeout);
this.loadTimeout = null;
}
}
} catch (error) {
console.error('Error in loadAndPlay:', error);
// Only cleanup and show error if we're still on the same track
if (this.currentUid === uid) {
this.cleanup();
this.updateButtonState(button, 'error');
}
}
}
/**
* Stop playback and clean up resources
*/
stop() {
try {
if (this.audioElement) {
console.log('Stopping audio playback');
this.audioElement.pause();
this.lastPlayTime = this.audioElement.currentTime;
this.isPlaying = false;
// Notify global audio manager that personal player has stopped
globalAudioManager.stopPlayback('personal');
if (this.currentButton) {
this.updateButtonState(this.currentButton, 'paused');
}
}
} catch (error) {
console.error('Error stopping audio:', error);
// Don't throw, just log the error
}
}
/**
* Set up event listeners for the audio element
*/
setupEventListeners() {
if (!this.audioElement) return;
// Remove any existing listeners to prevent duplicates
this.audioElement.removeEventListener('error', this.handlePlayError);
this.audioElement.removeEventListener('stalled', this.handleStalled);
this.audioElement.removeEventListener('waiting', this.handleWaiting);
this.audioElement.removeEventListener('playing', this.handlePlaying);
this.audioElement.removeEventListener('ended', this.handleEnded);
// Add new listeners
this.audioElement.addEventListener('error', this.handlePlayError);
this.audioElement.addEventListener('stalled', this.handleStalled);
this.audioElement.addEventListener('waiting', this.handleWaiting);
this.audioElement.addEventListener('playing', this.handlePlaying);
this.audioElement.addEventListener('ended', this.handleEnded);
}
/**
* Handle play errors
*/
handlePlayError(event) {
console.error('[AudioPlayer] Playback error:', {
event: event.type,
error: this.audioElement.error,
currentTime: this.audioElement.currentTime,
readyState: this.audioElement.readyState,
networkState: this.audioElement.networkState,
src: this.audioElement.src
});
this.isPlaying = false;
this.buffering = false;
this.pendingLoad = false;
if (this.currentButton) {
this.updateButtonState(this.currentButton, 'error');
}
// Auto-retry logic
if (this.retryCount < this.maxRetries) {
this.retryCount++;
console.log(`Retrying playback (attempt ${this.retryCount}/${this.maxRetries})...`);
setTimeout(() => {
if (this.currentUid && this.currentButton) {
this.loadAndPlay(this.currentUid, this.currentButton);
}
}, this.retryDelay);
} else {
console.error('Max retry attempts reached');
this.retryCount = 0; // Reset for next time
}
}
/**
* Handle stalled audio (buffering issues)
*/
handleStalled() {
console.log('[AudioPlayer] Playback stalled, attempting to recover...');
this.buffering = true;
if (this.bufferRetryTimeout) {
clearTimeout(this.bufferRetryTimeout);
}
this.bufferRetryTimeout = setTimeout(() => {
if (this.buffering) {
console.log('[AudioPlayer] Buffer recovery timeout, attempting to reload...');
if (this.currentUid && this.currentButton) {
// Only retry if we're still supposed to be playing
if (this.isPlaying) {
this.retryCount++;
if (this.retryCount <= this.maxRetries) {
console.log(`[AudioPlayer] Retry ${this.retryCount}/${this.maxRetries} for UID: ${this.currentUid}`);
this.loadAndPlay(this.currentUid, this.currentButton);
} else {
console.error('[AudioPlayer] Max retry attempts reached');
this.retryCount = 0;
this.updateButtonState(this.currentButton, 'error');
}
}
}
}
}, 5000); // 5 second buffer recovery timeout
}
/**
* Handle waiting event (buffering)
*/
handleWaiting() {
console.log('Audio waiting for data...');
this.buffering = true;
if (this.currentButton) {
this.updateButtonState(this.currentButton, 'loading');
}
}
/**
* Handle playing event (playback started/resumed)
*/
handlePlaying() {
console.log('Audio playback started/resumed');
this.buffering = false;
this.retryCount = 0; // Reset retry counter on successful playback
if (this.bufferRetryTimeout) {
clearTimeout(this.bufferRetryTimeout);
this.bufferRetryTimeout = null;
}
if (this.currentButton) {
this.updateButtonState(this.currentButton, 'playing');
}
}
/**
* Handle ended event (playback completed)
*/
handleEnded() {
console.log('Audio playback ended');
this.isPlaying = false;
this.buffering = false;
if (this.currentButton) {
this.updateButtonState(this.currentButton, 'paused');
}
}
/**
* Clean up resources
*/
cleanup() {
// Clear any pending timeouts
if (this.loadTimeout) {
clearTimeout(this.loadTimeout);
this.loadTimeout = null;
}
if (this.bufferRetryTimeout) {
clearTimeout(this.bufferRetryTimeout);
this.bufferRetryTimeout = null;
}
// Update button state if we have a reference to the current button
if (this.currentButton) {
this.updateButtonState(this.currentButton, 'paused');
}
// Pause the audio and store the current time
if (this.audioElement) {
try {
// Remove event listeners to prevent memory leaks
this.audioElement.removeEventListener('error', this.handlePlayError);
this.audioElement.removeEventListener('stalled', this.handleStalled);
this.audioElement.removeEventListener('waiting', this.handleWaiting);
this.audioElement.removeEventListener('playing', this.handlePlaying);
this.audioElement.removeEventListener('ended', this.handleEnded);
try {
this.audioElement.pause();
this.lastPlayTime = this.audioElement.currentTime;
} catch (e) {
console.warn('Error pausing audio during cleanup:', e);
}
try {
// Clear any existing sources
while (this.audioElement.firstChild) {
this.audioElement.removeChild(this.audioElement.firstChild);
}
// Clear the source and reset the audio element
this.audioElement.removeAttribute('src');
try {
this.audioElement.load();
} catch (e) {
console.warn('Error in audio load during cleanup:', e);
}
} catch (e) {
console.warn('Error cleaning up audio sources:', e);
}
} catch (e) {
console.warn('Error during audio cleanup:', e);
}
}
// Reset state
this.currentUid = null;
this.currentButton = null;
this.audioUrl = '';
this.isPlaying = false;
this.buffering = false;
this.retryCount = 0;
// Notify global audio manager that personal player has stopped
globalAudioManager.stopPlayback('personal');
}
/**
* Update the state of a play/pause button
* @param {HTMLElement} button - The button to update
* @param {string} state - The state to set ('playing', 'paused', 'loading', 'error')
*/
updateButtonState(button, state) {
if (!button) return;
// Only update the current button's state
if (state === 'playing') {
// If this button is now playing, update all buttons
document.querySelectorAll('.play-pause-btn').forEach(btn => {
btn.classList.remove('playing', 'paused', 'loading', 'error');
if (btn === button) {
btn.classList.add('playing');
} else {
btn.classList.add('paused');
}
});
} else {
// For other states, just update the target button
button.classList.remove('playing', 'paused', 'loading', 'error');
if (state) {
button.classList.add(state);
}
}
// Update button icon and aria-label for the target button
const icon = button.querySelector('i');
if (icon) {
if (state === 'playing') {
icon.className = 'fas fa-pause';
button.setAttribute('aria-label', 'Pause');
} else {
icon.className = 'fas fa-play';
button.setAttribute('aria-label', 'Play');
}
}
}
}
// Create a singleton instance
export const audioPlayer = new AudioPlayer();
// Export utility functions for direct use
export function initAudioPlayer(container = document) {
// Set up event delegation for play/pause buttons
container.addEventListener('click', (e) => {
const playButton = e.target.closest('.play-pause-btn');
if (!playButton) return;
e.preventDefault();
e.stopPropagation();
const uid = playButton.dataset.uid;
if (!uid) return;
audioPlayer.loadAndPlay(uid, playButton);
});
// Set up event delegation for stop buttons if they exist
container.addEventListener('click', (e) => {
const stopButton = e.target.closest('.stop-btn');
if (!stopButton) return;
e.preventDefault();
e.stopPropagation();
audioPlayer.stop();
});
}
// Auto-initialize if this is the main module
if (typeof document !== 'undefined') {
document.addEventListener('DOMContentLoaded', () => {
initAudioPlayer();
});
}
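A rough usage sketch, not part of the commit: the delegated click handler above only needs a button carrying the `.play-pause-btn` class, a `data-uid` attribute, and an inner `<i>` icon. The UID value and the place the button is appended are illustrative assumptions.
// Hypothetical markup for the delegated play/pause handler; the module's own
// DOMContentLoaded hook (see above) already calls initAudioPlayer(), so no extra wiring is needed.
const btn = document.createElement('button');
btn.className = 'play-pause-btn paused';
btn.dataset.uid = 'listener@example.com';   // email-based UID, matching the /audio/{uid}/stream.opus URL scheme
btn.setAttribute('aria-label', 'Play');
btn.innerHTML = '<i class="fas fa-play"></i>';
document.body.appendChild(btn);             // any container inside `document` is covered by the default delegation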

688
static/auth-manager.js Normal file
View File

@ -0,0 +1,688 @@
/**
* Centralized Authentication Manager
*
* This module consolidates all authentication logic from auth.js, magic-login.js,
* and cleanup-auth.js into a single, maintainable module.
*/
import { showToast } from './toast.js';
class AuthManager {
constructor() {
this.DEBUG_AUTH_STATE = false;
this.AUTH_CHECK_DEBOUNCE = 1000; // 1 second
this.AUTH_CHECK_INTERVAL = 30000; // 30 seconds
this.CACHE_TTL = 5000; // 5 seconds
// Authentication state cache
this.authStateCache = {
timestamp: 0,
value: null,
ttl: this.CACHE_TTL
};
// Track auth check calls
this.lastAuthCheckTime = 0;
this.authCheckCounter = 0;
this.wasAuthenticated = null;
// Bind all methods that will be used as event handlers
this.checkAuthState = this.checkAuthState.bind(this);
this.handleMagicLoginRedirect = this.handleMagicLoginRedirect.bind(this);
this.logout = this.logout.bind(this);
this.deleteAccount = this.deleteAccount.bind(this);
this.handleStorageEvent = this.handleStorageEvent.bind(this);
this.handleVisibilityChange = this.handleVisibilityChange.bind(this);
// Initialize
this.initialize = this.initialize.bind(this);
}
/**
* Validate UID format - must be a valid email address
*/
validateUidFormat(uid) {
if (!uid || typeof uid !== 'string') {
// Debug messages disabled
return false;
}
// Email regex pattern - RFC 5322 compliant basic validation
const emailRegex = /^[a-zA-Z0-9.!#$%&'*+/=?^_`{|}~-]+@[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*$/;
const isValid = emailRegex.test(uid);
if (!isValid) {
// Debug messages disabled
} else {
// Debug messages disabled
}
return isValid;
}
/**
* Sanitize and validate UID - ensures consistent format
*/
sanitizeUid(uid) {
if (!uid || typeof uid !== 'string') {
// Debug messages disabled
return null;
}
// Trim whitespace and convert to lowercase
const sanitized = uid.trim().toLowerCase();
// Validate the sanitized UID
if (!this.validateUidFormat(sanitized)) {
// Debug messages disabled
return null;
}
// Debug messages disabled
return sanitized;
}
/**
* Check if current stored UID is valid and fix if needed
*/
validateStoredUid() {
const storedUid = localStorage.getItem('uid');
if (!storedUid) {
// Debug messages disabled
return null;
}
const sanitizedUid = this.sanitizeUid(storedUid);
if (!sanitizedUid) {
// Debug messages disabled
this.clearAuthState();
return null;
}
// Update stored UID if sanitization changed it
if (sanitizedUid !== storedUid) {
// Debug messages disabled
localStorage.setItem('uid', sanitizedUid);
// Update cookies as well
document.cookie = `uid=${sanitizedUid}; path=/; SameSite=Lax; Secure`;
}
return sanitizedUid;
}
/**
* Get cookie value by name
*/
getCookieValue(name) {
const value = `; ${document.cookie}`;
const parts = value.split(`; ${name}=`);
if (parts.length === 2) {
return parts.pop().split(';').shift();
}
return null;
}
/**
* Initialize the authentication manager
*/
async initialize() {
// Debug messages disabled
// Validate stored UID format and fix if needed
const validUid = this.validateStoredUid();
if (validUid) {
// Debug messages disabled
} else {
// Debug messages disabled
}
// Handle magic link login if present
await this.handleMagicLoginRedirect();
// Setup authentication state polling
this.setupAuthStatePolling();
// Setup event listeners
document.addEventListener('visibilitychange', this.handleVisibilityChange);
this.setupEventListeners();
// Debug messages disabled
}
/**
* Fetch user information from the server
*/
async fetchUserInfo() {
try {
// Get the auth token from cookies
const authToken = this.getCookieValue('authToken') || localStorage.getItem('authToken');
// Debug messages disabled
const headers = {
'Accept': 'application/json',
'Content-Type': 'application/json'
};
// Add Authorization header if we have a token
if (authToken) {
headers['Authorization'] = `Bearer ${authToken}`;
// Debug messages disabled
} else {
// Debug messages disabled
}
// Debug messages disabled
const response = await fetch('/api/me', {
method: 'GET',
credentials: 'include',
headers: headers
});
// Debug messages disabled
if (response.ok) {
const contentType = response.headers.get('content-type');
// Debug messages disabled
if (contentType && contentType.includes('application/json')) {
const userInfo = await response.json();
// Debug messages disabled
return userInfo;
} else {
const text = await response.text();
// Debug messages disabled
}
} else {
const errorText = await response.text();
// Debug messages disabled
}
return null;
} catch (error) {
// Debug messages disabled
return null;
}
}
/**
* Set authentication state in localStorage and cookies
*/
setAuthState(userEmail, username, authToken = null) {
// Debug messages disabled
// Validate and sanitize the UID (email)
const sanitizedUid = this.sanitizeUid(userEmail);
if (!sanitizedUid) {
// Debug messages disabled
throw new Error(`Invalid UID format: ${userEmail}. UID must be a valid email address.`);
}
// Validate username (basic check)
if (!username || typeof username !== 'string' || username.trim().length === 0) {
// Debug messages disabled
throw new Error(`Invalid username: ${username}. Username cannot be empty.`);
}
const sanitizedUsername = username.trim();
// Generate auth token if not provided
if (!authToken) {
authToken = 'token-' + Math.random().toString(36).substring(2, 15);
}
// Debug messages disabled
// Set localStorage for client-side access (not sent to server)
localStorage.setItem('uid', sanitizedUid); // Primary UID is email
localStorage.setItem('username', sanitizedUsername); // Username for display
localStorage.setItem('uid_time', Date.now().toString());
// Set cookies for server authentication (sent with requests)
document.cookie = `uid=${encodeURIComponent(sanitizedUid)}; path=/; SameSite=Lax`;
document.cookie = `authToken=${authToken}; path=/; SameSite=Lax; Secure`;
// Note: isAuthenticated is determined by presence of valid authToken, no need to duplicate
// Clear cache to force refresh
this.authStateCache.timestamp = 0;
}
/**
* Clear authentication state
*/
clearAuthState() {
// Debug messages disabled
// Clear localStorage (client-side data only)
const authKeys = ['uid', 'username', 'uid_time'];
authKeys.forEach(key => localStorage.removeItem(key));
// Clear cookies
document.cookie.split(';').forEach(cookie => {
const eqPos = cookie.indexOf('=');
const name = eqPos > -1 ? cookie.substr(0, eqPos).trim() : cookie.trim();
document.cookie = `${name}=;expires=Thu, 01 Jan 1970 00:00:00 GMT;path=/; SameSite=Lax`;
});
// Clear cache
this.authStateCache.timestamp = 0;
}
/**
* Check if user is currently authenticated
*/
isAuthenticated() {
const now = Date.now();
// Use cached value if still valid
if (this.authStateCache.timestamp > 0 &&
(now - this.authStateCache.timestamp) < this.authStateCache.ttl) {
return this.authStateCache.value;
}
// Check authentication state - simplified approach
const hasUid = !!(document.cookie.includes('uid=') || localStorage.getItem('uid'));
const hasAuthToken = !!document.cookie.includes('authToken=');
const isAuth = hasUid && hasAuthToken;
// Update cache
this.authStateCache.timestamp = now;
this.authStateCache.value = isAuth;
return isAuth;
}
/**
* Get current user data
*/
getCurrentUser() {
if (!this.isAuthenticated()) {
return null;
}
return {
uid: localStorage.getItem('uid'),
email: localStorage.getItem('uid'), // uid is the email
username: localStorage.getItem('username'),
authToken: this.getCookieValue('authToken') // authToken is in cookies
};
}
/**
* Handle magic link login redirect
*/
async handleMagicLoginRedirect() {
const params = new URLSearchParams(window.location.search);
// Handle secure token-based magic login only
const token = params.get('token');
if (token) {
// Debug messages disabled
// Clean up URL immediately
const url = new URL(window.location.href);
url.searchParams.delete('token');
window.history.replaceState({}, document.title, url.pathname + url.search);
await this.processTokenLogin(token);
return true;
}
return false;
}
/**
* Process token-based login
*/
async processTokenLogin(token) {
try {
// Debug messages disabled
const formData = new FormData();
formData.append('token', token);
// Debug messages disabled
const response = await fetch('/magic-login', {
method: 'POST',
body: formData,
});
// Debug messages disabled
// Handle successful token login response
const contentType = response.headers.get('content-type');
// Debug messages disabled
if (contentType && contentType.includes('application/json')) {
const data = await response.json();
// Debug messages disabled
if (data && data.success && data.user) {
// Debug messages disabled
// Use the user data and token from the response
const { email, username } = data.user;
const authToken = data.token; // Get token from JSON response
// Debug messages disabled
// Set auth state with the token from the response
this.setAuthState(email, username, authToken);
this.updateUIState(true);
await this.initializeUserSession(username, email);
showToast('✅ Login successful!');
this.navigateToProfile();
return;
} else {
// Debug messages disabled
throw new Error('Invalid user data received from server');
}
} else {
const text = await response.text();
// Debug messages disabled
throw new Error(`Unexpected response format: ${text || 'No details available'}`);
}
} catch (error) {
// Debug messages disabled
showToast(`Login failed: ${error.message}`, 'error');
}
}
/**
* Initialize user session after login
*/
async initializeUserSession(username, userEmail) {
// Initialize dashboard
if (window.initDashboard) {
await window.initDashboard(username);
} else {
// Debug messages disabled
}
// Fetch and display file list
if (window.fetchAndDisplayFiles) {
// Debug messages disabled
await window.fetchAndDisplayFiles(userEmail);
} else {
// Debug messages disabled
}
}
/**
* Navigate to user profile
*/
navigateToProfile() {
if (window.showOnly) {
// Debug messages disabled
window.showOnly('me-page');
} else if (window.location.hash !== '#me-page') {
window.location.hash = '#me-page';
}
}
/**
* Update UI state based on authentication
*/
updateUIState(isAuthenticated) {
if (isAuthenticated) {
document.body.classList.add('authenticated');
document.body.classList.remove('guest');
// Note: Removed auto-loading of profile stream to prevent auto-play on page load
// Profile stream will only play when user clicks the play button
} else {
document.body.classList.remove('authenticated');
document.body.classList.add('guest');
}
this.updateAccountDeletionVisibility(isAuthenticated);
// Force reflow
void document.body.offsetHeight;
}
/**
* Update account deletion section visibility
*/
updateAccountDeletionVisibility(isAuthenticated) {
const accountDeletionSection = document.getElementById('account-deletion-section');
const deleteAccountFromPrivacy = document.getElementById('delete-account-from-privacy');
if (isAuthenticated) {
this.showElement(accountDeletionSection);
this.showElement(deleteAccountFromPrivacy);
} else {
this.hideElement(accountDeletionSection);
this.hideElement(deleteAccountFromPrivacy);
}
}
showElement(element) {
if (element) {
element.style.display = 'block';
element.style.visibility = 'visible';
}
}
hideElement(element) {
if (element) {
element.style.display = 'none';
}
}
/**
* Check authentication state with caching and debouncing
*/
checkAuthState(force = false) {
const now = Date.now();
// Debounce frequent calls
if (!force && (now - this.lastAuthCheckTime) < this.AUTH_CHECK_DEBOUNCE) {
return this.authStateCache.value;
}
this.lastAuthCheckTime = now;
this.authCheckCounter++;
if (this.DEBUG_AUTH_STATE) {
// Debug messages disabled
}
const isAuthenticated = this.isAuthenticated();
// Only update UI if state changed or forced
if (force || this.wasAuthenticated !== isAuthenticated) {
if (this.DEBUG_AUTH_STATE) {
// Debug messages disabled
}
// Handle logout detection
if (this.wasAuthenticated === true && isAuthenticated === false) {
// Debug messages disabled
this.logout();
return false;
}
this.updateUIState(isAuthenticated);
this.wasAuthenticated = isAuthenticated;
}
return isAuthenticated;
}
/**
* Setup authentication state polling
*/
setupAuthStatePolling() {
// Initial check
this.checkAuthState(true);
// Periodic checks
setInterval(() => {
this.checkAuthState(!document.hidden);
}, this.AUTH_CHECK_INTERVAL);
// Storage event listener
window.addEventListener('storage', this.handleStorageEvent);
// Visibility change listener
document.addEventListener('visibilitychange', this.handleVisibilityChange);
}
/**
* Handle storage events
*/
handleStorageEvent(e) {
if (['isAuthenticated', 'authToken', 'uid'].includes(e.key)) {
this.checkAuthState(true);
}
}
/**
* Handle visibility change events
*/
handleVisibilityChange() {
if (!document.hidden) {
this.checkAuthState(true);
}
}
/**
* Setup event listeners
*/
setupEventListeners() {
document.addEventListener('click', (e) => {
// Delete account buttons
if (e.target.closest('#delete-account') || e.target.closest('#delete-account-from-privacy')) {
this.deleteAccount(e);
return;
}
});
}
/**
* Delete user account
*/
async deleteAccount(e) {
if (e) e.preventDefault();
if (this.deleteAccount.inProgress) return;
if (!confirm('Are you sure you want to delete your account?\nThis action is permanent.')) {
return;
}
this.deleteAccount.inProgress = true;
const deleteBtn = e?.target.closest('button');
const originalText = deleteBtn?.textContent;
if (deleteBtn) {
deleteBtn.disabled = true;
deleteBtn.textContent = 'Deleting...';
}
try {
const response = await fetch('/api/delete-account', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
credentials: 'include',
body: JSON.stringify({ uid: localStorage.getItem('uid') })
});
if (!response.ok) {
const errorData = await response.json().catch(() => ({ detail: 'Failed to delete account.' }));
throw new Error(errorData.detail);
}
showToast('Account deleted successfully.', 'success');
this.logout();
} catch (error) {
// Debug messages disabled
showToast(error.message, 'error');
} finally {
this.deleteAccount.inProgress = false;
if (deleteBtn) {
deleteBtn.disabled = false;
deleteBtn.textContent = originalText;
}
}
}
/**
* Logout user
*/
logout() {
// Debug messages disabled
this.clearAuthState();
window.location.href = '/';
}
/**
* Cleanup authentication state (for migration/debugging)
*/
async cleanupAuthState(manualEmail = null) {
// Debug messages disabled
let userEmail = manualEmail;
// Try to get email from server if not provided
if (!userEmail) {
const userInfo = await this.fetchUserInfo();
userEmail = userInfo?.email;
if (!userEmail) {
userEmail = prompt('Please enter your email address (e.g., oib@chello.at):');
if (!userEmail || !userEmail.includes('@')) {
// Debug messages disabled
return { success: false, error: 'Invalid email' };
}
}
}
if (!userEmail) {
// Debug messages disabled
return { success: false, error: 'No email available' };
}
// Get current username for reference
const currentUsername = localStorage.getItem('username') || localStorage.getItem('uid');
// Clear and reset authentication state
this.clearAuthState();
this.setAuthState(userEmail, currentUsername || userEmail);
// Debug messages disabled
// Debug messages disabled
// Refresh if on profile page
if (window.location.hash === '#me-page') {
window.location.reload();
}
return {
email: userEmail,
username: currentUsername,
success: true
};
}
/**
* Destroy the authentication manager
*/
destroy() {
window.removeEventListener('storage', this.handleStorageEvent);
document.removeEventListener('visibilitychange', this.handleVisibilityChange);
}
}
// Create and export singleton instance
const authManager = new AuthManager();
// Export for global access
window.authManager = authManager;
export default authManager;
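A minimal consumer sketch, not part of the commit: other modules can import the singleton and rely on initialize(), isAuthenticated() and getCurrentUser() as defined above. A successful magic-link login is assumed to have stored the email-based UID, username and authToken as described in setAuthState(); the logging is illustrative.
// Hypothetical consumer of the AuthManager singleton shown above.
import authManager from './auth-manager.js';

document.addEventListener('DOMContentLoaded', async () => {
  await authManager.initialize();                        // processes ?token= magic links and starts state polling
  if (authManager.isAuthenticated()) {
    const { email, username } = authManager.getCurrentUser();
    console.log(`Signed in as ${username} <${email}>`);  // illustrative only
  } else {
    console.log('Browsing as guest');
  }
});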

static/auth-ui.js
View File

@ -1,5 +1,5 @@
// static/auth-ui.js — navigation link and back-button handlers
-import { showOnly } from './router.js';
import { showSection } from './nav.js';
// Data-target navigation (e.g., at #links)
export function initNavLinks() {
@ -10,7 +10,7 @@ export function initNavLinks() {
if (!a || !linksContainer.contains(a)) return;
e.preventDefault();
const target = a.dataset.target;
-if (target) showOnly(target);
if (target) showSection(target);
const burger = document.getElementById('burger-toggle');
if (burger && burger.checked) burger.checked = false;
});
@ -22,7 +22,7 @@ export function initBackButtons() {
btn.addEventListener('click', e => {
e.preventDefault();
const target = btn.dataset.back;
-if (target) showOnly(target);
if (target) showSection(target);
});
});
}

31
static/auth.js Normal file
View File

@ -0,0 +1,31 @@
/**
* Simplified Authentication Module
*
* This file now uses the centralized AuthManager for all authentication logic.
* Legacy code has been replaced with the new consolidated approach.
*/
import authManager from './auth-manager.js';
import { loadProfileStream } from './personal-player.js';
// Initialize authentication manager when DOM is ready
document.addEventListener('DOMContentLoaded', async () => {
// Debug messages disabled
// Initialize the centralized auth manager
await authManager.initialize();
// Make loadProfileStream available globally for auth manager
window.loadProfileStream = loadProfileStream;
// Debug messages disabled
});
// Export auth manager for other modules to use
export { authManager };
// Legacy compatibility - expose some functions globally
window.getCurrentUser = () => authManager.getCurrentUser();
window.isAuthenticated = () => authManager.isAuthenticated();
window.logout = () => authManager.logout();
window.cleanupAuthState = (email) => authManager.cleanupAuthState(email);

38
static/cleanup-auth.js Normal file
View File

@ -0,0 +1,38 @@
/**
* Simplified Authentication Cleanup Module
*
* This file now uses the centralized AuthManager for authentication cleanup.
* The cleanup logic has been moved to the AuthManager.
*/
import authManager from './auth-manager.js';
/**
* Clean up authentication state - now delegated to AuthManager
* This function is kept for backward compatibility.
*/
async function cleanupAuthState(manualEmail = null) {
console.log('[CLEANUP] Starting authentication state cleanup via AuthManager...');
// Delegate to the centralized AuthManager
return await authManager.cleanupAuthState(manualEmail);
}
// Auto-run cleanup if this script is loaded directly
if (typeof window !== 'undefined') {
// Export function for manual use
window.cleanupAuthState = cleanupAuthState;
// Auto-run if URL contains cleanup parameter
const urlParams = new URLSearchParams(window.location.search);
if (urlParams.get('cleanup') === 'auth') {
cleanupAuthState().then(result => {
if (result && result.success) {
console.log('[CLEANUP] Auto-cleanup completed successfully');
}
});
}
}
// Export for ES6 modules
export { cleanupAuthState };
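For reference, both entry points already exist in the module above; a hedged example of the manual path is sketched below, with the email value being a placeholder.
// Manual invocation, e.g. from the browser console; resolves with { success, email, username }.
window.cleanupAuthState('listener@example.com')
  .then(result => console.log('cleanup result:', result));
// Alternatively, loading any page with ?cleanup=auth appended runs the same routine automatically.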

208
static/css/base.css Normal file
View File

@ -0,0 +1,208 @@
/* Base styles and resets */
:root {
/* Colors */
--color-primary: #4a90e2;
--color-primary-dark: #2a6fc9;
--color-text: #333;
--color-text-light: #666;
--color-bg: #f8f9fa;
--color-border: #e9ecef;
--color-white: #fff;
--color-black: #000;
/* Typography */
--font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, sans-serif;
--font-size-base: 1rem;
--line-height-base: 1.5;
/* Spacing */
--spacing-xs: 0.25rem;
--spacing-sm: 0.5rem;
--spacing-md: 1rem;
--spacing-lg: 1.5rem;
--spacing-xl: 2rem;
/* Border radius */
--border-radius-sm: 4px;
--border-radius-md: 8px;
--border-radius-lg: 12px;
/* Transitions */
--transition-base: all 0.2s ease;
--transition-slow: all 0.3s ease;
}
/* Reset and base styles */
*,
*::before,
*::after {
box-sizing: border-box;
margin: 0;
padding: 0;
}
html {
height: 100%;
font-size: 16px;
-webkit-text-size-adjust: 100%;
-webkit-tap-highlight-color: transparent;
}
body {
margin: 0;
min-height: 100%;
font-family: var(--font-family);
line-height: var(--line-height-base);
color: var(--color-text);
background: var(--color-bg);
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
/* Main content */
.container {
max-width: 1200px;
margin: 0 auto;
padding: 6rem 1.5rem 2rem; /* Add top padding to account for fixed header */
min-height: calc(100vh - 200px); /* Ensure footer stays at bottom */
}
/* Sections */
section {
margin: 2rem 0;
padding: 2rem;
background: rgba(255, 255, 255, 0.05);
border-radius: 8px;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
}
section h2 {
color: var(--color-primary);
margin-top: 0;
margin-bottom: 1.5rem;
font-size: 2rem;
}
section p {
color: var(--color-text);
line-height: 1.6;
margin-bottom: 1.5rem;
}
.main-heading {
font-size: 2.5rem;
margin: 0 0 2rem 0;
color: var(--color-text);
font-weight: 700;
line-height: 1.2;
display: flex;
align-items: center;
justify-content: center;
gap: 1rem;
text-align: center;
}
.main-heading .mic-icon {
display: inline-flex;
animation: pulse 2s infinite;
transform-origin: center;
}
@keyframes pulse {
0% { transform: scale(1); }
50% { transform: scale(1.2); }
100% { transform: scale(1); }
}
/* Typography */
h1, h2, h3, h4, h5, h6 {
margin-top: 0;
margin-bottom: var(--spacing-md);
font-weight: 600;
line-height: 1.2;
}
p {
margin-top: 0;
margin-bottom: var(--spacing-md);
}
a {
color: var(--color-primary);
text-decoration: none;
transition: var(--transition-base);
}
a:hover {
color: var(--color-primary-dark);
text-decoration: underline;
}
/* Images */
img {
max-width: 100%;
height: auto;
vertical-align: middle;
border-style: none;
}
/* Lists */
ul, ol {
padding-left: var(--spacing-lg);
margin-bottom: var(--spacing-md);
}
/* Loading animation */
.app-loading {
position: fixed;
top: 0;
left: 0;
right: 0;
bottom: 0;
display: flex;
justify-content: center;
align-items: center;
background: var(--color-white);
z-index: 9999;
transition: opacity var(--transition-slow);
text-align: center;
padding: 2rem;
color: var(--color-text);
}
.app-loading > div:first-child {
margin-bottom: 1rem;
font-size: 2rem;
}
.app-loading.hidden {
opacity: 0;
pointer-events: none;
}
.app-content {
opacity: 1;
transition: opacity var(--transition-slow);
}
/* This class can be used for initial fade-in if needed */
.app-content.initial-load {
opacity: 0;
}
.app-content.loaded {
opacity: 1;
}
/* Utility classes */
.sr-only {
position: absolute;
width: 1px;
height: 1px;
padding: 0;
margin: -1px;
overflow: hidden;
clip: rect(0, 0, 0, 0);
white-space: nowrap;
border: 0;
}

69
static/css/colors.css Normal file
View File

@ -0,0 +1,69 @@
/*
* Color System Documentation
*
* This file documents the color variables used throughout the application.
* All colors should be defined as CSS variables in :root, and these variables
* should be used consistently across all CSS and JavaScript files.
*/
:root {
/* Primary Colors */
--primary-color: #4a6fa5; /* Main brand color */
--primary-hover: #3a5a8c; /* Darker shade for hover states */
/* Text Colors */
--text-color: #f0f0f0; /* Main text color */
--text-muted: #888; /* Secondary text, less important info */
--text-light: #999; /* Lighter text for disabled states */
--text-lighter: #bbb; /* Very light text, e.g., placeholders */
/* Background Colors */
--background: #1a1a1a; /* Main background color */
--surface: #2a2a2a; /* Surface color for cards, panels, etc. */
--code-bg: #222; /* Background for code blocks */
/* Border Colors */
--border: #444; /* Default border color */
--border-light: #555; /* Lighter border */
--border-lighter: #666; /* Even lighter border */
/* Status Colors */
--success: #2e8b57; /* Success messages, confirmations */
--warning: #ff6600; /* Warnings, important notices */
--error: #ff4444; /* Error messages, destructive actions */
--error-hover: #ff6666; /* Hover state for error buttons */
--info: #1e90ff; /* Informational messages, links */
--link-hover: #74c0fc; /* Hover state for links */
/* Transitions */
--transition: all 0.2s ease; /* Default transition */
}
/*
* Usage Examples:
*
* .button {
* background-color: var(--primary-color);
* color: var(--text-color);
* border: 1px solid var(--border);
* transition: var(--transition);
* }
*
* .button:hover {
* background-color: var(--primary-hover);
* }
*
* .error-message {
* color: var(--error);
* background-color: color-mix(in srgb, var(--error) 10%, transparent);
* border-left: 3px solid var(--error);
* }
*/
/*
* Accessibility Notes:
* - Ensure text has sufficient contrast with its background
* - Use semantic color names that describe the purpose, not the color
* - Test with color blindness simulators for accessibility
* - Maintain consistent color usage throughout the application
*/

View File

@ -0,0 +1,289 @@
/* File upload and list styles */
#user-upload-area {
border: 2px dashed var(--border);
border-radius: 8px;
padding: 2rem;
text-align: center;
margin: 1rem 0;
cursor: pointer;
transition: all 0.2s ease-in-out;
background-color: var(--surface);
}
#user-upload-area:hover,
#user-upload-area.highlight {
border-color: var(--primary);
background-color: rgba(var(--primary-rgb), 0.05);
}
#user-upload-area p {
margin: 0;
color: var(--text-secondary);
}
#file-list {
list-style: none;
padding: 0;
margin: 1.5rem 0;
}
#file-list li {
display: flex;
flex-direction: column;
padding: 0.75rem 1rem;
margin: 0.5rem 0;
background-color: var(--surface);
border-radius: 6px;
border: 1px solid var(--border);
transition: all 0.2s ease-in-out;
}
#file-list li:hover {
box-shadow: 0 2px 6px rgba(0, 0, 0, 0.1);
transform: translateY(-1px);
}
#file-list li.no-files,
#file-list li.loading-message,
#file-list li.error-message {
display: block;
text-align: center;
color: var(--text-muted);
padding: 2rem 1.5rem;
background-color: transparent;
border: 2px dashed var(--border);
margin: 1rem 0;
border-radius: 8px;
font-size: 1.1em;
}
#file-list li.loading-message {
color: var(--primary);
font-style: italic;
}
#file-list li.error-message {
color: var(--error);
border-color: var(--error);
}
#file-list li.error-message .login-link {
color: var(--primary);
text-decoration: none;
font-weight: bold;
margin-left: 0.3em;
}
#file-list li.error-message .login-link:hover {
text-decoration: underline;
}
#file-list li.no-files:hover {
background-color: rgba(var(--primary-rgb), 0.05);
border-color: var(--primary);
transform: none;
box-shadow: none;
}
.file-item {
width: 100%;
}
.file-info {
display: flex;
align-items: flex-start;
flex: 1;
min-width: 0;
flex-direction: column;
gap: 0.25rem;
}
.file-header {
display: flex;
align-items: flex-start;
justify-content: space-between;
width: 100%;
gap: 0.75rem;
}
.file-name {
color: var(--text-color);
word-break: break-word;
overflow-wrap: break-word;
line-height: 1.3;
flex: 1;
font-size: 0.95em;
}
.file-size {
color: var(--text-muted);
font-size: 0.8em;
white-space: nowrap;
flex-shrink: 0;
font-style: italic;
align-self: flex-start;
}
.delete-file {
align-self: center;
background: none;
border: none;
font-size: 1.1em;
cursor: pointer;
padding: 0.3rem 0.5rem;
border-radius: 4px;
transition: all 0.2s ease;
color: var(--text-muted);
margin-top: 0.2rem;
}
.delete-file:hover {
background-color: var(--error);
color: white;
transform: scale(1.1);
}
.file-actions {
display: flex;
gap: 0.5rem;
margin-left: 1rem;
flex-shrink: 0;
}
.download-button,
.delete-button {
display: inline-flex;
align-items: center;
gap: 0.5rem;
padding: 0.4rem 0.8rem;
border-radius: 4px;
font-size: 0.85rem;
cursor: pointer;
transition: all 0.2s ease;
text-decoration: none;
border: 1px solid transparent;
}
.download-button {
background-color: var(--primary);
color: white;
}
.download-button:hover {
background-color: var(--primary-hover);
transform: translateY(-1px);
}
.delete-button {
background-color: transparent;
color: var(--error);
border-color: var(--error);
}
.delete-button:hover {
background-color: rgba(var(--error-rgb), 0.1);
}
.button-icon {
font-size: 1em;
}
.button-text {
display: none;
}
/* Show text on larger screens */
@media (min-width: 640px) {
.button-text {
display: inline;
}
.download-button,
.delete-button {
padding: 0.4rem 1rem;
}
}
/* Responsive adjustments */
@media (max-width: 480px) {
#file-list li {
flex-direction: column;
align-items: flex-start;
gap: 0.75rem;
}
.file-actions {
width: 100%;
margin-left: 0;
justify-content: flex-end;
}
.file-name {
max-width: 100%;
}
}
#file-list li a {
color: var(--primary);
text-decoration: none;
flex-grow: 1;
margin-right: 1rem;
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
}
#file-list li a:hover {
text-decoration: underline;
}
.file-size {
color: var(--text-secondary);
font-size: 0.9em;
margin-left: 0.5rem;
}
.delete-file {
background: none;
border: none;
color: var(--error);
cursor: pointer;
padding: 0.25rem 0.5rem;
border-radius: 4px;
transition: background-color 0.2s;
}
.delete-file:hover {
background-color: rgba(var(--error-rgb), 0.1);
}
/* Loading state */
#file-list.loading {
opacity: 0.7;
pointer-events: none;
}
/* Mobile optimizations */
@media (max-width: 768px) {
#user-upload-area {
padding: 1.5rem 1rem;
}
#file-list li {
padding: 0.5rem;
font-size: 0.9rem;
}
.file-size {
display: block;
margin-left: 0;
margin-top: 0.25rem;
}
}
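A rough sketch, not part of the commit, of markup matching the classes styled above. The production renderer is window.displayUserFiles(), which is referenced elsewhere in this changeset but not shown here, so the exact structure and the file object's fields are assumptions.
// Hypothetical list-item builder for #file-list; class names mirror the stylesheet above,
// while the `name`/`sizeLabel` fields are made up for illustration.
function renderFileItemSketch(file) {
  const li = document.createElement('li');
  li.innerHTML = `
    <div class="file-item">
      <div class="file-info">
        <div class="file-header">
          <span class="file-name">${file.name}</span>
          <span class="file-size">${file.sizeLabel}</span>
        </div>
      </div>
      <button class="delete-file" aria-label="Delete file">🗑️</button>
    </div>`;
  return li;
}
document.getElementById('file-list')?.appendChild(
  renderFileItemSketch({ name: 'stream.opus', sizeLabel: '3.2 MB' })
);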

View File

@ -0,0 +1,80 @@
/* Footer styles */
footer {
background: #2c3e50;
color: var(--text-color);
padding: 2rem 0;
margin-top: 3rem;
width: 100%;
}
.footer-content {
max-width: 1200px;
margin: 0 auto;
padding: 0 1.5rem;
display: flex;
flex-direction: column;
align-items: center;
text-align: center;
}
.footer-links {
display: flex;
flex-wrap: wrap;
justify-content: center;
gap: 1rem;
margin-top: 1rem;
}
.footer-links a {
color: var(--text-color);
text-decoration: none;
transition: color 0.2s;
}
.footer-links a:hover,
.footer-links a:focus {
color: var(--info);
text-decoration: underline;
}
.separator {
color: var(--text-muted);
margin: 0 0.25rem;
}
.footer-hint {
margin-top: 1rem;
font-size: 0.9rem;
color: var(--text-light);
}
.footer-hint a {
color: var(--info);
text-decoration: none;
}
.footer-hint a:hover,
.footer-hint a:focus {
text-decoration: underline;
}
/* Responsive adjustments */
@media (max-width: 767px) {
footer {
padding: 1.5rem 1rem;
}
.footer-links {
flex-direction: column;
gap: 0.5rem;
}
.separator {
display: none;
}
.footer-hint {
font-size: 0.85rem;
line-height: 1.5;
}
}

View File

@ -0,0 +1,149 @@
/* Header and navigation styles */
header {
width: 100%;
background: rgba(33, 37, 41, 0.95);
backdrop-filter: blur(10px);
-webkit-backdrop-filter: blur(10px);
box-shadow: 0 2px 10px rgba(0, 0, 0, 0.1);
position: fixed;
top: 0;
left: 0;
z-index: 1000;
padding: 0.5rem 0;
}
.header-content {
max-width: 1200px;
margin: 0 auto;
padding: 0 1.5rem;
display: flex;
justify-content: space-between;
align-items: center;
position: relative;
}
/* Logo */
.logo {
color: white;
font-size: 1.5rem;
font-weight: bold;
text-decoration: none;
padding: 0.5rem 0;
}
.logo:hover {
text-decoration: none;
opacity: 0.9;
}
/* Navigation */
.nav-wrapper {
display: flex;
align-items: center;
height: 100%;
}
/* Menu toggle button */
.menu-toggle {
background: none;
border: none;
color: white;
font-size: 1.5rem;
cursor: pointer;
padding: 0.5rem;
display: none; /* Hidden by default, shown on mobile */
}
/* Navigation list */
.nav-list {
display: flex;
list-style: none;
margin: 0;
padding: 0;
gap: 1rem;
align-items: center;
}
.nav-item {
margin: 0;
}
.nav-link {
color: white;
text-decoration: none;
padding: 0.5rem 1rem;
border-radius: 4px;
transition: background-color 0.2s, color 0.2s;
display: block;
}
.nav-link:hover,
.nav-link:focus {
background: rgba(255, 255, 255, 0.1);
text-decoration: none;
color: var(--text-color);
}
/* Active navigation item */
.nav-link.active {
background: rgba(255, 255, 255, 0.2);
font-weight: 500;
}
/* Mobile menu */
@media (max-width: 767px) {
.menu-toggle {
display: flex;
align-items: center;
justify-content: center;
width: 2.5rem;
height: 2.5rem;
background: transparent;
border: none;
color: white;
font-size: 1.5rem;
cursor: pointer;
z-index: 1001;
}
.nav-wrapper {
position: fixed;
top: 0;
right: -100%;
width: 80%;
max-width: 300px;
height: 100vh;
background: rgba(33, 37, 41, 0.98);
padding: 5rem 1.5rem 2rem;
transition: right 0.3s ease-in-out;
z-index: 1000;
overflow-y: auto;
display: block;
}
.nav-wrapper.active {
right: 0;
}
.nav-list {
display: flex;
flex-direction: column;
gap: 0.5rem;
padding: 0;
}
.nav-item {
width: 100%;
}
.nav-link {
display: block;
padding: 0.75rem 1rem;
border-radius: 4px;
}
.nav-link:hover,
.nav-link:focus {
background: rgba(255, 255, 255, 0.15);
}
}
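A small sketch, not part of the commit, of the toggle behaviour these rules imply: clicking `.menu-toggle` is expected to add or remove `.active` on `.nav-wrapper`. The real handler ships with nav.js, which is not included in this excerpt.
// Hypothetical mobile-menu wiring for the classes styled above.
const menuToggle = document.querySelector('.menu-toggle');
const navWrapper = document.querySelector('.nav-wrapper');
if (menuToggle && navWrapper) {
  menuToggle.addEventListener('click', () => {
    navWrapper.classList.toggle('active');   // .nav-wrapper.active slides the drawer in from the right
  });
}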


116
static/css/section.css Normal file
View File

@ -0,0 +1,116 @@
/* section.css - Centralized visibility control with class-based states */
/* Base section visibility - all sections hidden by default */
main > section {
display: none;
position: absolute;
overflow: hidden;
clip: rect(0, 0, 0, 0);
white-space: nowrap;
border: 0;
opacity: 0;
}
/* Active section styling - only visibility properties */
main > section.active {
display: block;
position: relative;
overflow: visible;
clip: auto;
white-space: normal;
opacity: 1;
}
/* Authentication-based visibility classes */
.guest-only { display: block; }
.auth-only {
display: none;
}
/* Show auth-only elements when authenticated */
body.authenticated .auth-only {
display: block;
}
/* Ensure me-page and its direct children are visible when me-page is active */
#me-page:not([hidden]) > .auth-only,
#me-page:not([hidden]) > section,
#me-page:not([hidden]) > article,
#me-page:not([hidden]) > div {
display: block !important;
visibility: visible !important;
opacity: 1 !important;
}
/* Show auth-only elements when authenticated */
body.authenticated .auth-only {
display: block !important;
visibility: visible !important;
}
/* Account deletion section - improved width and formatting */
#account-deletion {
margin: 2.5rem auto;
padding: 2.5rem;
background: rgba(255, 255, 255, 0.05);
border-radius: 10px;
box-shadow: 0 3px 6px rgba(0, 0, 0, 0.15);
max-width: 600px;
line-height: 1.6;
color: var(--text-color);
}
#account-deletion h3 {
color: var(--color-primary);
margin-top: 0;
margin-bottom: 1.5rem;
font-size: 1.5rem;
}
#account-deletion p {
color: var(--color-text);
line-height: 1.6;
margin-bottom: 1.5rem;
}
#account-deletion ul {
margin: 1rem 0 1.5rem 1.5rem;
padding: 0;
color: var(--color-text);
}
#account-deletion .centered-container {
text-align: center;
margin-top: 2rem;
}
#delete-account-from-privacy {
background-color: #ff4d4f;
color: white;
border: none;
padding: 0.75rem 1.5rem;
border-radius: 4px;
cursor: pointer;
font-weight: 600;
font-size: 1rem;
transition: background-color 0.2s ease;
display: inline-flex;
align-items: center;
gap: 0.5rem;
}
#delete-account-from-privacy:hover {
background-color: #ff6b6b;
text-decoration: none;
}
/* Hide guest-only elements when authenticated */
body.authenticated .guest-only {
display: none !important;
visibility: hidden !important;
}
.always-visible {
display: block !important;
}
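A sketch of the visibility contract this stylesheet encodes, assuming the showSection() helper from nav.js (not shown here) works roughly like this: exactly one `main > section` carries `.active`, and authentication-dependent blocks are driven by a class on `<body>`.
// Hypothetical illustration of the class-based states used by section.css.
function showSectionSketch(id) {
  document.querySelectorAll('main > section').forEach(sec => {
    sec.classList.toggle('active', sec.id === id);   // only the matching section stays visible
  });
}
// Auth-dependent visibility is a single body-level switch:
document.body.classList.add('authenticated');        // reveals .auth-only, hides .guest-only
showSectionSketch('me-page');                        // 'me-page' id appears elsewhere in this changeset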

dashboard.js
View File

@ -1,5 +1,7 @@
import { showToast } from "./toast.js";
import { showSection } from './nav.js';
// Utility function to get cookie value by name
function getCookie(name) {
const value = `; ${document.cookie}`;
const parts = value.split(`; ${name}=`);
@ -8,186 +10,830 @@ function getCookie(name) {
}
// dashboard.js — toggle guest vs. user dashboard and reposition streams link
-async function initDashboard() {
-// New dashboard toggling logic
// Global state
let isLoggingOut = false;
let dashboardInitialized = false;
async function handleLogout(event) {
// Debug messages disabled
// Prevent multiple simultaneous logout attempts
if (isLoggingOut) {
// Debug messages disabled
return;
}
isLoggingOut = true;
// Prevent default button behavior
if (event) {
event.preventDefault();
event.stopPropagation();
}
try {
// Get auth token before we clear it
const authToken = localStorage.getItem('authToken');
// 1. Clear all client-side state first (most important)
// Debug messages disabled
// Clear localStorage and sessionStorage
const storageKeys = [
'uid', 'uid_time', 'last_page',
'isAuthenticated', 'authToken', 'user', 'token', 'sessionid', 'sessionId'
];
storageKeys.forEach(key => {
localStorage.removeItem(key);
sessionStorage.removeItem(key);
});
// Get all current cookies for debugging
const allCookies = document.cookie.split(';');
// Debug messages disabled
// Clear ALL cookies (aggressive approach)
allCookies.forEach(cookie => {
const [name] = cookie.trim().split('=');
if (name) {
const cookieName = name.trim();
// Debug messages disabled
// Try multiple clearing strategies to ensure cookies are removed
const clearStrategies = [
`${cookieName}=; expires=Thu, 01 Jan 1970 00:00:00 UTC; path=/; SameSite=Lax;`,
`${cookieName}=; expires=Thu, 01 Jan 1970 00:00:00 UTC; path=/; domain=${window.location.hostname}; SameSite=Lax;`,
`${cookieName}=; expires=Thu, 01 Jan 1970 00:00:00 UTC; path=/; domain=.${window.location.hostname}; SameSite=Lax;`,
`${cookieName}=; max-age=0; path=/; SameSite=Lax;`,
`${cookieName}=; max-age=0; path=/; domain=${window.location.hostname}; SameSite=Lax;`
];
clearStrategies.forEach(strategy => {
document.cookie = strategy;
});
}
});
// Verify cookies are cleared
const remainingCookies = document.cookie.split(';').filter(c => c.trim());
// Debug messages disabled
// Update UI state
document.body.classList.remove('authenticated', 'logged-in');
document.body.classList.add('guest');
// 2. Try to invalidate server session (non-blocking)
if (authToken) {
try {
// Debug messages disabled
const controller = new AbortController();
const timeoutId = setTimeout(() => controller.abort(), 2000);
const response = await fetch('/api/logout', {
method: 'POST',
credentials: 'include',
signal: controller.signal,
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${authToken}`
},
});
clearTimeout(timeoutId);
// Debug messages disabled
} catch (error) {
// Debug messages disabled
}
}
// 3. Final redirect
// Debug messages disabled
window.location.href = '/?logout=' + Date.now();
} catch (error) {
// Debug messages disabled
if (window.showToast) {
showToast('Logout failed. Please try again.');
}
// Even if there's an error, force redirect to clear state
window.location.href = '/?logout=error-' + Date.now();
} finally {
isLoggingOut = false;
}
}
// Delete account function
async function handleDeleteAccount() {
try {
const uid = localStorage.getItem('uid');
if (!uid) {
showToast('No user session found. Please log in again.');
return;
}
// Show confirmation dialog
const confirmed = confirm('⚠️ WARNING: This will permanently delete your account and all your data. This action cannot be undone.\n\nAre you sure you want to delete your account?');
if (!confirmed) {
return; // User cancelled the deletion
}
// Show loading state
const deleteButton = document.getElementById('delete-account-from-privacy');
const originalText = deleteButton.textContent;
deleteButton.disabled = true;
deleteButton.textContent = 'Deleting...';
// Call the delete account endpoint
const response = await fetch(`/api/delete-account`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({ uid }),
});
const result = await response.json();
if (response.ok) {
showToast('Account deleted successfully');
// Use comprehensive logout logic to clear all cookies and storage
console.log('🧹 Account deleted - clearing all authentication data...');
// Clear all authentication-related data from localStorage
const keysToRemove = [
'uid', 'uid_time', 'last_page',
'isAuthenticated', 'authToken', 'user', 'token', 'sessionid'
];
keysToRemove.forEach(key => {
if (localStorage.getItem(key)) {
console.log(`Removing localStorage key: ${key}`);
localStorage.removeItem(key);
}
});
// Clear sessionStorage completely
sessionStorage.clear();
console.log('Cleared sessionStorage');
// Clear all cookies using multiple strategies
const clearCookie = (cookieName) => {
const clearStrategies = [
`${cookieName}=; expires=Thu, 01 Jan 1970 00:00:00 UTC; path=/; SameSite=Lax;`,
`${cookieName}=; expires=Thu, 01 Jan 1970 00:00:00 UTC; path=/; domain=${window.location.hostname}; SameSite=Lax;`,
`${cookieName}=; expires=Thu, 01 Jan 1970 00:00:00 UTC; path=/; domain=.${window.location.hostname}; SameSite=Lax;`,
`${cookieName}=; max-age=0; path=/; SameSite=Lax;`,
`${cookieName}=; max-age=0; path=/; domain=${window.location.hostname}; SameSite=Lax;`
];
clearStrategies.forEach(strategy => {
document.cookie = strategy;
});
console.log(`Cleared cookie: ${cookieName}`);
};
// Clear all cookies by setting them to expire in the past
document.cookie.split(';').forEach(cookie => {
const [name] = cookie.trim().split('=');
if (name) {
clearCookie(name.trim());
}
});
// Also specifically clear known authentication cookies
const authCookies = ['authToken', 'isAuthenticated', 'sessionId', 'uid', 'token'];
authCookies.forEach(clearCookie);
// Log remaining cookies for verification
console.log('Remaining cookies after deletion cleanup:', document.cookie);
// Update UI state
document.body.classList.remove('authenticated');
document.body.classList.add('guest');
// Redirect to home page
setTimeout(() => {
window.location.href = '/';
}, 1000);
} else {
throw new Error(result.detail || 'Failed to delete account');
}
} catch (error) {
console.error('Delete account failed:', error);
showToast(`Failed to delete account: ${error.message}`);
// Reset button state
const deleteButton = document.getElementById('delete-account-from-privacy');
if (deleteButton) {
deleteButton.disabled = false;
deleteButton.textContent = '🗑️ Delete Account';
}
}
}
// Debug function to check element visibility and styles
function debugElementVisibility(elementId) {
const el = document.getElementById(elementId);
if (!el) {
console.error(`[DEBUG] Element ${elementId} not found`);
return {};
}
const style = window.getComputedStyle(el);
return {
id: elementId,
exists: true,
display: style.display,
visibility: style.visibility,
opacity: style.opacity,
hidden: el.hidden,
classList: Array.from(el.classList),
parentDisplay: el.parentElement ? window.getComputedStyle(el.parentElement).display : 'no-parent',
parentVisibility: el.parentElement ? window.getComputedStyle(el.parentElement).visibility : 'no-parent',
rect: el.getBoundingClientRect()
}
}
// Make updateQuotaDisplay available globally
window.updateQuotaDisplay = updateQuotaDisplay;
/**
* Initialize the dashboard and handle authentication state
*/
async function initDashboard(uid = null) {
// Debug messages disabled
try {
const guestDashboard = document.getElementById('guest-dashboard');
const userDashboard = document.getElementById('user-dashboard');
const userUpload = document.getElementById('user-upload-area');
const logoutButton = document.getElementById('logout-button');
const deleteAccountButton = document.getElementById('delete-account-from-privacy');
const fileList = document.getElementById('file-list');
-// Hide all by default
// Only attach event listeners once to prevent duplicates
if (!dashboardInitialized) {
if (logoutButton) {
logoutButton.addEventListener('click', handleLogout);
}
// Delete account button is handled by auth.js delegated event listener
// Removed duplicate event listener to prevent double confirmation dialogs
dashboardInitialized = true;
}
const effectiveUid = uid || getCookie('uid') || localStorage.getItem('uid');
const isAuthenticated = !!effectiveUid;
if (isAuthenticated) {
document.body.classList.add('authenticated');
document.body.classList.remove('guest-mode');
if (userDashboard) userDashboard.style.display = 'block';
if (userUpload) userUpload.style.display = 'block';
if (guestDashboard) guestDashboard.style.display = 'none';
-if (userDashboard) userDashboard.style.display = 'none';
-if (userUpload) userUpload.style.display = 'none';
-const uid = getCookie('uid');
-if (!uid) {
-// Guest view: only nav
-if (guestDashboard) guestDashboard.style.display = '';
if (window.fetchAndDisplayFiles) {
// Use email-based UID for file operations if available, fallback to effectiveUid
const fileOperationUid = localStorage.getItem('uid') || effectiveUid; // uid is now email-based
// Debug messages disabled
await window.fetchAndDisplayFiles(fileOperationUid);
}
} else {
document.body.classList.remove('authenticated');
document.body.classList.add('guest-mode');
if (guestDashboard) guestDashboard.style.display = 'block';
if (userDashboard) userDashboard.style.display = 'none';
if (userUpload) userUpload.style.display = 'none';
-const mePage = document.getElementById('me-page');
-if (mePage) mePage.style.display = 'none';
-return;
if (fileList) {
fileList.innerHTML = `<li>Please <a href="/#login" class="login-link">log in</a> to view your files.</li>`;
}
}
} catch (e) {
console.error('Dashboard initialization failed:', e);
const guestDashboard = document.getElementById('guest-dashboard');
const userDashboard = document.getElementById('user-dashboard');
if (userDashboard) userDashboard.style.display = 'none';
if (guestDashboard) guestDashboard.style.display = 'block';
document.body.classList.remove('authenticated');
}
} }
// Delete file function is defined below with more complete implementation
// Helper function to format file size
function formatFileSize(bytes) {
if (bytes === 0) return '0 Bytes';
const k = 1024;
const sizes = ['Bytes', 'KB', 'MB', 'GB'];
const i = Math.floor(Math.log(bytes) / Math.log(k));
return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i];
}
// Function to fetch and display user's uploaded files
async function fetchAndDisplayFiles(uid) {
const fileList = document.getElementById('file-list');
if (!fileList) {
// Debug messages disabled
return;
}
// Debug messages disabled
fileList.innerHTML = '<li class="loading-message">Loading your files...</li>';
// Prepare headers with auth token if available
const authToken = localStorage.getItem('authToken');
const headers = {
'Accept': 'application/json',
'Content-Type': 'application/json'
};
if (authToken) {
headers['Authorization'] = `Bearer ${authToken}`;
}
// Debug messages disabled
try {
-const res = await fetch(`/me/${uid}`);
-if (!res.ok) throw new Error('Not authorized');
-const data = await res.json();
// The backend should handle authentication via session cookies
// We include the auth token in headers if available, but don't rely on it for auth
// Debug messages disabled
const response = await fetch(`/user-files/${uid}`, {
method: 'GET',
credentials: 'include', // Important: include cookies for session auth
headers: headers
});
-// Logged-in view
-// Restore links section and show-me link
-const linksSection = document.getElementById('links');
-if (linksSection) linksSection.style.display = '';
-const showMeLink = document.getElementById('show-me');
-if (showMeLink && showMeLink.parentElement) showMeLink.parentElement.style.display = '';
-// Show me-page for logged-in users
-const mePage = document.getElementById('me-page');
-if (mePage) mePage.style.display = '';
-// Ensure upload area is visible if last_page was me-page
-const userUpload = document.getElementById('user-upload-area');
-if (userUpload && localStorage.getItem('last_page') === 'me-page') {
-// userUpload visibility is now only controlled by nav.js SPA logic
-}
-// Remove guest warning if present
-const guestMsg = document.getElementById('guest-warning-msg');
-if (guestMsg && guestMsg.parentNode) guestMsg.parentNode.removeChild(guestMsg);
-userDashboard.style.display = '';
-// Set audio source
-const meAudio = document.getElementById('me-audio');
-if (meAudio && uid) {
-meAudio.src = `/audio/${encodeURIComponent(uid)}/stream.opus`;
-}
-// Update quota
-const quotaBar = document.getElementById('quota-bar');
-const quotaText = document.getElementById('quota-text');
-if (quotaBar) quotaBar.value = data.quota;
-if (quotaText) quotaText.textContent = `${data.quota} MB used`;
-// Ensure Streams link remains in nav, not moved
-// (No action needed if static)
// Debug messages disabled
// Debug messages disabled
// Get response as text first to handle potential JSON parsing errors
const responseText = await response.text();
// Debug messages disabled
// Parse the JSON response
let responseData = {};
if (responseText && responseText.trim() !== '') {
try {
responseData = JSON.parse(responseText);
// Debug messages disabled
} catch (e) {
-console.warn('Dashboard init error, treating as guest:', e);
-userUpload.style.display = '';
-userDashboard.style.display = 'none';
-const registerLink = document.getElementById('guest-login');
-const streamsLink = document.getElementById('guest-streams');
-if (registerLink && streamsLink) {
-registerLink.parentElement.insertAdjacentElement('afterend', streamsLink.parentElement);
// Debug messages disabled
// Debug messages disabled
// If we have a non-JSON response but the status is 200, try to handle it
if (response.ok) {
// Debug messages disabled
} else {
throw new Error(`Invalid JSON response from server: ${e.message}`);
}
}
} else {
// Debug messages disabled
}
// Note: Authentication is handled by the parent component
// We'll just handle the response status without clearing auth state
if (response.ok) {
// Check if the response has the expected format
if (!responseData || !Array.isArray(responseData.files)) {
// Debug messages disabled
fileList.innerHTML = '<li>Error: Invalid response from server</li>';
return;
}
const files = responseData.files;
// Debug messages disabled
if (files.length === 0) {
fileList.innerHTML = '<li class="no-files">No files uploaded yet.</li>';
return;
}
// Clear the loading message
fileList.innerHTML = '';
// Use the new global function to render the files
window.displayUserFiles(uid, files);
} else {
// Handle non-OK responses
if (response.status === 401) {
// Parent component will handle authentication state
fileList.innerHTML = `
<li class="error-message">
Please <a href="/#login" class="login-link">log in</a> to view your files.
</li>`;
} else {
fileList.innerHTML = `
<li class="error-message">
Error loading files (${response.status}). Please try again later.
</li>`;
}
// Debug messages disabled
}
} catch (error) {
// Debug messages disabled
const fileList = document.getElementById('file-list');
if (fileList) {
fileList.innerHTML = `
<li class="error-message">
Error loading files: ${error.message || 'Unknown error'}
</li>`;
}
}
}
// Function to update the quota display
async function updateQuotaDisplay(uid) {
// Debug messages disabled
try {
const authToken = localStorage.getItem('authToken');
const headers = {
'Accept': 'application/json',
'Content-Type': 'application/json'
};
if (authToken) {
headers['Authorization'] = `Bearer ${authToken}`;
}
// Debug messages disabled
// Fetch user info which includes quota
const response = await fetch(`/me/${uid}`, {
method: 'GET',
credentials: 'include',
headers: headers
});
// Debug messages disabled
if (response.ok) {
const userData = await response.json();
// Debug messages disabled
// Update the quota display
const quotaText = document.getElementById('quota-text');
const quotaBar = document.getElementById('quota-bar');
// Debug messages disabled
// Debug messages disabled
if (quotaText && userData.quota) {
const usedMB = (userData.quota.used_bytes / (1024 * 1024)).toFixed(2);
const maxMB = (userData.quota.max_bytes / (1024 * 1024)).toFixed(2);
const percentage = userData.quota.percentage || 0;
// Debug messages disabled
const quotaDisplayText = `${usedMB} MB of ${maxMB} MB (${percentage}%)`;
quotaText.textContent = quotaDisplayText;
// Debug messages disabled
if (quotaBar) {
quotaBar.value = percentage;
// Debug messages disabled
}
} else {
// Debug messages disabled
}
} else {
// Debug messages disabled
}
} catch (error) {
// Debug messages disabled
}
}
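The exact /me/{uid} payload is not shown in this diff; the sketch below only reflects the fields read above (quota.used_bytes, quota.max_bytes, quota.percentage) and uses made-up numbers:

// Hypothetical response shape for /me/{uid} (sketch only, field names taken from the reads above)
const exampleMeResponse = {
  quota: {
    used_bytes: 52428800,    // 50 MB
    max_bytes: 104857600,    // 100 MB
    percentage: 50           // written straight into the <progress> element
  }
};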
// Make fetchAndDisplayFiles globally accessible
window.fetchAndDisplayFiles = fetchAndDisplayFiles;
// Function to handle file deletion
async function deleteFile(uid, fileName, listItem, displayName = '') {
const fileToDelete = displayName || fileName;
if (!confirm(`Are you sure you want to delete "${fileToDelete}"?`)) {
return;
}
// Show loading state
if (listItem) {
listItem.style.opacity = '0.6';
listItem.style.pointerEvents = 'none';
const deleteButton = listItem.querySelector('.delete-file');
if (deleteButton) {
deleteButton.disabled = true;
deleteButton.innerHTML = '<span class="button-icon">⏳</span><span class="button-text">Deleting...</span>';
}
}
try {
if (!uid) {
throw new Error('User not authenticated. Please log in again.');
}
// Debug messages disabled
const authToken = localStorage.getItem('authToken');
const headers = { 'Content-Type': 'application/json' };
if (authToken) {
headers['Authorization'] = `Bearer ${authToken}`;
}
// Use the provided UID in the URL
const response = await fetch(`/uploads/${uid}/${encodeURIComponent(fileName)}`, {
method: 'DELETE',
headers: headers,
credentials: 'include'
});
if (!response.ok) {
const errorData = await response.json().catch(() => ({}));
throw new Error(errorData.detail || `HTTP error! status: ${response.status}`);
}
// Remove the file from the UI immediately
if (listItem && listItem.parentNode) {
listItem.parentNode.removeChild(listItem);
}
// Show success message
showToast(`Successfully deleted "${fileToDelete}"`, 'success');
// If the file list is now empty, show a message
const fileList = document.getElementById('file-list');
if (fileList && fileList.children.length === 0) {
fileList.innerHTML = '<li class="no-files">No files uploaded yet.</li>';
}
} catch (error) {
// Debug messages disabled
showToast(`Error deleting "${fileToDelete}": ${error.message}`, 'error');
// Reset the button state if there was an error
if (listItem) {
listItem.style.opacity = '';
listItem.style.pointerEvents = '';
const deleteButton = listItem.querySelector('.delete-file');
if (deleteButton) {
deleteButton.disabled = false;
deleteButton.innerHTML = '🗑️';
}
}
}
}
// Initialize file upload functionality
function initFileUpload() {
const uploadArea = document.getElementById('user-upload-area');
const fileInput = document.getElementById('fileInputUser');
if (!uploadArea || !fileInput) {
// Debug messages disabled
return;
}
// Handle click on upload area
uploadArea.addEventListener('click', () => {
fileInput.click();
});
// Handle file selection
fileInput.addEventListener('change', async (e) => {
const file = e.target.files[0];
if (!file) return;
// Check file size (100MB limit)
if (file.size > 100 * 1024 * 1024) {
showToast('File is too large. Maximum size is 100MB.', 'error');
return;
}
// Show loading state
const originalText = uploadArea.innerHTML;
uploadArea.innerHTML = 'Uploading...';
try {
const formData = new FormData();
formData.append('file', file);
// Get UID from localStorage (parent UI ensures we're authenticated)
const uid = localStorage.getItem('uid');
formData.append('uid', uid);
// Proceed with the upload
const response = await fetch('/upload', {
method: 'POST',
body: formData,
credentials: 'include', // Include cookies for authentication
headers: {
'Accept': 'application/json' // Explicitly accept JSON response
}
});
if (!response.ok) {
const error = await response.text();
throw new Error(error || 'Upload failed');
}
const result = await response.json();
// Refresh file list
if (window.fetchAndDisplayFiles) {
window.fetchAndDisplayFiles(uid);
}
} catch (error) {
// Debug messages disabled
showToast(`Upload failed: ${error.message}`, 'error');
} finally {
// Reset file input and restore upload area text
fileInput.value = '';
uploadArea.innerHTML = originalText;
}
});
// Handle drag and drop
['dragenter', 'dragover', 'dragleave', 'drop'].forEach(eventName => {
uploadArea.addEventListener(eventName, preventDefaults, false);
});
function preventDefaults(e) {
e.preventDefault();
e.stopPropagation();
}
['dragenter', 'dragover'].forEach(eventName => {
uploadArea.addEventListener(eventName, highlight, false);
});
['dragleave', 'drop'].forEach(eventName => {
uploadArea.addEventListener(eventName, unhighlight, false);
});
function highlight() {
uploadArea.classList.add('highlight');
}
function unhighlight() {
uploadArea.classList.remove('highlight');
}
// Handle dropped files
uploadArea.addEventListener('drop', (e) => {
const dt = e.dataTransfer;
const files = dt.files;
if (files.length) {
fileInput.files = files;
const event = new Event('change');
fileInput.dispatchEvent(event);
}
});
}
// Main initialization when the DOM is fully loaded
document.addEventListener('DOMContentLoaded', async () => {
// Initialize dashboard components
await initDashboard(); // initFileUpload is called from within initDashboard
// Update quota display if user is logged in
const uid = localStorage.getItem('uid');
if (uid) {
updateQuotaDisplay(uid);
}
// Delegated event listener for clicks on the document
document.addEventListener('click', (e) => {
// Logout Button
if (e.target.closest('#logout-button')) {
e.preventDefault();
handleLogout(e);
return;
}
// Delete File Button
const deleteButton = e.target.closest('.delete-file');
if (deleteButton) {
e.preventDefault();
e.stopPropagation();
const listItem = deleteButton.closest('.file-item');
if (!listItem) return;
const uid = localStorage.getItem('uid');
if (!uid) {
showToast('You need to be logged in to delete files', 'error');
// Debug messages disabled
return;
}
const fileName = deleteButton.getAttribute('data-filename');
const displayName = deleteButton.getAttribute('data-original-name') || fileName;
deleteFile(uid, fileName, listItem, displayName);
}
});
// Make dashboard functions available globally
window.fetchAndDisplayFiles = fetchAndDisplayFiles;
window.initDashboard = initDashboard;
// Login/Register (guest)
const regForm = document.getElementById('register-form');
if (regForm) {
regForm.addEventListener('submit', async (e) => {
e.preventDefault();
const formData = new FormData(regForm);
const submitButton = regForm.querySelector('button[type="submit"]');
const originalButtonText = submitButton.textContent;
try {
// Disable button during submission
submitButton.disabled = true;
submitButton.textContent = 'Sending...';
const res = await fetch('/register', {
method: 'POST',
body: formData,
headers: {
'Accept': 'application/json'
}
});
let data;
const contentType = res.headers.get('content-type');
try {
if (contentType && contentType.includes('application/json')) {
data = await res.json();
} else {
const text = await res.text();
data = { detail: text };
}
if (res.ok) {
showToast('Check your email for a magic login link!', 'success');
// Clear the form on success
regForm.reset();
} else {
showToast(`Error: ${data.detail || 'Unknown error occurred'}`, 'error');
// Debug messages disabled
}
} catch (parseError) {
console.error('Error parsing response:', parseError);
showToast('Error processing the response. Please try again.', 'error');
}
} catch (err) {
console.error('Network error:', err);
showToast('Network error. Please check your connection and try again.', 'error');
} finally {
// Re-enable button
submitButton.disabled = false;
submitButton.textContent = originalButtonText;
}
});
}
// All navigation is now handled by the global click and hashchange listeners in nav.js.
// The legacy setupPageNavigation function and manual nav link handlers have been removed.
});
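For manual testing it can help to replay the same request the submit handler builds. The field names email, user and bot_trap come from the register form markup in index.html; the values below are hypothetical, and the snippet assumes a browser console where top-level await is available:

// Sketch: exercise POST /register from the browser console
const fd = new FormData();
fd.append('email', 'test@example.net');  // hypothetical address
fd.append('user', 'testuser');           // hypothetical username
fd.append('bot_trap', '');               // honeypot field, left empty
const res = await fetch('/register', { method: 'POST', body: fd, headers: { 'Accept': 'application/json' } });
console.log(res.status, await res.text());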
// Handle drag and drop
const uploadArea = document.getElementById('upload-area');
if (uploadArea) {
['dragenter', 'dragover', 'dragleave', 'drop'].forEach(eventName => {
uploadArea.addEventListener(eventName, preventDefaults, false);
});
['dragenter', 'dragover'].forEach(eventName => {
uploadArea.addEventListener(eventName, highlight, false);
});
['dragleave', 'drop'].forEach(eventName => {
uploadArea.addEventListener(eventName, unhighlight, false);
});
function preventDefaults(e) {
e.preventDefault();
e.stopPropagation();
}
function highlight() {
uploadArea.classList.add('highlight');
}
function unhighlight() {
uploadArea.classList.remove('highlight');
}
// Handle dropped files
uploadArea.addEventListener('drop', (e) => {
const dt = e.dataTransfer;
const files = dt.files;
if (files.length) {
const fileInput = document.getElementById('file-input');
fileInput.files = files;
const event = new Event('change');
fileInput.dispatchEvent(event);
}
});
}

231
static/desktop.css Normal file
View File

@ -0,0 +1,231 @@
/* Desktop-specific styles for screens 960px and wider */
@media (min-width: 960px) {
:root {
--content-max-width: 800px;
--content-padding: 1.25rem;
--section-spacing: 1.5rem;
}
html {
background-color: #111 !important;
background-image:
repeating-linear-gradient(
45deg,
rgba(188, 183, 107, 0.1) 0,
rgba(188, 183, 107, 0.1) 1px,
transparent 1px,
transparent 20px
),
repeating-linear-gradient(
-45deg,
rgba(188, 183, 107, 0.1) 0,
rgba(188, 183, 107, 0.1) 1px,
transparent 1px,
transparent 20px
) !important;
background-size: 40px 40px !important;
background-repeat: repeat !important;
background-attachment: fixed !important;
min-height: 100% !important;
}
body {
background: transparent !important;
min-height: 100vh !important;
display: flex;
flex-direction: column;
}
/* Main content container */
main {
flex: 1;
width: 100%;
max-width: var(--content-max-width);
margin: 0 auto;
padding: 0 var(--content-padding);
box-sizing: border-box;
}
/* Ensure h2 in legal pages matches other pages */
#privacy-page > article > h2:first-child,
#imprint-page > article > h2:first-child {
margin-top: 0;
padding-top: 0;
}
/* Streams Page Specific Styles */
#streams-page section {
width: 100%;
max-width: var(--content-max-width);
margin: 0 auto;
padding: 2rem;
box-sizing: border-box;
}
.stream-card {
margin-bottom: 1rem;
background: var(--surface);
border-radius: 8px;
box-shadow: 0 2px 6px rgba(0, 0, 0, 0.1);
transition: transform 0.2s ease, box-shadow 0.2s ease;
}
.stream-card:last-child {
margin-bottom: 0;
}
.stream-card:hover {
transform: translateY(-2px);
box-shadow: 0 4px 12px rgba(0, 0, 0, 0.15);
}
.stream-card .card-content {
padding: 1.25rem 1.5rem;
}
/* Section styles */
section {
width: 100%;
max-width: var(--content-max-width);
margin: 0 auto var(--section-spacing);
background: rgba(26, 26, 26, 0.9);
border: 1px solid rgba(255, 255, 255, 0.05);
border-radius: 10px;
padding: 2rem;
box-shadow: 0 4px 20px rgba(0, 0, 0, 0.2);
transition: transform 0.2s ease, box-shadow 0.2s ease;
box-sizing: border-box;
}
section:hover {
box-shadow: 0 6px 24px rgba(0, 0, 0, 0.25);
}
/* Navigation */
nav.dashboard-nav {
padding: 1rem 0;
margin-bottom: 2rem;
background: rgba(0, 0, 0, 0.7);
backdrop-filter: blur(5px);
display: block;
}
/* Desktop navigation visibility */
nav.dashboard-nav {
display: block;
}
/* Show desktop navigation */
section#links {
display: block;
}
/* Hide mobile navigation elements */
#burger-label,
#burger-toggle {
display: none !important;
}
/* Dashboard navigation */
#guest-dashboard,
#user-dashboard {
display: flex;
gap: 1rem;
}
nav.dashboard-nav a {
padding: 0.5rem 1rem;
margin: 0 0.5em;
border-radius: 4px;
transition: background-color 0.2s ease;
}
nav.dashboard-nav a:hover {
background-color: rgba(255, 255, 255, 0.1);
}
/* Form elements */
input[type="email"],
input[type="text"],
input[type="password"] {
width: 100%;
max-width: 400px;
padding: 0.75rem;
margin: 0.5rem 0;
border: 1px solid #444;
border-radius: 4px;
background: #2a2a2a;
color: #f0f0f0;
}
/* Buttons */
button,
.button {
padding: 0.75rem 1.5rem;
border: none;
border-radius: 4px;
background: #4a6fa5;
color: white;
cursor: pointer;
transition: background-color 0.2s ease;
}
button:hover,
.button:hover {
background: #5a8ad4;
}
/* Global article styles */
main > section > article,
#stream-page > article,
#stream-page #stream-list > li .stream-player {
max-width: 600px;
margin: 2em auto 2em auto;
padding: 2em;
background: #1e1e1e;
border: 1px solid #333;
border-radius: 8px;
transition: all 0.2s ease;
box-sizing: border-box;
}
/* Add top margin to all stream players except the first one */
#stream-page #stream-list > li:not(:first-child) .stream-player {
margin-top: 2px;
}
/* Stream player styles */
#stream-page #stream-list > li {
list-style: none;
margin: 0;
padding: 0;
border: none;
background: transparent;
}
#stream-page #stream-list {
padding: 0;
margin: 0 auto;
max-width: 600px;
width: 100%;
}
/* Stream player specific overrides can be added here if needed in the future */
/* Hover states moved to style.css for consistency */
/* Stream list desktop styles */
#stream-list {
max-width: 600px;
margin: 0 auto;
}
/* User upload area - matches article styling */
#user-upload-area {
max-width: 600px;
width: 100%;
margin: 2rem auto;
box-sizing: border-box;
}
}

220
static/file-display.js Normal file
View File

@ -0,0 +1,220 @@
// This function is responsible for rendering the list of files to the DOM.
// It is globally accessible via window.displayUserFiles.
window.displayUserFiles = function(uid, files) {
const fileList = document.getElementById('file-list');
if (!fileList) {
// Debug messages disabled
return;
}
if (!files || files.length === 0) {
fileList.innerHTML = '<li>You have no uploaded files yet.</li>';
return;
}
const fragment = document.createDocumentFragment();
const displayedFiles = new Set();
files.forEach(file => {
// Use original_name for display, stored_name for operations.
let displayName = file.original_name || file.stored_name || 'Unnamed File';
const storedFileName = file.stored_name || file.original_name;
// No UUID pattern replacement: always show the original_name from backend.
// Skip if no valid identifier is found or if it's a duplicate.
if (!storedFileName || displayedFiles.has(storedFileName)) {
return;
}
displayedFiles.add(storedFileName);
const listItem = document.createElement('li');
const fileUrl = `/user-uploads/${uid}/${encodeURIComponent(storedFileName)}`;
const fileSize = file.size ? (file.size / 1024 / 1024).toFixed(2) + ' MB' : 'N/A';
let fileIcon = '🎵'; // Default icon
const fileExt = displayName.split('.').pop().toLowerCase();
if (['mp3', 'wav', 'ogg', 'flac', 'm4a'].includes(fileExt)) {
fileIcon = '🎵';
} else if (['jpg', 'jpeg', 'png', 'gif', 'svg'].includes(fileExt)) {
fileIcon = '🖼️';
} else if (['pdf', 'doc', 'docx', 'txt'].includes(fileExt)) {
fileIcon = '📄';
}
listItem.innerHTML = `
<div class="file-info">
<div class="file-header">
<span class="file-name">${displayName}</span>
<span class="file-size">${fileSize}</span>
</div>
</div>
<button class="delete-file" title="Delete file" data-filename="${storedFileName}" data-display-name="${displayName}">🗑️</button>
`;
fragment.appendChild(listItem);
});
fileList.appendChild(fragment);
};
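A minimal usage sketch of the renderer above; the uid and file entries are hypothetical, but the field names (original_name, stored_name, size) match what the function reads and what dashboard.js passes in from /user-files:

// Sketch: render two files into #file-list
window.displayUserFiles('someone@example.net', [
  { original_name: 'morning-notes.opus', stored_name: '42_9f8b1c2d3e4f.opus', size: 2 * 1024 * 1024 },
  { original_name: 'intro.opus', stored_name: '43_0a1b2c3d4e5f.opus', size: 512 * 1024 }
]);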
// Function to handle file deletion
async function deleteFile(uid, fileName, listItem, displayName = '') {
const fileToDelete = displayName || fileName;
if (!confirm(`Are you sure you want to delete "${fileToDelete}"?`)) {
return;
}
// Show loading state
if (listItem) {
listItem.style.opacity = '0.6';
listItem.style.pointerEvents = 'none';
const deleteButton = listItem.querySelector('.delete-file');
if (deleteButton) {
deleteButton.disabled = true;
deleteButton.textContent = '⏳';
}
}
try {
if (!uid) {
throw new Error('User not authenticated. Please log in again.');
}
// Debug messages disabled
const authToken = localStorage.getItem('authToken');
const headers = { 'Content-Type': 'application/json' };
if (authToken) {
headers['Authorization'] = `Bearer ${authToken}`;
}
// Get the email from localStorage (it's the UID)
const email = localStorage.getItem('uid');
if (!email) {
throw new Error('User not authenticated');
}
// The backend expects the full email as the UID in the path
// We need to ensure it's properly encoded for the URL
const username = email;
// Debug messages disabled
// Check if the filename is just a UUID (without log ID prefix)
const uuidPattern = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}\.\w+$/i;
let fileNameToDelete = fileName;
// If the filename is just a UUID, try to find the actual file with log ID prefix
if (uuidPattern.test(fileName)) {
// Debug messages disabled
try {
// First try to get the list of files to find the one with the matching UUID
const filesResponse = await fetch(`/user-files/${uid}`, {
method: 'GET',
headers: headers,
credentials: 'include'
});
if (filesResponse.ok) {
const filesData = await filesResponse.json();
if (filesData.files && Array.isArray(filesData.files)) {
// Look for a file that contains our UUID in its name
const matchingFile = filesData.files.find(f =>
f.stored_name && f.stored_name.includes(fileName)
);
if (matchingFile && matchingFile.stored_name) {
// Debug messages disabled
fileNameToDelete = matchingFile.stored_name;
}
}
}
} catch (e) {
// Debug messages disabled
// Continue with the original filename if there's an error
}
}
// Use the username in the URL with the correct filename
// Debug messages disabled
const response = await fetch(`/uploads/${username}/${encodeURIComponent(fileNameToDelete)}`, {
method: 'DELETE',
headers: headers,
credentials: 'include'
});
if (!response.ok) {
const errorData = await response.json().catch(() => ({}));
throw new Error(errorData.detail || `HTTP error! status: ${response.status}`);
}
// Remove the file from the UI immediately
if (listItem && listItem.parentNode) {
listItem.parentNode.removeChild(listItem);
}
// Show success message
window.showToast(`Successfully deleted "${fileToDelete}"`, 'success');
// If the file list is now empty, show a message
const fileList = document.getElementById('file-list');
if (fileList && fileList.children.length === 0) {
fileList.innerHTML = '<li class="no-files">No files uploaded yet.</li>';
}
// Refresh the file list and stream
const uid_current = localStorage.getItem('uid');
if (window.fetchAndDisplayFiles) {
// Use email-based UID for file operations if available, fallback to uid_current
const fileOperationUid = localStorage.getItem('uid') || uid_current; // uid is now email-based
// Debug messages disabled
await window.fetchAndDisplayFiles(fileOperationUid);
}
if (window.loadProfileStream) {
await window.loadProfileStream(uid_current);
}
} catch (error) {
// Debug messages disabled
window.showToast(`Error deleting "${fileToDelete}": ${error.message}`, 'error');
// Reset the button state if there was an error
if (listItem) {
listItem.style.opacity = '';
listItem.style.pointerEvents = '';
const deleteButton = listItem.querySelector('.delete-file');
if (deleteButton) {
deleteButton.disabled = false;
deleteButton.textContent = '🗑️';
}
}
}
}
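The fallback above hinges on the uuidPattern test; a small sketch with hypothetical filenames shows which inputs trigger the extra /user-files lookup:

const uuidPattern = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}\.\w+$/i;
uuidPattern.test('123e4567-e89b-12d3-a456-426614174000.opus');    // true: bare UUID, so the stored name is looked up first
uuidPattern.test('42_123e4567-e89b-12d3-a456-426614174000.opus'); // false: already prefixed, deleted as-is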
// Add event delegation for delete buttons
document.addEventListener('DOMContentLoaded', () => {
const fileList = document.getElementById('file-list');
if (fileList) {
fileList.addEventListener('click', (e) => {
const deleteButton = e.target.closest('.delete-file');
if (deleteButton) {
e.preventDefault();
e.stopPropagation();
const listItem = deleteButton.closest('li');
if (!listItem) return;
const uid = localStorage.getItem('uid');
if (!uid) {
window.showToast('You need to be logged in to delete files', 'error');
// Debug messages disabled
return;
}
const fileName = deleteButton.getAttribute('data-filename');
const displayName = deleteButton.getAttribute('data-display-name') || fileName;
deleteFile(uid, fileName, listItem, displayName);
}
});
}
});

14
static/footer.html Normal file
View File

@ -0,0 +1,14 @@
<!-- Footer content -->
<footer>
<p>Built for public voice streaming • Opus | Mono | 48kHz | 60kbps</p>
<p class="footer-hint">Need more space? Contact <a href="mailto:Andreas.Fleckl@dicta2stream.net">Andreas.Fleckl@dicta2stream.net</a></p>
<div class="footer-links">
<a href="#" data-target="terms-page">Terms</a>
<span class="separator"></span>
<a href="#" data-target="privacy-page">Privacy</a>
<span class="separator"></span>
<a href="#" data-target="imprint-page">Imprint</a>
<span class="separator auth-only" style="display: none;"></span>
<a href="#" data-target="your-stream" class="auth-only" style="display: none;">Your Stream</a>
</div>
</footer>

View File

@ -0,0 +1,126 @@
/**
* Global Audio Manager
* Coordinates audio playback between different components to ensure only one audio plays at a time
*/
class GlobalAudioManager {
constructor() {
this.currentPlayer = null; // 'streams' or 'personal' or null
this.currentUid = null;
this.listeners = new Set();
// Bind methods
this.startPlayback = this.startPlayback.bind(this);
this.stopPlayback = this.stopPlayback.bind(this);
this.addListener = this.addListener.bind(this);
this.removeListener = this.removeListener.bind(this);
}
/**
* Register a player that wants to start playback
* @param {string} playerType - 'streams' or 'personal'
* @param {string} uid - The UID being played
* @param {Object} playerInstance - Reference to the player instance
*/
startPlayback(playerType, uid, playerInstance = null) {
// Debug messages disabled
// If the same player is already playing the same UID, allow it
if (this.currentPlayer === playerType && this.currentUid === uid) {
return true;
}
// Stop any currently playing audio
if (this.currentPlayer && this.currentPlayer !== playerType) {
this.notifyStop(this.currentPlayer);
}
// Update current state
this.currentPlayer = playerType;
this.currentUid = uid;
// Debug messages disabled
return true;
}
/**
* Notify that playback has stopped
* @param {string} playerType - 'streams' or 'personal'
*/
stopPlayback(playerType) {
if (this.currentPlayer === playerType) {
// Debug messages disabled
this.currentPlayer = null;
this.currentUid = null;
}
}
/**
* Get current playback state
*/
getCurrentState() {
return {
player: this.currentPlayer,
uid: this.currentUid
};
}
/**
* Check if a specific player is currently active
*/
isPlayerActive(playerType) {
return this.currentPlayer === playerType;
}
/**
* Add a listener for stop events
* @param {string} playerType - 'streams' or 'personal'
* @param {Function} callback - Function to call when this player should stop
*/
addListener(playerType, callback) {
const listener = { playerType, callback };
this.listeners.add(listener);
return listener;
}
/**
* Remove a listener
*/
removeListener(listener) {
this.listeners.delete(listener);
}
/**
* Notify a specific player type to stop
*/
notifyStop(playerType) {
// Debug messages disabled
this.listeners.forEach(listener => {
if (listener.playerType === playerType) {
try {
listener.callback();
} catch (error) {
console.error(`Error calling stop callback for ${playerType}:`, error);
}
}
});
}
/**
* Force stop all playback
*/
stopAll() {
if (this.currentPlayer) {
this.notifyStop(this.currentPlayer);
this.currentPlayer = null;
this.currentUid = null;
}
}
}
// Create singleton instance
export const globalAudioManager = new GlobalAudioManager();
// Make it available globally for debugging
if (typeof window !== 'undefined') {
window.globalAudioManager = globalAudioManager;
}
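A usage sketch of the manager; the import path is a guess because the file header is truncated in this hunk, and the pause/play bodies are placeholders for whatever streams-ui.js and app.js actually wire in:

import { globalAudioManager } from '/static/global-audio-manager.js'; // path assumed

// Be told when the other player takes over
const myListener = globalAudioManager.addListener('personal', () => {
  document.getElementById('me-audio')?.pause(); // placeholder stop handler
});

// Announce playback so any other active player is asked to stop first
if (globalAudioManager.startPlayback('personal', 'someone@example.net')) {
  document.getElementById('me-audio')?.play();
}

// Release the slot when done (or call removeListener(myListener) on teardown)
globalAudioManager.stopPlayback('personal');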

View File

@ -3,24 +3,29 @@
<html lang="en">
<head>
<link rel="stylesheet" href="/static/style.css" media="all" />
<link rel="stylesheet" href="/static/desktop.css" media="(min-width: 960px)">
<link rel="stylesheet" href="/static/mobile.css" media="(max-width: 959px)">
<link rel="icon" href="data:image/svg+xml,<svg xmlns=%22http://www.w3.org/2000/svg%22 viewBox=%220 0 100 100%22><text y=%22.9em%22 font-size=%2290%22>🎙️</text></svg>">
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="description" content="dicta2stream is a minimalist voice streaming platform for looping your spoken audio anonymously." />
<title>dicta2stream</title>
<!-- Section visibility and navigation styles -->
<link rel="stylesheet" href="/static/css/section.css" media="all" />
<style>
/* Hide mobile menu by default on larger screens */
@media (min-width: 960px) {
#mobile-menu { display: none !important; }
#burger-label { display: none !important; }
}
</style>
<link rel="modulepreload" href="/static/sound.js" />
<script src="/static/file-display.js?v=3"></script>
<script type="module" src="/static/dashboard.js?v=7"></script>
<script src="/static/streams-ui.js?v=3" type="module"></script>
<script src="/static/auth.js?v=5" type="module"></script>
<script src="/static/app.js?v=6" type="module"></script>
</head>
<body>
<header>
@ -31,43 +36,76 @@
<main>
<!-- Guest Dashboard -->
<nav id="guest-dashboard" class="dashboard-nav guest-only">
<a href="#welcome-page" id="guest-welcome">Welcome</a>
<a href="#stream-page" id="guest-streams">Streams</a>
<a href="#register-page" id="guest-login">Account</a>
</nav>
<!-- User Dashboard -->
<nav id="user-dashboard" class="dashboard-nav auth-only">
<a href="#welcome-page" id="user-welcome">Welcome</a>
<a href="#stream-page" id="user-streams">Streams</a>
<a href="#me-page" id="show-me">Your Stream</a>
</nav>
<section id="me-page" class="auth-only">
<div>
<h2 id="your-stream-heading">Your Stream</h2>
</div>
<article>
<p>This is your personal stream. Only you can upload to it.</p>
<audio id="me-audio"></audio>
<div class="audio-controls">
<button class="play-pause-btn" type="button" aria-label="Play" data-uid="">▶️</button>
</div>
</article>
<section id="user-upload-area" class="auth-only">
<p>Drag & drop your audio file here<br>or click to browse</p>
<input type="file" id="fileInputUser" accept="audio/*" hidden />
</section>
<article id="log-out" class="auth-only article--bordered logout-section">
<button id="logout-button" class="button">🚪 Log Out</button>
</article>
<section id="uploaded-files" class="auth-only">
<h3>Uploaded Files</h3>
<ul id="file-list" class="file-list">
<li>Loading files...</li>
</ul>
<p class="quota-meter">Quota: <progress id="quota-bar" value="0" max="100"></progress> <span id="quota-text">0 MB</span></p>
</section>
<!-- Account Deletion Section -->
<section id="account-deletion" class="article--bordered auth-only">
<h3>Account Deletion</h3>
<p>This action is irreversible and will permanently remove:</p>
<ul>
<li>Your account information</li>
<li>All uploaded audio files</li>
</ul>
<div class="centered-container">
<button id="delete-account-from-privacy" class="button">
🗑️ Delete My Account
</button>
</div>
</section>
</section>
<div id="spinner" class="spinner"></div>
<!-- Burger menu and legacy links section removed for clarity -->
<section id="terms-page" class="always-visible">
<h2>Terms of Service</h2>
<article class="article--bordered">
<div class="alert alert-warning">
<strong>Beta Testing Notice:</strong> This service is currently in public beta. As such, you may encounter bugs or unexpected behavior.
Updates to the service may cause data loss. Please report any issues or suggestions to help us improve.
</div>
<p>By accessing or using dicta2stream.net (the "Service"), you agree to be bound by these Terms of Service ("Terms"). If you do not agree, do not use the Service.</p>
<ul>
<li>You must be at least 18 years old to register.</li>
<li>Each account must be unique and used by only one person.</li>
@ -76,38 +114,40 @@
<li>The associated email address will be banned from recreating an account.</li>
<li>Uploads are limited to 100 MB and must be voice only.</li>
<li>Music/singing will be rejected.</li>
<li>This is a beta service; data may be lost during updates or maintenance.</li>
<li>Please report any bugs or suggestions to help improve the service.</li>
</ul>
</article>
</section>
<section id="privacy-page" class="always-visible">
<div>
<h2>Privacy Policy</h2>
</div>
<article class="article--bordered">
<ul>
<li><strong>Users</strong>: Session uses both cookies and localStorage to store UID and authentication state.</li>
<li><strong>Guests</strong>: No cookies are set. No persistent identifiers are stored.</li>
<li>We log IP + UID only for abuse protection and quota enforcement.</li>
<li>Data is never sold.</li>
</ul>
</article>
</section>
<section id="imprint-page" class="always-visible">
<h2>Imprint</h2>
<article class="article--bordered">
<p><strong>Andreas Michael Fleckl</strong></p>
<p>Johnstrassse 7/6<br>1140 Vienna<br>Austria / Europe</p>
</article>
</section>
<section id="welcome-page" class="always-visible">
<h2>Welcome</h2>
<article class="article--bordered">
<p>dicta2stream is a minimalist voice streaming platform for your spoken audio anonymously under a nickname in a loop. <span class="text-muted">(Opus | Mono | 48kHz | 60kbps)</span><br><br>
<strong>What you can do here:</strong></p>
<ul>
<li>🎧 Listen to public voice streams from others, instantly</li>
@ -115,69 +155,52 @@
<li>🕵️ No sign-up required for listening</li>
<li>🔒 Optional registration for uploading and managing your own stream</li>
</ul>
<div class="email-section">
<a href="mailto:Andreas.Fleckl@dicta2stream.net" class="button">
Andreas.Fleckl@dicta2stream.net
</a>
</div>
</article>
</section>
<section id="stream-page" class="always-visible">
<h2>Public Streams</h2>
<!-- The list below is dynamically populated by streams-ui.js; shows 'Loading...' while fetching -->
<ul id="stream-list"><li>Loading...</li></ul>
</section>
<section id="register-page" class="guest-only">
<h2>Account</h2>
<article class="article--wide">
<form id="register-form">
<p><label>Email<br><input type="email" name="email" required /></label></p>
<p><label>Username<br><input type="text" name="user" required /></label></p>
<p class="bot-trap">
<label>Leave this empty:<br>
<input type="text" name="bot_trap" autocomplete="off" />
</label>
</p>
<p><button type="submit">Login / Create Account</button></p>
</form>
<p class="form-note">You'll receive a magic login link via email. No password required.</p>
</article>
</section>
</main>
<footer>
<p class="footer-links">
<a href="#terms-page" id="footer-terms">Terms</a> |
<a href="#privacy-page" id="footer-privacy">Privacy</a> |
<a href="#imprint-page" id="footer-imprint">Imprint</a>
</p>
</footer>
<!-- Load public streams UI logic -->
<script type="module" src="/static/streams-ui.js?v=3"></script>
<!-- Load upload functionality -->
<script type="module" src="/static/upload.js"></script>
<script type="module">
import { initMagicLogin } from '/static/magic-login.js';
const params = new URLSearchParams(window.location.search);
@ -189,5 +212,8 @@
}
}
</script>
<script type="module" src="/static/init-personal-stream.js"></script>
<script type="module" src="/static/personal-player.js"></script>
</body>
</html>

View File

@ -0,0 +1,38 @@
// Initialize the personal stream play button with the user's UID
document.addEventListener('DOMContentLoaded', () => {
// Function to update the play button with UID
function updatePersonalStreamPlayButton() {
const playButton = document.querySelector('#me-page .play-pause-btn');
const streamPlayer = document.querySelector('#me-page .stream-player');
if (!playButton || !streamPlayer) return;
// Get UID from localStorage or cookie
const uid = localStorage.getItem('uid') || getCookie('uid');
if (uid) {
// Show the player and set the UID if not already set
streamPlayer.style.display = 'block';
if (!playButton.dataset.uid) {
playButton.dataset.uid = uid;
}
} else {
// Hide the player for guests
streamPlayer.style.display = 'none';
}
}
// Helper function to get cookie value by name
function getCookie(name) {
const value = `; ${document.cookie}`;
const parts = value.split(`; ${name}=`);
if (parts.length === 2) return parts.pop().split(';').shift();
return null;
}
// Initial update
updatePersonalStreamPlayButton();
// Also update when auth state changes (e.g., after login)
document.addEventListener('authStateChanged', updatePersonalStreamPlayButton);
});
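The update above re-runs on an authStateChanged event; a one-line sketch of how a login flow could trigger it, assuming a plain Event with no payload is sufficient (nothing here reads event.detail):

// Hypothetical caller, e.g. after a successful magic-link login:
document.dispatchEvent(new Event('authStateChanged'));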

6
static/logger.js Normal file
View File

@ -0,0 +1,6 @@
export function logToServer(msg) {
const xhr = new XMLHttpRequest();
xhr.open("POST", "/log", true);
xhr.setRequestHeader("Content-Type", "application/json");
xhr.send(JSON.stringify({ msg }));
}
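A quick usage sketch; /log is the endpoint hard-coded above and the call is fire-and-forget, so there is nothing to await:

import { logToServer } from '/static/logger.js';
logToServer('personal-player: playback started'); // POSTs {"msg":"personal-player: playback started"} to /log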

View File

@ -1,63 +1,43 @@
/**
* Simplified Magic Login Module
*
* This file now uses the centralized AuthManager for authentication logic.
* The token-based magic login is handled by the AuthManager.
*/
import authManager from './auth-manager.js';
import { showSection } from './nav.js';
let magicLoginSubmitted = false;
/**
* Initialize magic login - now delegated to AuthManager
* This function is kept for backward compatibility but the actual
* magic login logic is handled by the AuthManager during initialization.
*/
export async function initMagicLogin() {
// Debug messages disabled
// The AuthManager handles both URL-based and token-based magic login
// during its initialization, so we just need to ensure it's initialized
if (!window.authManager) {
// Debug messages disabled
await authManager.initialize();
}
// Check if there was a magic login processed
const params = new URLSearchParams(location.search);
const token = params.get('token');
if (token) {
// Debug messages disabled
} else {
// Debug messages disabled
}
}
// Export for backward compatibility
export { magicLoginSubmitted };
// Make showSection available globally for AuthManager
window.showSection = showSection;

522
static/mobile.css Normal file
View File

@ -0,0 +1,522 @@
/* Mobile-specific styles for screens up to 959px */
@media (max-width: 959px) {
/* Base layout adjustments */
html {
height: 100%;
min-height: 100%;
margin: 0;
padding: 0;
}
body {
min-height: 100vh;
margin: 0;
padding: 0;
display: flex;
flex-direction: column;
font-size: 16px;
overflow-x: hidden;
width: 100%;
max-width: 100%;
background: transparent !important;
}
main {
padding: 0.5rem 1rem;
margin: 0;
width: 100%;
max-width: 100%;
box-sizing: border-box;
box-shadow: none;
border: none;
background: none;
}
* {
box-sizing: border-box;
}
/* Mobile navigation - Enhanced with more specific selectors */
/* Show user dashboard only when authenticated */
body.authenticated #user-dashboard.dashboard-nav,
html body.authenticated #user-dashboard.dashboard-nav,
body.authenticated #user-dashboard.dashboard-nav:not(.hidden) {
display: flex !important;
visibility: visible !important;
opacity: 1 !important;
height: auto !important;
position: relative !important;
clip: auto !important;
}
/* Hide guest dashboard when authenticated - with more specific selectors */
body.authenticated #guest-dashboard.dashboard-nav,
html body.authenticated #guest-dashboard.dashboard-nav,
body.authenticated #guest-dashboard.dashboard-nav:not(.visible) {
display: none !important;
visibility: hidden !important;
opacity: 0 !important;
height: 0 !important;
width: 0 !important;
padding: 0 !important;
margin: 0 !important;
border: none !important;
position: absolute !important;
overflow: hidden !important;
clip: rect(0, 0, 0, 0) !important;
}
/* Show guest dashboard when not authenticated - with more specific selectors */
body:not(.authenticated) #guest-dashboard.dashboard-nav,
html body:not(.authenticated) #guest-dashboard.dashboard-nav,
body:not(.authenticated) #guest-dashboard.dashboard-nav:not(.hidden) {
display: flex !important;
visibility: visible !important;
opacity: 1 !important;
height: auto !important;
position: relative !important;
}
/* Ensure user dashboard is hidden when not authenticated */
body:not(.authenticated) #user-dashboard.dashboard-nav {
display: none !important;
visibility: hidden !important;
opacity: 0 !important;
height: 0 !important;
}
.dashboard-nav {
display: flex;
justify-content: space-around;
padding: 0.5rem 0;
background: var(--surface);
border-bottom: 1px solid var(--border);
position: sticky;
top: 0;
z-index: 100;
margin-bottom: 1rem;
}
.dashboard-nav a {
padding: 0.5rem 0.25rem;
text-align: center;
font-size: 0.9rem;
color: var(--text-color);
text-decoration: none;
flex: 1;
border-radius: 4px;
transition: background-color 0.2s ease;
}
.dashboard-nav a:hover,
.dashboard-nav a:focus {
background-color: var(--hover-bg);
outline: none;
}
/* Account Deletion Section */
#privacy-page.active #account-deletion,
#privacy-page:not(.active) #account-deletion {
display: block !important;
opacity: 1 !important;
position: relative !important;
clip: auto !important;
width: auto !important;
height: auto !important;
margin: 0 !important;
padding: 0 !important;
overflow: visible !important;
}
.account-deletion-section {
margin: 2rem 0;
padding: 1.75rem;
background: rgba(26, 26, 26, 0.8);
border-radius: 16px;
border: 1px solid rgba(255, 255, 255, 0.08);
box-shadow: 0 4px 20px rgba(0, 0, 0, 0.2);
backdrop-filter: blur(10px);
}
.account-deletion-section h3 {
color: #fff;
font-size: 1.5rem;
margin-bottom: 1.25rem;
padding-bottom: 0.75rem;
border-bottom: 1px solid rgba(255, 255, 255, 0.1);
}
.account-deletion-section h3 {
color: #fff;
margin-bottom: 1rem;
font-size: 1.4rem;
}
.account-deletion-section ul {
margin: 1.5rem 0 2rem 1.5rem;
padding-left: 0.5rem;
}
.account-deletion-section li {
margin-bottom: 0.75rem;
color: #f0f0f0;
line-height: 1.5;
position: relative;
padding-left: 1.5rem;
}
.account-deletion-section li:before {
content: '•';
color: #ff5e57;
font-weight: bold;
font-size: 1.5rem;
position: absolute;
left: 0;
top: -0.25rem;
}
.danger-button {
background: linear-gradient(135deg, #ff3b30, #ff5e57);
color: white;
border: none;
padding: 1rem 1.5rem;
border-radius: 8px;
font-weight: 600;
cursor: pointer;
width: 100%;
max-width: 300px;
transition: all 0.2s ease;
box-shadow: 0 2px 8px rgba(255, 59, 48, 0.3);
text-align: center;
}
.danger-button:hover {
transform: translateY(-2px);
box-shadow: 0 4px 12px rgba(255, 59, 48, 0.4);
}
.danger-button:active {
transform: translateY(0);
}
.text-link {
color: #4dabf7;
text-decoration: none;
transition: color 0.2s ease;
}
.text-link:hover {
color: #74c0fc;
text-decoration: underline;
}
/* Hide desktop navigation in mobile */
nav.dashboard-nav {
display: none;
}
header {
padding: 0.5rem 1rem;
}
header h1 {
font-size: 1.8rem;
margin: 0.5rem 0;
}
header p {
font-size: 1rem;
margin: 0.25rem 0 1rem;
}
.dashboard-nav {
width: 100%;
padding: 0.5rem;
box-sizing: border-box;
text-align: center;
font-size: 0.9rem;
}
.dashboard-nav a {
padding: 0.5rem;
margin: 0 0.25rem;
display: inline-block;
}
main > section {
width: 100%;
max-width: 100%;
padding: 1rem;
box-sizing: border-box;
}
.btn {
width: 100%;
height: 48px;
padding: 0.75rem 1rem;
margin: 0.5rem 0;
font-size: 1rem;
box-sizing: border-box;
}
.audio-player {
width: 100%;
margin: 1rem 0;
}
.audio-controls {
flex-direction: column;
}
.audio-controls button {
margin: 0.25rem 0;
}
.dropzone {
padding: 2rem;
}
#quota-meter {
max-width: 600px;
width: 100%;
margin: 1rem auto;
padding: 0 1rem;
box-sizing: border-box;
}
.quota-meter {
height: 20px;
}
/* Stream item styles moved to .stream-player */
.stream-item {
padding: 0;
margin: 0;
border: none;
}
.modal-content {
width: 90%;
max-width: 90%;
}
footer {
padding: 1rem;
}
.footer-hint {
font-size: 0.9rem;
}
.desktop-only {
display: none !important;
}
#burger-label {
display: block;
}
section#links {
display: none;
position: absolute;
top: 100%;
left: 0;
right: 0;
background: #1e1e1e;
z-index: 1000;
}
#burger-toggle:checked + #burger-label + section#links {
display: block;
}
/* Make sure all interactive elements are touch-friendly */
a, [role="button"], label, select, textarea {
min-height: 44px;
min-width: 44px;
}
.dropzone {
padding: 1.5rem;
margin: 1rem 0;
}
.dropzone p {
font-size: 1rem;
margin: 0.5rem 0;
}
/* Adjust header text for better mobile display */
header h1 {
font-size: 1.5rem;
}
header p {
font-size: 0.9rem;
}
.dashboard-nav {
overflow-x: auto;
white-space: nowrap;
-webkit-overflow-scrolling: touch;
padding: 0.5rem 0;
}
.dashboard-nav::-webkit-scrollbar {
display: none;
}
nav.dashboard-nav a {
all: unset;
display: inline-block;
background-color: var(--surface);
color: var(--text-color);
padding: 0.5rem 1rem;
margin: 0 0.25rem;
border-radius: 4px;
font-size: 0.9rem;
cursor: pointer;
text-align: center;
min-width: 100px;
box-sizing: border-box;
transition: background-color 0.2s;
}
.dashboard-nav a:active {
background-color: var(--border);
}
/* Stream page specific styles */
#stream-page {
padding: 0.5rem;
}
#stream-page h2 {
font-size: 1.5rem;
}
#stream-page article {
padding: 1rem;
margin: 0.5rem 0;
}
#stream-list {
padding: 0 1rem;
margin: 0 auto;
max-width: 600px;
width: 100%;
box-sizing: border-box;
}
#stream-list li {
margin: 0;
padding: 0;
border: none;
background: transparent;
list-style: none;
}
.stream-player {
padding: 0.75rem;
}
.stream-player h3 {
font-size: 1.1rem;
}
.stream-info {
font-size: 0.9rem;
}
/* Stream list items are now handled by the rules above */
/* User upload area - matches article styling */
#user-upload-area {
margin: 2rem auto;
padding: 1.6875rem;
background: var(--surface);
border: 1px solid var(--border-color, #2a2a2a);
border-radius: 8px;
box-shadow: 0 4px 12px rgba(0, 0, 0, 0.1);
text-align: center;
cursor: pointer;
max-width: 600px;
width: 100%;
box-sizing: border-box;
color: var(--text-color);
}
#user-upload-area p {
margin: 0.5rem 0;
}
/* Stream player adjustments */
.stream-player {
padding: 1rem;
margin: 0.5rem 0;
border: 1px solid #444;
border-radius: 8px;
background-color: #1e1e1e;
}
.stream-player h3 {
margin: 0 0 0.5rem 0;
font-size: 1.2rem;
}
.stream-info {
font-size: 0.9rem;
color: var(--text-muted);
margin-bottom: 0.5rem;
}
.stream-audio {
width: 100%;
}
/* Form elements */
input[type="text"],
input[type="email"],
input[type="password"],
textarea {
width: 100%;
max-width: 100%;
box-sizing: border-box;
-moz-box-sizing: border-box;
-webkit-box-sizing: border-box;
padding: 0.75rem;
margin: 0.5rem 0;
font-size: 1rem;
border-radius: 4px;
border: 1px solid #444;
background-color: #2a2a2a;
color: #f0f0f0;
}
/* Firefox mobile specific fixes */
@-moz-document url-prefix() {
input[type="email"] {
min-height: 2.5rem;
appearance: none;
}
}
/* Adjust audio element for mobile */
audio {
width: 100% !important;
max-width: 100% !important;
}
/* Toast notifications */
.toast {
width: 90%;
max-width: 100%;
left: 5%;
right: 5%;
transform: none;
margin: 0 auto;
}
}

View File

@@ -7,194 +7,97 @@ function getCookie(name) {
   return null;
 }
-document.addEventListener("DOMContentLoaded", () => {
-  const Router = {
-    sections: Array.from(document.querySelectorAll("main > section")),
-    showOnly(id) {
-      this.sections.forEach(sec => {
-        sec.hidden = sec.id !== id;
-        sec.tabIndex = -1;
-      });
-      // Show user-upload-area only when me-page is shown and user is logged in
-      const userUpload = document.getElementById("user-upload-area");
-      if (userUpload) {
-        const uid = getCookie("uid");
-        userUpload.style.display = (id === "me-page" && uid) ? '' : 'none';
-      }
-      localStorage.setItem("last_page", id);
-      const target = document.getElementById(id);
-      if (target) target.focus();
-    },
-    init() {
-      initNavLinks();
-      initBackButtons();
-      initStreamLinks();
-    }
-  };
-  const showOnly = Router.showOnly.bind(Router);
-  // Highlight active profile link on browser back/forward navigation
-  function highlightActiveProfileLink() {
-    const params = new URLSearchParams(window.location.search);
-    const profileUid = params.get('profile');
-    const ul = document.getElementById('stream-list');
-    if (!ul) return;
-    ul.querySelectorAll('a.profile-link').forEach(link => {
-      const url = new URL(link.href, window.location.origin);
-      const uidParam = url.searchParams.get('profile');
-      link.classList.toggle('active', uidParam === profileUid);
-    });
-  }
-  window.addEventListener('popstate', () => {
-    const params = new URLSearchParams(window.location.search);
-    const profileUid = params.get('profile');
-    if (profileUid) {
-      showOnly('me-page');
-      if (typeof window.showProfilePlayerFromUrl === 'function') {
-        window.showProfilePlayerFromUrl();
-      }
-    } else {
-      highlightActiveProfileLink();
-    }
-  });
-  /* restore last page (unless magiclink token present) */
-  const params = new URLSearchParams(location.search);
-  const token = params.get("token");
-  if (!token) {
-    const last = localStorage.getItem("last_page");
-    if (last && document.getElementById(last)) {
-      showOnly(last);
-    } else if (document.getElementById("welcome-page")) {
-      // Show Welcome page by default for all new/guest users
-      showOnly("welcome-page");
-    }
-    // Highlight active link on initial load
-    highlightActiveProfileLink();
-  }
-  /* token → show magiclogin page */
-  if (token) {
-    document.getElementById("magic-token").value = token;
-    showOnly("magic-login-page");
-    const err = params.get("error");
-    if (err) {
-      const box = document.getElementById("magic-error");
-      box.textContent = decodeURIComponent(err);
-      box.style.display = "block";
-    }
-  }
-  function renderStreamList(streams) {
-    const ul = document.getElementById("stream-list");
-    if (!ul) return;
-    if (streams.length) {
-      streams.sort();
-      ul.innerHTML = streams.map(uid => `
-        <li><a href="/?profile=${encodeURIComponent(uid)}" class="profile-link">▶ ${uid}</a></li>
-      `).join("");
-    } else {
-      ul.innerHTML = "<li>No active streams.</li>";
-    }
-    // Ensure correct link is active after rendering
-    highlightActiveProfileLink();
-  }
-  // Initialize navigation listeners
-  function initNavLinks() {
-    const navIds = ["links", "user-dashboard", "guest-dashboard"];
-    navIds.forEach(id => {
-      const nav = document.getElementById(id);
-      if (!nav) return;
-      nav.addEventListener("click", e => {
-        const a = e.target.closest("a[data-target]");
-        if (!a || !nav.contains(a)) return;
-        e.preventDefault();
-        // Save audio state before navigation
-        const audio = document.getElementById('me-audio');
-        const wasPlaying = audio && !audio.paused;
-        const currentTime = audio ? audio.currentTime : 0;
-        const target = a.dataset.target;
-        if (target) showOnly(target);
-        // Handle stream page specifically
-        if (target === "stream-page" && typeof window.maybeLoadStreamsOnShow === "function") {
-          window.maybeLoadStreamsOnShow();
-        }
-        // Handle me-page specifically
-        else if (target === "me-page" && audio) {
-          // Restore audio state if it was playing
-          if (wasPlaying) {
-            audio.currentTime = currentTime;
-            audio.play().catch(e => console.error('Play failed:', e));
-          }
-        }
-      });
-    });
-    // Add click handlers for footer links with audio state saving
-    document.querySelectorAll(".footer-links a").forEach(link => {
-      link.addEventListener("click", (e) => {
-        e.preventDefault();
-        const target = link.dataset.target;
-        if (!target) return;
-        // Save audio state before navigation
-        const audio = document.getElementById('me-audio');
-        const wasPlaying = audio && !audio.paused;
-        const currentTime = audio ? audio.currentTime : 0;
-        showOnly(target);
-        // Handle me-page specifically
-        if (target === "me-page" && audio) {
-          // Restore audio state if it was playing
-          if (wasPlaying) {
-            audio.currentTime = currentTime;
-            audio.play().catch(e => console.error('Play failed:', e));
-          }
-        }
-      });
-    });
-  }
-  function initBackButtons() {
-    document.querySelectorAll('a[data-back]').forEach(btn => {
-      btn.addEventListener("click", e => {
-        e.preventDefault();
-        const target = btn.dataset.back;
-        if (target) showOnly(target);
-        // Ensure streams load instantly when stream-page is shown
-        if (target === "stream-page" && typeof window.maybeLoadStreamsOnShow === "function") {
-          window.maybeLoadStreamsOnShow();
-        }
-      });
-    });
-  }
-  function initStreamLinks() {
-    const ul = document.getElementById("stream-list");
-    if (!ul) return;
-    ul.addEventListener("click", e => {
-      const a = e.target.closest("a.profile-link");
-      if (!a || !ul.contains(a)) return;
-      e.preventDefault();
-      const url = new URL(a.href, window.location.origin);
-      const profileUid = url.searchParams.get("profile");
-      if (profileUid && window.location.search !== `?profile=${encodeURIComponent(profileUid)}`) {
-        window.profileNavigationTriggered = true;
-        window.history.pushState({}, '', `/?profile=${encodeURIComponent(profileUid)}`);
-        window.dispatchEvent(new Event("popstate"));
-      }
-    });
-  }
-  // Initialize Router
-  Router.init();
+// Determines the correct section to show based on auth status and requested section
+function getValidSection(sectionId) {
+  const isLoggedIn = !!getCookie('uid');
+  const protectedSections = ['me-page', 'account-page'];
+  const guestOnlySections = ['login-page', 'register-page', 'magic-login-page'];
+  if (isLoggedIn) {
+    // If logged in, guest-only sections are invalid, redirect to 'me-page'
+    if (guestOnlySections.includes(sectionId)) {
+      return 'me-page';
+    }
+  } else {
+    // If not logged in, protected sections are invalid, redirect to 'welcome-page'
+    if (protectedSections.includes(sectionId)) {
+      return 'welcome-page';
+    }
+  }
+  // If the section doesn't exist in the DOM, default to welcome page
+  if (!document.getElementById(sectionId)) {
+    return 'welcome-page';
+  }
+  return sectionId;
+}
+// Main function to show/hide sections
+export function showSection(sectionId) {
+  const mainSections = Array.from(document.querySelectorAll('main > section'));
+  // Update body class for page-specific CSS
+  document.body.className = document.body.className.replace(/page-\S+/g, '');
+  document.body.classList.add(`page-${sectionId || 'welcome-page'}`);
+  // Update active state of navigation links
+  document.querySelectorAll('.dashboard-nav a').forEach(link => {
+    link.classList.remove('active');
+    if (link.getAttribute('href') === `#${sectionId}`) {
+      link.classList.add('active');
+    }
+  });
+  mainSections.forEach(section => {
+    section.hidden = section.id !== sectionId;
+  });
+  // Update URL hash without causing a page scroll, this is for direct calls to showSection
+  // Normal navigation is handled by the hashchange listener
+  const currentHash = `#${sectionId}`;
+  if (window.location.hash !== currentHash) {
+    if (history.pushState) {
+      if (sectionId && sectionId !== 'welcome-page') {
+        history.pushState(null, null, currentHash);
+      } else {
+        history.pushState(null, null, window.location.pathname + window.location.search);
+      }
+    }
+  }
+}
+document.addEventListener("DOMContentLoaded", () => {
+  const isLoggedIn = !!getCookie('uid');
+  document.body.classList.toggle('authenticated', isLoggedIn);
+  // Unified click handler for SPA navigation
+  document.body.addEventListener('click', (e) => {
+    const link = e.target.closest('a[href^="#"]');
+    // Ensure the link is not inside a component that handles its own navigation
+    if (!link || link.closest('.no-global-nav')) return;
+    e.preventDefault();
+    const newHash = link.getAttribute('href');
+    if (window.location.hash !== newHash) {
+      window.location.hash = newHash;
+    }
+  });
+  // Main routing logic on hash change
+  const handleNavigation = () => {
+    const sectionId = window.location.hash.substring(1) || 'welcome-page';
+    const validSectionId = getValidSection(sectionId);
+    if (sectionId !== validSectionId) {
+      window.location.hash = validSectionId; // This will re-trigger handleNavigation
+    } else {
+      showSection(validSectionId);
+    }
+  };
+  window.addEventListener('hashchange', handleNavigation);
+  // Initial page load
+  handleNavigation();
 });
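To make the new flow concrete, here is a rough sketch (not part of the commit) of how getValidSection() and the hashchange handler above resolve a few navigations, assuming only the cookie check and section lists shown in the diff:

// Guest visitor (no 'uid' cookie):
//   getValidSection('me-page')     -> 'welcome-page'   (protected section is redirected)
//   getValidSection('login-page')  -> 'login-page'
// Logged-in visitor ('uid' cookie set):
//   getValidSection('login-page')  -> 'me-page'        (guest-only section is redirected)
//   getValidSection('stream-page') -> 'stream-page'    (kept, if #stream-page exists in the DOM)
// Any link such as <a href="#stream-page"> is intercepted by the body click handler,
// which only sets the hash; handleNavigation() then runs via the hashchange listener.
window.location.hash = '#stream-page';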

static/personal-player.js Normal file

@@ -0,0 +1,85 @@
import { showToast } from "./toast.js";
import { SharedAudioPlayer } from './shared-audio-player.js';
function getPersonalStreamUrl(uid) {
return `/audio/${encodeURIComponent(uid)}/stream.opus`;
}
function updatePlayPauseButton(button, isPlaying) {
if (button) button.textContent = isPlaying ? '⏸️' : '▶️';
// Optionally, update other UI elements here
}
const personalPlayer = new SharedAudioPlayer({
playerType: 'personal',
getStreamUrl: getPersonalStreamUrl,
onUpdateButton: updatePlayPauseButton
});
/**
* Finds or creates the audio element for the personal stream.
* @returns {HTMLAudioElement | null}
*/
function cleanupPersonalAudio() {
if (audioElement) {
try {
if (audioElement._eventHandlers) {
const { onPlay, onPause, onEnded, onError } = audioElement._eventHandlers;
if (onPlay) audioElement.removeEventListener('play', onPlay);
if (onPause) audioElement.removeEventListener('pause', onPause);
if (onEnded) audioElement.removeEventListener('ended', onEnded);
if (onError) audioElement.removeEventListener('error', onError);
}
audioElement.pause();
audioElement.removeAttribute('src');
audioElement.load();
if (audioElement._eventHandlers) delete audioElement._eventHandlers;
// Remove from DOM
if (audioElement.parentNode) audioElement.parentNode.removeChild(audioElement);
} catch (e) {
console.warn('[personal-player.js] Error cleaning up audio element:', e);
}
audioElement = null;
}
}
// Use the shared player for loading and playing the personal stream
export function loadProfileStream(uid, playPauseBtn) {
if (!uid) {
showToast('No UID provided for profile stream', 'error');
return;
}
personalPlayer.play(uid, playPauseBtn);
}
/**
* Initializes the personal audio player, setting up event listeners.
*/
export function initPersonalPlayer() {
const mePageSection = document.getElementById('me-page');
if (!mePageSection) return;
// Use a delegated event listener for the play button
mePageSection.addEventListener('click', (e) => {
const playPauseBtn = e.target.closest('.play-pause-btn');
if (!playPauseBtn) return;
e.stopPropagation();
const uid = localStorage.getItem('uid');
if (!uid) {
showToast('Please log in to play audio.', 'error');
return;
}
// Toggle play/pause
if (personalPlayer.audioElement && !personalPlayer.audioElement.paused && !personalPlayer.audioElement.ended) {
personalPlayer.pause();
} else {
loadProfileStream(uid, playPauseBtn);
}
});
// Make loadProfileStream globally accessible for upload.js
window.loadProfileStream = loadProfileStream;
}
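A minimal usage sketch for the new module (assuming it is loaded as an ES module next to the markup it expects, i.e. a #me-page section containing a .play-pause-btn):

import { initPersonalPlayer, loadProfileStream } from './personal-player.js';

document.addEventListener('DOMContentLoaded', () => {
  initPersonalPlayer(); // delegates clicks on .play-pause-btn inside #me-page
  const uid = localStorage.getItem('uid');
  const btn = document.querySelector('#me-page .play-pause-btn');
  if (uid && btn) {
    loadProfileStream(uid, btn); // plays /audio/<uid>/stream.opus through the shared player
  }
});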

@@ -0,0 +1,70 @@
/**
* Cleanup Script: Remove Redundant confirmed_uid from localStorage
*
* This script removes the redundant confirmed_uid field from localStorage
* for users who might have it stored from the old authentication system.
*/
(function() {
'use strict';
console.log('[CONFIRMED_UID_CLEANUP] Starting cleanup of redundant confirmed_uid field...');
// Check if confirmed_uid exists in localStorage
const confirmedUid = localStorage.getItem('confirmed_uid');
const currentUid = localStorage.getItem('uid');
if (confirmedUid) {
console.log(`[CONFIRMED_UID_CLEANUP] Found confirmed_uid: ${confirmedUid}`);
console.log(`[CONFIRMED_UID_CLEANUP] Current uid: ${currentUid}`);
// Verify that uid exists and is properly set
if (!currentUid) {
console.warn('[CONFIRMED_UID_CLEANUP] No uid found, setting uid from confirmed_uid');
localStorage.setItem('uid', confirmedUid);
} else if (currentUid !== confirmedUid) {
console.warn(`[CONFIRMED_UID_CLEANUP] UID mismatch - uid: ${currentUid}, confirmed_uid: ${confirmedUid}`);
console.log('[CONFIRMED_UID_CLEANUP] Keeping current uid value');
}
// Remove the redundant confirmed_uid
localStorage.removeItem('confirmed_uid');
console.log('[CONFIRMED_UID_CLEANUP] Removed redundant confirmed_uid from localStorage');
// Log the cleanup action
console.log('[CONFIRMED_UID_CLEANUP] Cleanup completed successfully');
} else {
console.log('[CONFIRMED_UID_CLEANUP] No confirmed_uid found, no cleanup needed');
}
// Also check for any other potential redundant fields
const redundantFields = [
'confirmed_uid', // Main target
'confirmedUid', // Camel case variant
'confirmed-uid' // Hyphenated variant
];
let removedCount = 0;
redundantFields.forEach(field => {
if (localStorage.getItem(field)) {
localStorage.removeItem(field);
removedCount++;
console.log(`[CONFIRMED_UID_CLEANUP] Removed redundant field: ${field}`);
}
});
if (removedCount > 0) {
console.log(`[CONFIRMED_UID_CLEANUP] Removed ${removedCount} redundant authentication fields`);
}
console.log('[CONFIRMED_UID_CLEANUP] Cleanup process completed');
})();
// Export for manual execution if needed
if (typeof window !== 'undefined') {
window.removeConfirmedUidCleanup = function() {
const script = document.createElement('script');
script.src = '/static/remove-confirmed-uid.js';
document.head.appendChild(script);
};
}
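The effect on localStorage can be illustrated with a small before/after sketch (the key values here are made up for illustration):

// before the script runs (hypothetical leftover state from the old auth system):
//   uid           -> 'oib@example.net'
//   confirmed_uid -> 'oib@example.net'
// after the script runs on page load:
console.log(localStorage.getItem('confirmed_uid')); // null, redundant key removed
console.log(localStorage.getItem('uid'));           // 'oib@example.net', kept (or backfilled from confirmed_uid)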

@@ -1,15 +0,0 @@
// static/router.js — core routing for SPA navigation
export const Router = {
sections: Array.from(document.querySelectorAll("main > section")),
showOnly(id) {
this.sections.forEach(sec => {
sec.hidden = sec.id !== id;
sec.tabIndex = -1;
});
localStorage.setItem("last_page", id);
const target = document.getElementById(id);
if (target) target.focus();
}
};
export const showOnly = Router.showOnly.bind(Router);

@@ -0,0 +1,162 @@
// shared-audio-player.js
// Unified audio player logic for both streams and personal player
import { globalAudioManager } from './global-audio-manager.js';
export class SharedAudioPlayer {
constructor({ playerType, getStreamUrl, onUpdateButton }) {
this.playerType = playerType; // 'streams' or 'personal'
this.getStreamUrl = getStreamUrl; // function(uid) => url
this.onUpdateButton = onUpdateButton; // function(button, isPlaying)
this.audioElement = null;
this.currentUid = null;
this.isPlaying = false;
this.currentButton = null;
this._eventHandlers = {};
// Register stop listener
globalAudioManager.addListener(playerType, () => {
this.stop();
});
}
pause() {
if (this.audioElement && !this.audioElement.paused && !this.audioElement.ended) {
this.audioElement.pause();
this.isPlaying = false;
if (this.onUpdateButton && this.currentButton) {
this.onUpdateButton(this.currentButton, false);
}
}
}
async play(uid, button) {
const ctx = `[SharedAudioPlayer][${this.playerType}]${uid ? `[${uid}]` : ''}`;
const isSameUid = this.currentUid === uid;
const isActive = this.audioElement && !this.audioElement.paused && !this.audioElement.ended;
// Guard: If already playing the requested UID and not paused/ended, do nothing
if (isSameUid && isActive) {
if (this.onUpdateButton) this.onUpdateButton(button || this.currentButton, true);
return;
}
// If same UID but paused, resume
if (isSameUid && this.audioElement && this.audioElement.paused && !this.audioElement.ended) {
try {
await this.audioElement.play();
this.isPlaying = true;
if (this.onUpdateButton) this.onUpdateButton(button || this.currentButton, true);
globalAudioManager.startPlayback(this.playerType, uid);
} catch (err) {
this.isPlaying = false;
if (this.onUpdateButton) this.onUpdateButton(button || this.currentButton, false);
console.error(`${ctx} play() resume failed:`, err);
}
return;
}
// Otherwise, stop current and start new
if (!isSameUid && this.audioElement) {
} else {
}
this.stop();
this.currentUid = uid;
this.currentButton = button;
const url = this.getStreamUrl(uid);
this.audioElement = new Audio(url);
this.audioElement.preload = 'auto';
this.audioElement.crossOrigin = 'anonymous';
this.audioElement.style.display = 'none';
document.body.appendChild(this.audioElement);
this._attachEventHandlers();
try {
await this.audioElement.play();
this.isPlaying = true;
if (this.onUpdateButton) this.onUpdateButton(button, true);
globalAudioManager.startPlayback(this.playerType, uid);
} catch (err) {
this.isPlaying = false;
if (this.onUpdateButton) this.onUpdateButton(button, false);
console.error(`${ctx} play() failed:`, err);
}
}
stop() {
if (this.audioElement) {
this._removeEventHandlers();
try {
this.audioElement.pause();
this.audioElement.removeAttribute('src');
this.audioElement.load();
if (this.audioElement.parentNode) {
this.audioElement.parentNode.removeChild(this.audioElement);
}
} catch (e) {
console.warn('[shared-audio-player] Error cleaning up audio element:', e);
}
this.audioElement = null;
}
this.isPlaying = false;
this.currentUid = null;
if (this.currentButton && this.onUpdateButton) {
this.onUpdateButton(this.currentButton, false);
}
this.currentButton = null;
}
_attachEventHandlers() {
if (!this.audioElement) return;
const ctx = `[SharedAudioPlayer][${this.playerType}]${this.currentUid ? `[${this.currentUid}]` : ''}`;
const logEvent = (event) => {
// Debug logging disabled
};
// Core handlers
const onPlay = (e) => {
logEvent(e);
this.isPlaying = true;
if (this.currentButton && this.onUpdateButton) this.onUpdateButton(this.currentButton, true);
};
const onPause = (e) => {
logEvent(e);
// console.trace(`${ctx} Audio pause stack trace:`);
this.isPlaying = false;
if (this.currentButton && this.onUpdateButton) this.onUpdateButton(this.currentButton, false);
};
const onEnded = (e) => {
logEvent(e);
this.isPlaying = false;
if (this.currentButton && this.onUpdateButton) this.onUpdateButton(this.currentButton, false);
};
const onError = (e) => {
logEvent(e);
this.isPlaying = false;
if (this.currentButton && this.onUpdateButton) this.onUpdateButton(this.currentButton, false);
console.error(`${ctx} Audio error:`, e);
};
// Attach handlers
this.audioElement.addEventListener('play', onPlay);
this.audioElement.addEventListener('pause', onPause);
this.audioElement.addEventListener('ended', onEnded);
this.audioElement.addEventListener('error', onError);
// Attach debug logging for all relevant events
const debugEvents = [
'abort','canplay','canplaythrough','durationchange','emptied','encrypted','loadeddata','loadedmetadata',
'loadstart','playing','progress','ratechange','seeked','seeking','stalled','suspend','timeupdate','volumechange','waiting'
];
debugEvents.forEach(evt => {
this.audioElement.addEventListener(evt, logEvent);
}); // Logging now disabled
this._eventHandlers = { onPlay, onPause, onEnded, onError, debugEvents, logEvent };
}
_removeEventHandlers() {
if (!this.audioElement || !this._eventHandlers) return;
const { onPlay, onPause, onEnded, onError } = this._eventHandlers;
if (onPlay) this.audioElement.removeEventListener('play', onPlay);
if (onPause) this.audioElement.removeEventListener('pause', onPause);
if (onEnded) this.audioElement.removeEventListener('ended', onEnded);
if (onError) this.audioElement.removeEventListener('error', onError);
this._eventHandlers = {};
}
}
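For reference, a usage sketch that mirrors how personal-player.js and streams-ui.js in this commit construct the shared player (the uid value is illustrative only):

import { SharedAudioPlayer } from './shared-audio-player.js';

const player = new SharedAudioPlayer({
  playerType: 'streams', // or 'personal'
  getStreamUrl: uid => `/audio/${encodeURIComponent(uid)}/stream.opus`,
  onUpdateButton: (btn, playing) => { if (btn) btn.textContent = playing ? '⏸️' : '▶️'; }
});

const btn = document.querySelector('.play-pause-btn');
player.play('oib@example.net', btn); // stops any previous element, creates a hidden <audio>, starts playback
player.pause();                      // keeps the element so a later play() with the same uid just resumes
player.stop();                       // removes the element and resets the button state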

@@ -1,6 +1,15 @@
 // sound.js — reusable Web Audio beep
 export function playBeep(frequency = 432, duration = 0.2, type = 'sine') {
+  try {
+    // Validate parameters to prevent audio errors
+    if (!Number.isFinite(frequency) || frequency <= 0) {
+      frequency = 432; // fallback to default
+    }
+    if (!Number.isFinite(duration) || duration <= 0) {
+      duration = 0.2; // fallback to default
+    }
     const ctx = new (window.AudioContext || window.webkitAudioContext)();
     const osc = ctx.createOscillator();
     const gain = ctx.createGain();
@@ -14,4 +23,8 @@ export function playBeep(frequency = 432, duration = 0.2, type = 'sine') {
     gain.gain.setValueAtTime(0.1, ctx.currentTime); // subtle volume
     osc.start();
     osc.stop(ctx.currentTime + duration);
+  } catch (error) {
+    // Silently handle audio errors to prevent breaking upload flow
+    console.warn('[SOUND] Audio beep failed:', error.message);
+  }
 }
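A short sketch of how the new guards behave (the frequencies and durations here are arbitrary):

import { playBeep } from './sound.js';

playBeep(800, 0.2);            // success-style beep
playBeep(200, 0.5);            // lower, longer error-style beep
playBeep(NaN, -1);             // invalid arguments fall back to 432 Hz / 0.2 s instead of throwing
playBeep(432, 0.25, 'square'); // third argument selects the oscillator type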

@@ -1,9 +1,22 @@
 // static/streams-ui.js — public streams loader and profile-link handling
import { showOnly } from './router.js';
import { globalAudioManager } from './global-audio-manager.js';
// Global variable to track if we should force refresh the stream list
let shouldForceRefresh = false;
// Function to refresh the stream list
window.refreshStreamList = function(force = true) {
shouldForceRefresh = force;
loadAndRenderStreams();
return new Promise((resolve) => {
// Resolve after a short delay to allow the stream list to update
setTimeout(resolve, 500);
});
};
// Removed loadingStreams and lastStreamsPageVisible guards for instant fetch // Removed loadingStreams and lastStreamsPageVisible guards for instant fetch
export function initStreamsUI() { export function initStreamsUI() {
initStreamLinks(); initStreamLinks();
window.addEventListener('popstate', () => { window.addEventListener('popstate', () => {
@ -12,6 +25,12 @@ export function initStreamsUI() {
}); });
document.addEventListener('visibilitychange', maybeLoadStreamsOnShow); document.addEventListener('visibilitychange', maybeLoadStreamsOnShow);
maybeLoadStreamsOnShow(); maybeLoadStreamsOnShow();
// Register with global audio manager to handle stop requests from other players
globalAudioManager.addListener('streams', () => {
// Debug messages disabled
stopPlayback();
});
} }
function maybeLoadStreamsOnShow() { function maybeLoadStreamsOnShow() {
@ -24,154 +43,354 @@ function maybeLoadStreamsOnShow() {
} }
window.maybeLoadStreamsOnShow = maybeLoadStreamsOnShow; window.maybeLoadStreamsOnShow = maybeLoadStreamsOnShow;
// Global variables for audio control
let currentlyPlayingAudio = null; let currentlyPlayingAudio = null;
let currentlyPlayingButton = null;
document.addEventListener('DOMContentLoaded', initStreamsUI); // Global variable to track the active SSE connection
let activeSSEConnection = null;
// Global cleanup function for SSE connections
const cleanupConnections = () => {
if (window._streamsSSE) {
if (window._streamsSSE.abort) {
window._streamsSSE.abort();
}
window._streamsSSE = null;
}
if (window.connectionTimeout) {
clearTimeout(window.connectionTimeout);
window.connectionTimeout = null;
}
activeSSEConnection = null;
};
// Initialize when DOM is loaded
document.addEventListener('DOMContentLoaded', () => {
initStreamsUI();
// Also try to load streams immediately in case the page is already loaded
setTimeout(() => {
loadAndRenderStreams();
}, 100);
});
function loadAndRenderStreams() { function loadAndRenderStreams() {
const ul = document.getElementById('stream-list'); const ul = document.getElementById('stream-list');
if (!ul) { if (!ul) {
console.warn('[streams-ui] #stream-list not found in DOM'); // Debug messages disabled
return; return;
} }
console.debug('[streams-ui] loadAndRenderStreams (SSE mode) called'); // Debug messages disabled
// Don't start a new connection if one is already active and we're not forcing a refresh
if (activeSSEConnection && !shouldForceRefresh) {
return;
}
// If we're forcing a refresh, clean up the existing connection
if (shouldForceRefresh && activeSSEConnection) {
// Clean up any existing connections
cleanupConnections();
shouldForceRefresh = false; // Reset the flag after handling
}
// Clear any existing error messages or retry buttons
ul.innerHTML = '<li>Loading public streams...</li>';
// Add a timestamp to prevent caching issues
const timestamp = new Date().getTime();
// Use the same protocol as the current page to avoid mixed content issues
const baseUrl = window.location.origin;
const sseUrl = `${baseUrl}/streams-sse?t=${timestamp}`;
ul.innerHTML = '<li>Loading...</li>';
let gotAny = false; let gotAny = false;
let streams = []; let streams = [];
// Close previous EventSource if any window.connectionTimeout = null;
if (window._streamsSSE) {
window._streamsSSE.close();
}
const evtSource = new window.EventSource('/streams-sse');
window._streamsSSE = evtSource;
evtSource.onmessage = function(event) { // Clean up any existing connections
console.debug('[streams-ui] SSE event received:', event.data); cleanupConnections();
try {
const data = JSON.parse(event.data); // Reset the retry count if we have a successful connection
if (data.end) { window.streamRetryCount = 0;
if (!gotAny) {
ul.innerHTML = '<li>No active streams.</li>'; if (window.connectionTimeout) {
clearTimeout(window.connectionTimeout);
window.connectionTimeout = null;
} }
evtSource.close();
// Use fetch with ReadableStream for better CORS handling
const controller = new AbortController();
const signal = controller.signal;
// Store the controller for cleanup
window._streamsSSE = controller;
// Track the active connection
activeSSEConnection = controller;
// Set a connection timeout with debug info
const connectionStartTime = Date.now();
const connectionTimeoutId = setTimeout(() => {
if (!gotAny) {
// Only log in development (localhost) or if explicitly enabled
const isLocalDevelopment = window.location.hostname === 'localhost' ||
window.location.hostname === '127.0.0.1';
if (isLocalDevelopment || window.DEBUG_STREAMS) {
const duration = Date.now() - connectionStartTime;
// Debug messages disabled
console.log(`Duration: ${duration}ms`);
console.log('Current time:', new Date().toISOString());
console.log('Streams received:', streams.length);
console.log('Active intervals:', window.activeIntervals ? window.activeIntervals.size : 'N/A');
console.log('Active timeouts:', window.activeTimeouts ? window.activeTimeouts.size : 'N/A');
console.groupEnd();
}
// Clean up and retry with backoff
controller.abort();
// Only retry if we haven't exceeded max retries
const retryCount = window.streamRetryCount || 0;
if (retryCount < 3) { // Max 3 retries
window.streamRetryCount = retryCount + 1;
const backoffTime = Math.min(1000 * Math.pow(2, retryCount), 10000); // Exponential backoff, max 10s
setTimeout(loadAndRenderStreams, backoffTime);
}
}
}, 15000); // 15 second timeout (increased from 10s)
// Store the timeout ID for cleanup
window.connectionTimeout = connectionTimeoutId;
// Make the fetch request with proper error handling
fetch(sseUrl, {
method: 'GET',
headers: {
'Accept': 'text/event-stream',
'Cache-Control': 'no-cache',
'Connection': 'keep-alive',
},
credentials: 'same-origin',
signal: signal,
mode: 'cors',
redirect: 'follow'
})
.then(response => {
if (!response.ok) {
// Try to get the response text for error details
return response.text().then(text => {
const error = new Error(`HTTP error! status: ${response.status}, statusText: ${response.statusText}`);
error.response = { status: response.status, statusText: response.statusText, body: text };
throw error;
}).catch(() => {
const error = new Error(`HTTP error! status: ${response.status}, statusText: ${response.statusText}`);
error.response = { status: response.status, statusText: response.statusText };
throw error;
});
}
if (!response.body) {
throw new Error('Response body is null or undefined');
}
// Get the readable stream
const reader = response.body.getReader();
const decoder = new TextDecoder();
let buffer = '';
// Process the stream
function processStream({ done, value }) {
// Debug messages disabled
if (done) {
// Debug messages disabled
// Process any remaining data in the buffer
if (buffer.trim()) {
// Debug messages disabled
try {
const data = JSON.parse(buffer);
// Debug messages disabled
processSSEEvent(data);
} catch (e) {
// Debug messages disabled
}
}
return;
}
// Decode the chunk and add to buffer
buffer += decoder.decode(value, { stream: true });
// Process complete events in the buffer
const events = buffer.split('\n\n');
buffer = events.pop() || ''; // Keep incomplete event in buffer
for (const event of events) {
if (!event.trim()) continue;
// Extract data field from SSE format
const dataMatch = event.match(/^data: (\{.*\})$/m);
if (dataMatch && dataMatch[1]) {
try {
const data = JSON.parse(dataMatch[1]);
processSSEEvent(data);
} catch (e) {
// Debug messages disabled
}
}
}
// Read the next chunk
return reader.read().then(processStream);
}
// Start reading the stream
return reader.read().then(processStream);
})
.catch(error => {
// Only handle the error if it's not an abort error
if (error.name !== 'AbortError') {
// Clean up the controller reference
window._streamsSSE = null;
activeSSEConnection = null;
// Clear the connection timeout
if (connectionTimeout) {
clearTimeout(connectionTimeout);
connectionTimeout = null;
}
// Show a user-friendly error message
const ul = document.getElementById('stream-list');
if (ul) {
let errorMessage = 'Error loading streams. ';
if (error.message && error.message.includes('Failed to fetch')) {
errorMessage += 'Unable to connect to the server. Please check your internet connection.';
} else if (error.message && error.message.includes('CORS')) {
errorMessage += 'A server configuration issue occurred. Please try again later.';
} else {
errorMessage += 'Please try again later.';
}
ul.innerHTML = `
<li class="error">
<p>${errorMessage}</p>
<button id="retry-loading" class="retry-button">
<span class="retry-icon">↻</span> Try Again
</button>
</li>
`;
// Add retry handler
const retryButton = document.getElementById('retry-loading');
if (retryButton) {
retryButton.addEventListener('click', () => {
ul.innerHTML = '<li>Loading streams...</li>';
loadAndRenderStreams();
});
}
}
}
});
// Function to process SSE events
function processSSEEvent(data) {
// Debug messages disabled
if (data.end) {
if (streams.length === 0) {
ul.innerHTML = '<li>No active streams.</li>';
return;
}
// Sort streams by mtime in descending order (newest first)
streams.sort((a, b) => (b.mtime || 0) - (a.mtime || 0));
// Clear the list
ul.innerHTML = '';
// Render each stream in sorted order
streams.forEach((stream, index) => {
const uid = stream.uid || `stream-${index}`;
const username = stream.username || 'Unknown User';
const sizeMb = stream.size ? (stream.size / (1024 * 1024)).toFixed(1) : '?';
const mtime = stream.mtime ? new Date(stream.mtime * 1000).toISOString().split('T')[0].replace(/-/g, '/') : '';
const li = document.createElement('li');
li.className = 'stream-item';
try {
li.innerHTML = `
<article class="stream-player" data-uid="${escapeHtml(uid)}">
<h3>${escapeHtml(username)}</h3>
<div class="audio-controls">
<button class="play-pause-btn" data-uid="${escapeHtml(uid)}" aria-label="Play">▶️</button>
</div>
<p class="stream-info" style='color:var(--text-muted);font-size:90%'>[${sizeMb} MB, ${mtime}]</p>
</article>
`;
ul.appendChild(li);
} catch (error) {
const errorLi = document.createElement('li');
errorLi.textContent = `Error loading stream: ${uid}`;
errorLi.style.color = 'var(--error)';
ul.appendChild(errorLi);
}
});
highlightActiveProfileLink(); highlightActiveProfileLink();
return; return;
} }
// Remove Loading... on any valid event
// Add stream to our collection
streams.push(data);
// If this is the first stream, clear the loading message
if (!gotAny) { if (!gotAny) {
ul.innerHTML = ''; ul.innerHTML = '';
gotAny = true; gotAny = true;
} }
streams.push(data); }
const uid = data.uid || '';
const sizeMb = data.size ? (data.size / (1024 * 1024)).toFixed(1) : '?'; // Function to handle SSE errors
const mtime = data.mtime ? new Date(data.mtime * 1000).toISOString().split('T')[0].replace(/-/g, '/') : ''; function handleSSEError(error) {
const li = document.createElement('li'); // Debug messages disabled
li.innerHTML = `
<article class="stream-player"> // Only show error if we haven't already loaded any streams
<h3>${uid}</h3> if (streams.length === 0) {
<audio id="audio-${uid}" class="stream-audio" preload="auto" crossOrigin="anonymous" src="/audio/${encodeURIComponent(uid)}/stream.opus"></audio> const errorMsg = 'Error connecting to stream server. Please try again.';
<div class="audio-controls">
<button id="play-pause-${uid}">▶</button> ul.innerHTML = `
</div> <li>${errorMsg}</li>
<p class="stream-info" style='color:gray;font-size:90%'>[${sizeMb} MB, ${mtime}]</p> <li><button id="reload-streams" onclick="loadAndRenderStreams()" class="retry-button">🔄 Retry</button></li>
</article>
`; `;
// Add play/pause handler after appending to DOM
ul.appendChild(li);
// Wait for DOM update
requestAnimationFrame(() => {
const playPauseButton = document.getElementById(`play-pause-${uid}`);
const audio = document.getElementById(`audio-${uid}`);
if (playPauseButton && audio) {
playPauseButton.addEventListener('click', () => {
try {
if (audio.paused) {
// Stop any currently playing audio first
if (currentlyPlayingAudio && currentlyPlayingAudio !== audio) {
currentlyPlayingAudio.pause();
if (currentlyPlayingButton) {
currentlyPlayingButton.textContent = '▶';
}
}
// Stop the main player if it's playing
if (typeof window.stopMainAudio === 'function') {
window.stopMainAudio();
}
audio.play().then(() => {
playPauseButton.textContent = '⏸️';
currentlyPlayingAudio = audio;
currentlyPlayingButton = playPauseButton;
}).catch(e => {
console.error('Play failed:', e);
// Reset button if play fails
playPauseButton.textContent = '▶';
currentlyPlayingAudio = null;
currentlyPlayingButton = null;
});
} else {
audio.pause();
playPauseButton.textContent = '▶';
if (currentlyPlayingAudio === audio) {
currentlyPlayingAudio = null;
currentlyPlayingButton = null;
}
}
} catch (e) {
console.error('Audio error:', e);
playPauseButton.textContent = '▶';
if (currentlyPlayingAudio === audio) {
currentlyPlayingAudio = null;
currentlyPlayingButton = null;
}
}
});
}
});
highlightActiveProfileLink();
ul.appendChild(li);
highlightActiveProfileLink();
} catch (e) {
// Remove Loading... even if JSON parse fails, to avoid stuck UI
if (!gotAny) {
ul.innerHTML = '';
gotAny = true;
}
console.error('[streams-ui] SSE parse error', e, event.data);
}
};
evtSource.onerror = function(err) {
console.error('[streams-ui] SSE error', err);
ul.innerHTML = '<li>Error loading stream list</li>';
if (typeof showToast === 'function') { if (typeof showToast === 'function') {
showToast('❌ Error loading public streams.'); showToast('❌ ' + errorMsg);
} }
evtSource.close();
// Add reload button if not present // Auto-retry after 5 seconds
const reloadButton = document.getElementById('reload-streams'); setTimeout(() => {
if (!reloadButton) { loadAndRenderStreams();
const reloadHtml = '<button id="reload-streams" onclick="loadAndRenderStreams()">Reload</button>'; }, 5000);
ul.insertAdjacentHTML('beforeend', reloadHtml);
} }
}; }
// Error and open handlers are now part of the fetch implementation
// Message handling is now part of the fetch implementation
// Error handling is now part of the fetch implementation
} }
export function renderStreamList(streams) { export function renderStreamList(streams) {
const ul = document.getElementById('stream-list'); const ul = document.getElementById('stream-list');
if (!ul) { if (!ul) {
console.warn('[streams-ui] renderStreamList: #stream-list not found'); // Debug messages disabled
return; return;
} }
console.debug('[streams-ui] Rendering stream list:', streams); // Debug messages disabled
// Debug messages disabled
if (Array.isArray(streams)) { if (Array.isArray(streams)) {
if (streams.length) { if (streams.length) {
// Sort by mtime descending (most recent first) // Sort by mtime descending (most recent first)
@ -179,9 +398,10 @@ export function renderStreamList(streams) {
ul.innerHTML = streams ul.innerHTML = streams
.map(stream => { .map(stream => {
const uid = stream.uid || ''; const uid = stream.uid || '';
const username = stream.username || 'Unknown User';
const sizeKb = stream.size ? (stream.size / 1024).toFixed(1) : '?'; const sizeKb = stream.size ? (stream.size / 1024).toFixed(1) : '?';
const mtime = stream.mtime ? new Date(stream.mtime * 1000).toLocaleString() : ''; const mtime = stream.mtime ? new Date(stream.mtime * 1000).toLocaleString() : '';
return `<li><a href="/?profile=${encodeURIComponent(uid)}" class="profile-link">▶ ${uid}</a> <span style='color:gray;font-size:90%'>[${sizeKb} KB, ${mtime}]</span></li>`; return `<li><a href="/?profile=${encodeURIComponent(uid)}" class="profile-link">▶ ${escapeHtml(username)}</a> <span style='color:var(--text-muted);font-size:90%'>[${sizeKb} KB, ${mtime}]</span></li>`;
}) })
.join(''); .join('');
} else { } else {
@ -189,10 +409,10 @@ export function renderStreamList(streams) {
} }
} else { } else {
ul.innerHTML = '<li>Error: Invalid stream data.</li>'; ul.innerHTML = '<li>Error: Invalid stream data.</li>';
console.error('[streams-ui] renderStreamList: streams is not an array', streams); // Debug messages disabled
} }
highlightActiveProfileLink(); highlightActiveProfileLink();
console.debug('[streams-ui] renderStreamList complete'); // Debug messages disabled
} }
export function highlightActiveProfileLink() { export function highlightActiveProfileLink() {
@ -208,7 +428,6 @@ export function highlightActiveProfileLink() {
} }
export function initStreamLinks() { export function initStreamLinks() {
const ul = document.getElementById('stream-list'); const ul = document.getElementById('stream-list');
if (!ul) return; if (!ul) return;
@ -232,3 +451,259 @@ export function initStreamLinks() {
} }
}); });
} }
// Helper function to safely escape HTML
function escapeHtml(unsafe) {
if (typeof unsafe !== 'string') return '';
return unsafe
.replace(/&/g, "&amp;")
.replace(/</g, "&lt;")
.replace(/>/g, "&gt;")
.replace(/"/g, "&quot;")
.replace(/'/g, "&#039;");
}
// Audio context for Web Audio API
let audioContext = null;
let audioSource = null;
let audioBuffer = null;
let isPlaying = false;
let currentUid = null;
let currentlyPlayingButton = null; // Controls the currently active play/pause button
let startTime = 0;
let pauseTime = 0;
let audioStartTime = 0;
let audioElement = null; // HTML5 Audio element for Opus playback
// Initialize audio context
function getAudioContext() {
if (!audioContext) {
audioContext = new (window.AudioContext || window.webkitAudioContext)();
}
return audioContext;
}
// Stop current playback completely
function stopPlayback() {
// Debug messages disabled
// Stop Web Audio API if active
if (audioSource) {
try {
// Don't try to stop if already stopped
if (audioSource.context && audioSource.context.state !== 'closed') {
audioSource.stop();
audioSource.disconnect();
}
} catch (e) {
// Ignore errors when stopping already stopped sources
if (!e.message.includes('has already been stopped') &&
!e.message.includes('has already finished playing')) {
console.warn('Error stopping audio source:', e);
}
}
audioSource = null;
}
// Stop HTML5 Audio element if active
if (audioElement) {
try {
// Remove all event listeners first
if (audioElement._eventHandlers) {
const { onPlay, onPause, onEnded, onError } = audioElement._eventHandlers;
if (onPlay) audioElement.removeEventListener('play', onPlay);
if (onPause) audioElement.removeEventListener('pause', onPause);
if (onEnded) audioElement.removeEventListener('ended', onEnded);
if (onError) audioElement.removeEventListener('error', onError);
}
// Pause and reset the audio element
audioElement.pause();
audioElement.removeAttribute('src');
audioElement.load();
// Clear references
if (audioElement._eventHandlers) {
delete audioElement._eventHandlers;
}
// Nullify the element to allow garbage collection
audioElement = null;
} catch (e) {
console.warn('Error cleaning up audio element:', e);
}
}
// Reset state
audioBuffer = null;
isPlaying = false;
startTime = 0;
pauseTime = 0;
audioStartTime = 0;
// Notify global audio manager that streams player has stopped
globalAudioManager.stopPlayback('streams');
// Update UI
if (currentlyPlayingButton) {
updatePlayPauseButton(currentlyPlayingButton, false);
currentlyPlayingButton = null;
}
// Clear current playing reference
currentlyPlayingAudio = null;
}
// --- Shared Audio Player Integration ---
import { SharedAudioPlayer } from './shared-audio-player.js';
function getStreamUrl(uid) {
return `/audio/${encodeURIComponent(uid)}/stream.opus`;
}
function updatePlayPauseButton(button, isPlaying) {
if (button) button.textContent = isPlaying ? '⏸️' : '▶️';
// Optionally, update other UI elements here
}
// Only this definition should remain; remove any other updatePlayPauseButton functions.
const streamsPlayer = new SharedAudioPlayer({
playerType: 'streams',
getStreamUrl,
onUpdateButton: updatePlayPauseButton
});
// Load and play audio using SharedAudioPlayer
function loadAndPlayAudio(uid, playPauseBtn) {
streamsPlayer.play(uid, playPauseBtn);
}
// Handle audio ended event
function handleAudioEnded() {
isPlaying = false;
if (currentlyPlayingButton) {
updatePlayPauseButton(currentlyPlayingButton, false);
}
cleanupAudio();
}
// Clean up audio resources
function cleanupAudio() {
// Debug messages disabled
// Clean up Web Audio API resources if they exist
if (audioSource) {
try {
if (isPlaying) {
audioSource.stop();
}
audioSource.disconnect();
} catch (e) {
console.warn('Error cleaning up audio source:', e);
}
audioSource = null;
}
// Clean up HTML5 Audio element if it exists
if (audioElement) {
try {
// Remove event listeners first
if (audioElement._eventHandlers) {
const { onPlay, onPause, onEnded, onError } = audioElement._eventHandlers;
if (onPlay) audioElement.removeEventListener('play', onPlay);
if (onPause) audioElement.removeEventListener('pause', onPause);
if (onEnded) audioElement.removeEventListener('ended', onEnded);
if (onError) audioElement.removeEventListener('error', onError);
}
// Pause and clean up the audio element
audioElement.pause();
audioElement.removeAttribute('src');
audioElement.load();
// Force garbage collection by removing references
if (audioElement._eventHandlers) {
delete audioElement._eventHandlers;
}
audioElement = null;
} catch (e) {
console.warn('Error cleaning up audio element:', e);
}
}
// Reset state
isPlaying = false;
currentUid = null;
// Update UI
if (currentlyPlayingButton) {
updatePlayPauseButton(currentlyPlayingButton, false);
currentlyPlayingButton = null;
}
}
// Event delegation for play/pause buttons - only handle buttons within the stream list
const streamList = document.getElementById('stream-list');
if (streamList) {
streamList.addEventListener('click', async (e) => {
const playPauseBtn = e.target.closest('.play-pause-btn');
// Skip if not a play button or if it's the personal stream's play button
if (!playPauseBtn || playPauseBtn.closest('#me-page')) return;
// Prevent event from bubbling up to document-level handlers
e.stopPropagation();
e.stopImmediatePropagation();
e.preventDefault();
const uid = playPauseBtn.dataset.uid;
if (!uid) return;
// Toggle play/pause using SharedAudioPlayer
if (streamsPlayer.currentUid === uid && streamsPlayer.audioElement && !streamsPlayer.audioElement.paused && !streamsPlayer.audioElement.ended) {
streamsPlayer.pause();
} else {
await loadAndPlayAudio(uid, playPauseBtn);
}
});
}
// Handle audio end event to update button state
document.addEventListener('play', (e) => {
if (e.target.tagName === 'AUDIO' && e.target !== currentlyPlayingAudio) {
if (currentlyPlayingAudio) {
currentlyPlayingAudio.pause();
}
currentlyPlayingAudio = e.target;
// Update the play/pause button state
const playerArticle = e.target.closest('.stream-player');
if (playerArticle) {
const playBtn = playerArticle.querySelector('.play-pause-btn');
if (playBtn) {
if (currentlyPlayingButton && currentlyPlayingButton !== playBtn) {
updatePlayPauseButton(currentlyPlayingButton, false);
}
updatePlayPauseButton(playBtn, true);
currentlyPlayingButton = playBtn;
}
}
}
}, true);
// Handle audio pause event
document.addEventListener('pause', (e) => {
if (e.target.tagName === 'AUDIO' && e.target === currentlyPlayingAudio) {
const playerArticle = e.target.closest('.stream-player');
if (playerArticle) {
const playBtn = playerArticle.querySelector('.play-pause-btn');
if (playBtn) {
updatePlayPauseButton(playBtn, false);
}
}
currentlyPlayingAudio = null;
currentlyPlayingButton = null;
}
}, true);
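The retry behaviour of loadAndRenderStreams() above boils down to a capped exponential backoff with at most three attempts; a small sketch of the resulting delays, independent of the rest of the module:

for (let retryCount = 0; retryCount < 3; retryCount++) {
  const backoffTime = Math.min(1000 * Math.pow(2, retryCount), 10000);
  console.log(`retry ${retryCount + 1} scheduled after ${backoffTime} ms`); // 1000, 2000, 4000
}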

File diff suppressed because it is too large

@@ -14,6 +14,6 @@ export function showToast(message) {
   setTimeout(() => {
     toast.remove();
     // Do not remove the container; let it persist for stacking
-  }, 3500);
+  }, 15000);
 }
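In practice the change only affects how long each toast stays on screen; usage is unchanged:

import { showToast } from './toast.js';

showToast('✅ Upload successful.');            // now visible for 15 s before toast.remove() fires
showToast('❌ Error loading public streams.'); // additional toasts keep stacking in the shared container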

static/uid-validator.js Normal file

@@ -0,0 +1,169 @@
/**
* UID Validation Utility
*
* Provides comprehensive UID format validation and sanitization
* to ensure all UIDs are properly formatted as email addresses.
*/
export class UidValidator {
constructor() {
// RFC 5322 compliant email regex (basic validation)
this.emailRegex = /^[a-zA-Z0-9.!#$%&'*+/=?^_`{|}~-]+@[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*$/;
// Common invalid patterns to check against
this.invalidPatterns = [
/^devuser$/i, // Legacy username pattern
/^user\d+$/i, // Generic user patterns
/^test$/i, // Test user
/^admin$/i, // Admin user
/^\d+$/, // Pure numeric
/^[a-zA-Z]+$/, // Pure alphabetic (no @ symbol)
];
}
/**
* Validate UID format - must be a valid email address
*/
isValidFormat(uid) {
if (!uid || typeof uid !== 'string') {
return {
valid: false,
error: 'UID must be a non-empty string',
code: 'INVALID_TYPE'
};
}
const trimmed = uid.trim();
if (trimmed.length === 0) {
return {
valid: false,
error: 'UID cannot be empty',
code: 'EMPTY_UID'
};
}
// Check against invalid patterns
for (const pattern of this.invalidPatterns) {
if (pattern.test(trimmed)) {
return {
valid: false,
error: `UID matches invalid pattern: ${pattern}`,
code: 'INVALID_PATTERN'
};
}
}
// Validate email format
if (!this.emailRegex.test(trimmed)) {
return {
valid: false,
error: 'UID must be a valid email address',
code: 'INVALID_EMAIL_FORMAT'
};
}
return {
valid: true,
sanitized: trimmed.toLowerCase()
};
}
/**
* Sanitize and validate UID - ensures consistent format
*/
sanitize(uid) {
const validation = this.isValidFormat(uid);
if (!validation.valid) {
console.error('[UID-VALIDATOR] Validation failed:', validation.error, { uid });
return null;
}
return validation.sanitized;
}
/**
* Validate and throw error if invalid
*/
validateOrThrow(uid, context = 'UID') {
const validation = this.isValidFormat(uid);
if (!validation.valid) {
throw new Error(`${context} validation failed: ${validation.error} (${validation.code})`);
}
return validation.sanitized;
}
/**
* Check if a UID needs migration (legacy format)
*/
needsMigration(uid) {
if (!uid || typeof uid !== 'string') {
return false;
}
const trimmed = uid.trim();
// Check if it's already a valid email
if (this.emailRegex.test(trimmed)) {
return false;
}
// Check if it matches known legacy patterns
for (const pattern of this.invalidPatterns) {
if (pattern.test(trimmed)) {
return true;
}
}
return true; // Any non-email format needs migration
}
/**
* Get validation statistics for debugging
*/
getValidationStats(uids) {
const stats = {
total: uids.length,
valid: 0,
invalid: 0,
needsMigration: 0,
errors: {}
};
uids.forEach(uid => {
const validation = this.isValidFormat(uid);
if (validation.valid) {
stats.valid++;
} else {
stats.invalid++;
const code = validation.code || 'UNKNOWN';
stats.errors[code] = (stats.errors[code] || 0) + 1;
}
if (this.needsMigration(uid)) {
stats.needsMigration++;
}
});
return stats;
}
}
// Create singleton instance
export const uidValidator = new UidValidator();
// Legacy exports for backward compatibility
export function validateUidFormat(uid) {
return uidValidator.isValidFormat(uid).valid;
}
export function sanitizeUid(uid) {
return uidValidator.sanitize(uid);
}
export function validateUidOrThrow(uid, context) {
return uidValidator.validateOrThrow(uid, context);
}
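A usage sketch of the exported API (the addresses are illustrative only):

import { uidValidator, sanitizeUid } from './uid-validator.js';

uidValidator.isValidFormat('oib@example.net'); // { valid: true, sanitized: 'oib@example.net' }
uidValidator.isValidFormat('devuser');         // { valid: false, code: 'INVALID_PATTERN', ... }
uidValidator.needsMigration('devuser');        // true, legacy username still needs the email migration
sanitizeUid('  OIB@Example.NET ');             // 'oib@example.net' (trimmed and lower-cased)
uidValidator.getValidationStats(['devuser', 'oib@example.net']); // { total: 2, valid: 1, invalid: 1, needsMigration: 1, errors: {...} }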

@@ -1,128 +1,185 @@
-// upload.js — Frontend file upload handler
 import { showToast } from "./toast.js";
 import { playBeep } from "./sound.js";
-import { logToServer } from "./app.js";
+// Initialize upload system when DOM is loaded
 document.addEventListener('DOMContentLoaded', () => {
+  // This module handles the file upload functionality, including drag-and-drop,
+  // progress indication, and post-upload actions like refreshing the file list.
+  // DOM elements are fetched once the DOM is ready
   const dropzone = document.getElementById("user-upload-area");
-  if (dropzone) {
-    dropzone.setAttribute("aria-label", "Upload area. Click or drop an audio file to upload.");
-  }
   const fileInput = document.getElementById("fileInputUser");
-  const fileInfo = document.createElement("div");
-  fileInfo.id = "file-info";
-  fileInfo.style.textAlign = "center";
-  if (fileInput) {
-    fileInput.parentNode.insertBefore(fileInfo, fileInput.nextSibling);
-  }
-  const streamInfo = document.getElementById("stream-info");
-  const streamUrlEl = document.getElementById("streamUrl");
-  const spinner = document.getElementById("spinner");
-  let abortController;
-  // Upload function
-  const upload = async (file) => {
-    if (abortController) abortController.abort();
-    abortController = new AbortController();
-    fileInfo.innerText = `📁 ${file.name}${(file.size / 1024 / 1024).toFixed(2)} MB`;
-    if (file.size > 100 * 1024 * 1024) {
-      showToast("❌ File too large. Please upload a file smaller than 100MB.");
-      return;
-    }
-    spinner.style.display = "block";
-    showToast('📡 Uploading…');
-    fileInput.disabled = true;
-    dropzone.classList.add("uploading");
+  const fileList = document.getElementById("file-list");
+  // Early exit if critical UI elements are missing
+  if (!dropzone || !fileInput || !fileList) {
+    // Debug messages disabled
+    return;
+  }
+  // Attach all event listeners
+  initializeUploadListeners();
+  /**
+   * Main upload function
+   * @param {File} file - The file to upload
+   */
+  async function upload(file) {
+    // Get user ID from localStorage or cookie
+    const uid = localStorage.getItem('uid') || getCookie('uid');
+    if (!uid) {
+      // Debug messages disabled
+      showToast("You must be logged in to upload files.", "error");
+      return;
+    }
+    // Debug messages disabled
+    // Create and display the upload status indicator
+    const statusDiv = createStatusIndicator(file.name);
+    fileList.prepend(statusDiv);
+    const progressBar = statusDiv.querySelector('.progress-bar');
+    const statusText = statusDiv.querySelector('.status-text');
     const formData = new FormData();
-    const sessionUid = localStorage.getItem("uid");
-    formData.append("uid", sessionUid);
     formData.append("file", file);
+    formData.append("uid", uid);
-    const res = await fetch("/upload", {
-      signal: abortController.signal,
-      method: "POST",
-      body: formData,
-    });
-    let data, parseError;
-    try {
-      data = await res.json();
-    } catch (e) {
-      parseError = e;
-    }
-    if (!data) {
-      showToast("❌ Upload failed: " + (parseError && parseError.message ? parseError.message : "Unknown error"));
-      spinner.style.display = "none";
-      fileInput.disabled = false;
-      dropzone.classList.remove("uploading");
-      return;
-    }
-    if (res.ok) {
-      if (data.quota && data.quota.used_mb !== undefined) {
-        const bar = document.getElementById("quota-bar");
-        const text = document.getElementById("quota-text");
-        const quotaSec = document.getElementById("quota-meter");
-        if (bar && text && quotaSec) {
-          quotaSec.hidden = false;
-          const used = parseFloat(data.quota.used_mb);
-          bar.value = used;
-          bar.max = 100;
-          text.textContent = `${used.toFixed(1)} MB used`;
-        }
-      }
-      spinner.style.display = "none";
-      fileInput.disabled = false;
-      dropzone.classList.remove("uploading");
-      showToast("✅ Upload successful.");
-      playBeep(432, 0.25, "sine");
-    } else {
-      streamInfo.hidden = true;
-      spinner.style.display = "none";
-      if ((data.detail || data.error || "").includes("music")) {
-        showToast("🎵 Upload rejected: singing or music detected.");
-      } else {
-        showToast(`❌ Upload failed: ${data.detail || data.error}`);
-      }
-      if (fileInput) fileInput.value = null;
-      if (dropzone) dropzone.classList.remove("uploading");
-      if (fileInput) fileInput.disabled = false;
-      if (streamInfo) streamInfo.classList.remove("visible", "slide-in");
-    }
-  };
-  // Export the upload function for use in other modules
-  window.upload = upload;
-  if (dropzone && fileInput) {
+    try {
+      const response = await fetch(`/upload`, {
+        method: "POST",
+        body: formData,
+        headers: {
+          'Accept': 'application/json',
+        },
+      });
+      if (!response.ok) {
+        const errorData = await response.json().catch(() => ({ detail: 'Upload failed with non-JSON response.' }));
+        throw new Error(errorData.detail || 'Unknown upload error');
+      }
+      const result = await response.json();
+      // Debug messages disabled
+      playBeep(800, 0.2); // Success beep - higher frequency
+      // Update UI to show success
+      statusText.textContent = 'Success!';
+      progressBar.style.width = '100%';
+      progressBar.style.backgroundColor = 'var(--success-color)';
+      // Remove the status indicator after a short delay
+      setTimeout(() => {
+        statusDiv.remove();
+      }, 2000);
+      // --- Post-Upload Actions ---
+      await postUploadActions(uid);
+    } catch (error) {
+      // Debug messages disabled
+      playBeep(200, 0.5); // Error beep - lower frequency, longer duration
+      statusText.textContent = `Error: ${error.message}`;
+      progressBar.style.backgroundColor = 'var(--error-color)';
+      statusDiv.classList.add('upload-error');
+    }
+  }
+  /**
+   * Actions to perform after a successful upload.
+   * @param {string} uid - The user's ID
+   */
+  async function postUploadActions(uid) {
+    // 1. Refresh the user's personal stream if the function is available
+    if (window.loadProfileStream) {
+      await window.loadProfileStream(uid);
+    }
+    // 2. Refresh the file list by re-fetching and then displaying.
+    if (window.fetchAndDisplayFiles) {
+      // Use email-based UID for file operations if available, fallback to uid
+      const fileOperationUid = localStorage.getItem('uid') || uid; // uid is now email-based
+      // Debug messages disabled
+      await window.fetchAndDisplayFiles(fileOperationUid);
+    }
+    // 3. Update quota display after upload
+    if (window.updateQuotaDisplay) {
+      const quotaUid = localStorage.getItem('uid') || uid;
+      // Debug messages disabled
+      await window.updateQuotaDisplay(quotaUid);
+    }
+    // 4. Refresh the public stream list to update the last update time
+    if (window.refreshStreamList) {
+      await window.refreshStreamList();
+    }
+  }
+  /**
+   * Creates the DOM element for the upload status indicator.
+   * @param {string} fileName - The name of the file being uploaded.
+   * @returns {HTMLElement}
+   */
+  function createStatusIndicator(fileName) {
+    const statusDiv = document.createElement('div');
+    statusDiv.className = 'upload-status-indicator';
+    statusDiv.innerHTML = `
+      <div class="file-info">
+        <span class="file-name">${fileName}</span>
+        <span class="status-text">Uploading...</span>
+      </div>
+      <div class="progress-container">
+        <div class="progress-bar"></div>
+      </div>
+    `;
+    return statusDiv;
+  }
+  /**
+   * Initializes all event listeners for the upload UI.
+   */
+  function initializeUploadListeners() {
     dropzone.addEventListener("click", () => {
-      console.log("[DEBUG] Dropzone clicked");
       fileInput.click();
-      console.log("[DEBUG] fileInput.click() called");
     });
     dropzone.addEventListener("dragover", (e) => {
       e.preventDefault();
       dropzone.classList.add("dragover");
-      dropzone.style.transition = "background-color 0.3s ease";
     });
     dropzone.addEventListener("dragleave", () => {
       dropzone.classList.remove("dragover");
     });
     dropzone.addEventListener("drop", (e) => {
-      dropzone.classList.add("pulse");
-      setTimeout(() => dropzone.classList.remove("pulse"), 400);
       e.preventDefault();
       dropzone.classList.remove("dragover");
       const file = e.dataTransfer.files[0];
-      if (file) upload(file);
+      if (file) {
+        upload(file);
+      }
     });
     fileInput.addEventListener("change", (e) => {
       const file = e.target.files[0];
-      if (file) upload(file);
+      if (file) {
+        upload(file);
+      }
     });
   }
+  /**
+   * Helper function to get a cookie value by name.
+   * @param {string} name - The name of the cookie.
+   * @returns {string|null}
+   */
+  function getCookie(name) {
+    const value = `; ${document.cookie}`;
+    const parts = value.split(`; ${name}=`);
+    if (parts.length === 2) return parts.pop().split(';').shift();
+    return null;
+  }
+  // Make the upload function globally accessible if needed by other scripts
+  window.upload = upload;
 });
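Because the handler is also exposed as window.upload, it can be driven from other scripts or the dev console; a hypothetical sketch:

// e.g. from the browser console, after picking a file in the #fileInputUser input:
const input = document.getElementById('fileInputUser');
const file = input && input.files[0];
if (file && typeof window.upload === 'function') {
  window.upload(file); // POSTs uid + file to /upload, then runs the post-upload refresh actions
}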

@@ -1,11 +0,0 @@
import smtplib
from email.message import EmailMessage
msg = EmailMessage()
msg["From"] = "test@keisanki.net"
msg["To"] = "oib@bubuit.net"
msg["Subject"] = "Test"
msg.set_content("Hello world")
with smtplib.SMTP("localhost") as smtp:
smtp.send_message(msg)

upload.py

@@ -5,70 +5,115 @@ from slowapi import Limiter
from slowapi.util import get_remote_address
from slowapi.errors import RateLimitExceeded
from pathlib import Path
import json
import requests
from datetime import datetime
from convert_to_opus import convert_to_opus
from models import UploadLog, UserQuota, User, PublicStream
from sqlalchemy import select, or_
from database import get_db
from sqlalchemy.orm import Session

limiter = Limiter(key_func=get_remote_address)
router = APIRouter()

# # Not needed for SlowAPI ≥0.1.5
DATA_ROOT = Path("./data")

@limiter.limit("5/minute")
@router.post("/upload")
def upload(request: Request, uid: str = Form(...), file: UploadFile = Form(...)):
    # Import here to avoid circular imports
    from log import log_violation
    import time

    # Generate a unique request ID for this upload
    request_id = str(int(time.time()))
    log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Starting upload of {file.filename}")

    try:
        # Use the database session context manager to handle the session
        with get_db() as db:
            try:
                # First, verify the user exists and is confirmed
                user = db.query(User).filter(
                    (User.username == uid) | (User.email == uid)
                ).first()
                if user is not None and not isinstance(user, User) and hasattr(user, "__getitem__"):
                    user = user[0]
                if not user:
                    log_violation("UPLOAD", request.client.host, uid, f"User {uid} not found")
                    raise HTTPException(status_code=404, detail="User not found")

                log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] User check - found: {user is not None}, confirmed: {getattr(user, 'confirmed', False) if user else 'N/A'}")

                # Check if user is confirmed
                if not hasattr(user, 'confirmed') or not user.confirmed:
                    raise HTTPException(status_code=403, detail="Account not confirmed")

                # Use user.email as the proper UID for quota and directory operations
                user_email = user.email
                quota = db.get(UserQuota, user_email) or UserQuota(uid=user_email, storage_bytes=0)
                if quota.storage_bytes >= 100 * 1024 * 1024:
                    raise HTTPException(status_code=400, detail="Quota exceeded")

                # Create user directory using email (proper UID) - not the uid parameter which could be username
                user_dir = DATA_ROOT / user_email
                user_dir.mkdir(parents=True, exist_ok=True)

                # Generate a unique filename for the processed file first
                import uuid
                unique_name = f"{uuid.uuid4()}.opus"
                raw_ext = file.filename.split(".")[-1].lower()
                raw_path = user_dir / ("raw." + raw_ext)
                processed_path = user_dir / unique_name

                # Clean up any existing raw files first (except the one we're about to create)
                for old_file in user_dir.glob('raw.*'):
                    try:
                        if old_file != raw_path:  # Don't delete the file we're about to create
                            old_file.unlink(missing_ok=True)
                            log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Cleaned up old file: {old_file}")
                    except Exception as e:
                        log_violation("UPLOAD_ERROR", request.client.host, uid, f"[{request_id}] Failed to clean up {old_file}: {e}")

                # Save the uploaded file temporarily
                log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Saving temporary file to {raw_path}")
                try:
                    with open(raw_path, "wb") as f:
                        content = file.file.read()
                        if not content:
                            raise ValueError("Uploaded file is empty")
                        f.write(content)
                    log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Successfully wrote {len(content)} bytes to {raw_path}")

                    # EARLY DB RECORD CREATION: after upload completes, before processing
                    early_log = UploadLog(
                        uid=user_email,
                        ip=request.client.host,
                        filename=file.filename,   # original filename from user
                        processed_filename=None,  # not yet processed
                        size_bytes=None           # not yet known
                    )
                    db.add(early_log)
                    log_violation("UPLOAD_DEBUG", request.client.host, uid, f"[FORCE FLUSH] Before db.flush() after early_log add")
                    db.flush()
                    log_violation("UPLOAD_DEBUG", request.client.host, uid, f"[FORCE FLUSH] After db.flush() after early_log add")
                    db.commit()
                    log_violation("UPLOAD_DEBUG", request.client.host, uid, f"[FORCE COMMIT] After db.commit() after early_log add")
                    early_log_id = early_log.id
                    log_violation("UPLOAD_DEBUG", request.client.host, uid, f"[DEBUG] Early UploadLog created: id={early_log_id}, filename={file.filename}, UploadLog.filename={early_log.filename}")
                except Exception as e:
                    log_violation("UPLOAD_ERROR", request.client.host, uid, f"[{request_id}] Failed to save {raw_path}: {e}")
                    raise HTTPException(status_code=500, detail=f"Failed to save uploaded file: {e}")

                # Ollama music/singing check is disabled for this release
                log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Ollama music/singing check is disabled")

                try:
                    convert_to_opus(str(raw_path), str(processed_path))
                except Exception as e:
@@ -78,8 +123,18 @@ async def upload(request: Request, db = Depends(get_db), uid: str = Form(...), f
                original_size = raw_path.stat().st_size
                raw_path.unlink(missing_ok=True)  # cleanup

                # First, verify the file was created and has content
                if not processed_path.exists() or processed_path.stat().st_size == 0:
                    raise HTTPException(status_code=500, detail="Failed to process audio file")

                # Get the final file size
                size = processed_path.stat().st_size

                # Concatenate all .opus files in random order to stream.opus for public playback
                # This is now done after the file is in its final location with log ID
                from concat_opus import concat_opus_files
                def update_stream_opus():
                    try:
                        concat_opus_files(user_dir, user_dir / "stream.opus")
                    except Exception as e:
@@ -87,22 +142,77 @@ async def upload(request: Request, db = Depends(get_db), uid: str = Form(...), f
                        import shutil
                        stream_path = user_dir / "stream.opus"
                        shutil.copy2(processed_path, stream_path)
                        log_violation("STREAM_UPDATE", request.client.host, uid,
                                      f"[fallback] Updated stream.opus with {processed_path}")

                # Start a transaction
                try:
                    # Update the early DB record with processed filename and size
                    log = db.get(UploadLog, early_log_id)
                    log.processed_filename = unique_name
                    log.size_bytes = size
                    db.add(log)
                    db.flush()  # Ensure update is committed

                    # Assert that log.filename is still the original filename, never overwritten
                    if log.filename is None or (log.filename.endswith('.opus') and log.filename == log.processed_filename):
                        log_violation("UPLOAD_ERROR", request.client.host, uid,
                                      f"[ASSERTION FAILED] UploadLog.filename was overwritten! id={log.id}, filename={log.filename}, processed_filename={log.processed_filename}")
                        raise RuntimeError(f"UploadLog.filename was overwritten! id={log.id}, filename={log.filename}, processed_filename={log.processed_filename}")
                    else:
                        log_violation("UPLOAD_DEBUG", request.client.host, uid,
                                      f"[ASSERTION OK] After update: id={log.id}, filename={log.filename}, processed_filename={log.processed_filename}")

                    log_violation("UPLOAD_DEBUG", request.client.host, uid, f"[COMMIT] Committing UploadLog for id={log.id}")
                    db.commit()
                    log_violation("UPLOAD_DEBUG", request.client.host, uid, f"[COMMIT OK] UploadLog committed for id={log.id}")

                    # Rename the processed file to include the log ID for better tracking
                    processed_with_id = user_dir / f"{log.id}_{unique_name}"
                    if processed_path.exists():
                        # First check if there's already a file with the same UUID but different prefix
                        for existing_file in user_dir.glob(f"*_{unique_name}"):
                            if existing_file != processed_path:
                                log_violation("CLEANUP", request.client.host, uid,
                                              f"[UPLOAD] Removing duplicate file: {existing_file}")
                                existing_file.unlink(missing_ok=True)
                        # Now do the rename
                        if processed_path != processed_with_id:
                            if processed_with_id.exists():
                                processed_with_id.unlink(missing_ok=True)
                            processed_path.rename(processed_with_id)
                            processed_path = processed_with_id

                    # Only clean up raw.* files, not previously uploaded opus files
                    for old_temp_file in user_dir.glob('raw.*'):
                        try:
                            old_temp_file.unlink(missing_ok=True)
                            log_violation("CLEANUP", request.client.host, uid, f"[{request_id}] Cleaned up temp file: {old_temp_file}")
                        except Exception as e:
                            log_violation("CLEANUP_ERROR", request.client.host, uid, f"[{request_id}] Failed to clean up {old_temp_file}: {e}")

                    # Get or create quota
                    quota = db.query(UserQuota).filter(UserQuota.uid == user_email).first()
                    if not quota:
                        quota = UserQuota(uid=user_email, storage_bytes=0)
                        db.add(quota)

                    # Update quota with the new file size
                    quota.storage_bytes = sum(
                        f.stat().st_size
                        for f in user_dir.glob('*.opus')
                        if f.name != 'stream.opus' and f != processed_path
                    ) + size

                    # Update public streams
                    update_public_streams(user_email, quota.storage_bytes, db)

                    # The context manager will handle commit/rollback

                    # Now that the transaction is committed and files are in their final location,
                    # update the stream.opus file to include all files
                    update_stream_opus()

                    return {
                        "filename": file.filename,
@@ -111,15 +221,92 @@ async def upload(request: Request, db = Depends(get_db), uid: str = Form(...), f
                            "used_mb": round(quota.storage_bytes / (1024 * 1024), 2)
                        }
                    }
                except HTTPException as e:
                    # Re-raise HTTP exceptions as they are already properly formatted
                    db.rollback()
                    raise e
                except Exception as e:
                    # Log the error and return a 500 response
                    db.rollback()
                    import traceback
                    tb = traceback.format_exc()
                    # Try to log the error
                    try:
                        log_violation("UPLOAD_ERROR", request.client.host, uid, f"Error processing upload: {str(e)}\n{tb}")
                    except Exception:
                        pass  # If logging fails, continue with the error response
                    # Clean up the processed file if it exists
                    if 'processed_path' in locals() and processed_path.exists():
                        processed_path.unlink(missing_ok=True)
                    raise HTTPException(status_code=500, detail=f"Error processing upload: {str(e)}")
            except HTTPException as e:
                # Re-raise HTTP exceptions as they are already properly formatted
                db.rollback()
                raise e
            except Exception as e:
                # Log the error and return a 500 response
                db.rollback()
                import traceback
                tb = traceback.format_exc()
                # Try to log the error
                try:
                    log_violation("UPLOAD_ERROR", request.client.host, uid, f"Error processing upload: {str(e)}\n{tb}")
                except Exception:
                    pass  # If logging fails, continue with the error response
                # Clean up the processed file if it exists
                if 'processed_path' in locals() and processed_path.exists():
                    processed_path.unlink(missing_ok=True)
                raise HTTPException(status_code=500, detail=f"Error processing upload: {str(e)}")
    except HTTPException as e:
        # Re-raise HTTP exceptions as they are already properly formatted
        raise e
    except Exception as e:
        # Catch any other exceptions that might occur outside the main processing block
        import traceback
        tb = traceback.format_exc()
        try:
            log_violation("UPLOAD_ERROR", request.client.host, uid, f"Unhandled error in upload handler: {str(e)}\n{tb}")
        except:
            pass  # If logging fails, continue with the error response
        raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")

def update_public_streams(uid: str, storage_bytes: int, db: Session):
    """Update the public streams list in the database with the latest user upload info"""
    try:
        # Get the user's info - uid is now email-based
        user = db.query(User).filter(User.email == uid).first()
        if not user:
            print(f"[WARNING] User {uid} not found when updating public streams")
            return

        # Try to get existing public stream or create new one
        public_stream = db.query(PublicStream).filter(PublicStream.uid == uid).first()
        if not public_stream:
            public_stream = PublicStream(uid=uid)
            db.add(public_stream)

        # Update the public stream info
        public_stream.username = user.username
        public_stream.storage_bytes = storage_bytes
        public_stream.last_updated = datetime.utcnow()

        # Don't commit here - let the caller handle the transaction
        db.flush()
    except Exception as e:
        # Just log the error and let the caller handle the rollback
        print(f"[ERROR] Error updating public streams: {e}")
        import traceback
        traceback.print_exc()
        raise  # Re-raise to let the caller handle the error
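
For reference, a minimal client-side sketch of how the reworked /upload endpoint could be exercised. This is not part of the diff: the base URL and filename are assumptions, and the uid form field carries the account's email address (the email-based UID) for a confirmed user. The endpoint is rate-limited to 5 requests per minute.

# Sketch only - URL and filename are assumed, not taken from the repository.
import requests

with open("clip.ogg", "rb") as fh:
    resp = requests.post(
        "http://localhost:8000/upload",      # hypothetical local dev URL
        data={"uid": "oib@bubuit.net"},      # email-based UID of a confirmed account
        files={"file": ("clip.ogg", fh, "audio/ogg")},
        timeout=300,                         # allow time for large uploads
    )
resp.raise_for_status()
print(resp.json())                           # includes the original filename and quota usage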