Compare commits

...

16 Commits

Author SHA1 Message Date
oib
988a857be4 upload: set size_bytes=0 at early log creation to satisfy NOT NULL 2025-08-08 09:54:10 +02:00
oib
44a3c06e5e gunicorn: bind to 0.0.0.0:8100 to align with nginx 2025-08-08 08:57:25 +02:00
oib
54b47f6bef migrations: use SQLAlchemy String instead of sqlmodel AutoString in base schema 2025-08-08 08:45:23 +02:00
oib
82e7712632 migrations: add initial base schema; archive legacy migrations 2025-08-08 08:42:54 +02:00
oib
01a89a6129 migrations: move scripts from dev/alembic to tracked alembic/ 2025-08-08 08:22:13 +02:00
oib
90cf7a3fe5 deps: add alembic and gunicorn; recompile requirements.txt 2025-08-08 07:45:45 +02:00
oib
ed43088637 chore(deps): recompile requirements.txt after adding alembic and gunicorn 2025-08-08 07:44:01 +02:00
oib
d4f6c05075 Reorganize development files into dev/ subdirectories
- Move database scripts to dev/scripts/
- Move SQL migrations to dev/migrations/
- Move database backup to dev/db_backups/
- Move docs/ directory to dev/docs/
- Update dev/project_documentation.md with new structure
- Keep active files (concat_opus.py, convert_to_opus.py, list_streams.py, public_streams.txt) in root
2025-08-07 19:56:19 +02:00
oib
72f79b1059 Update authentication system, database models, and UI components 2025-08-07 19:39:22 +02:00
oib
d497492186 feat: Overhaul client-side navigation and clean up project
- Implement a unified SPA routing system in nav.js, removing all legacy and conflicting navigation scripts (router.js, inject-nav.js, fix-nav.js).
- Refactor dashboard.js to delegate all navigation handling to the new nav.js module.
- Create new modular JS files (auth.js, personal-player.js, logger.js) to improve code organization.
- Fix all navigation-related bugs, including guest access and broken footer links.
- Clean up the project root by moving development scripts and backups to a dedicated /dev directory.
- Add a .gitignore file to exclude the database, logs, and other transient files from the repository.
2025-07-28 16:42:46 +02:00
oib
88e468b716 feat: migrate UID system from usernames to email addresses
- Database migration: Updated publicstream.uid from usernames to email addresses
  - devuser → oib@bubuit.net
  - oibchello → oib@chello.at
- Updated related tables (UploadLog, UserQuota) to use email-based UIDs
- Fixed backend audio route to map email UIDs to username-based directories
- Updated SSE event payloads to use email for UID and username for display
- Removed redundant display_name field from SSE events
- Fixed frontend rendering conflicts between nav.js and streams-ui.js
- Updated stream player template to display usernames instead of email addresses
- Added cache-busting parameters to force browser refresh
- Created migration script for future reference

Benefits:
- Eliminates UID duplicates and inconsistency
- Provides stable, unique email-based identifiers
- Maintains user-friendly username display
- Follows proper data normalization practices

Fixes: Stream UI now displays usernames (devuser, oibchello) instead of email addresses
2025-07-27 09:47:38 +02:00
oib
1171510683 Move legacy audio-player.js to dev directory
- audio-player.js was legacy code not used in production
- Actual audio players are in app.js (personal stream) and streams-ui.js (streams page)
- Moving to dev directory to keep production code clean
2025-07-27 09:15:35 +02:00
oib
a9a1c22fee Fix audio player synchronization between streams and personal pages
- Add global audio manager to coordinate playback between different players
- Integrate synchronization into streams-ui.js (streams page player)
- Integrate synchronization into app.js (personal stream player)
- Remove simultaneous playback issues - only one audio plays at a time
- Clean transitions when switching between streams and personal audio

Fixes issue where starting audio on one page didn't stop audio on the other page.
2025-07-27 09:13:55 +02:00
oib
fc4a9c926f Fix upload timeout issue: increase Gunicorn worker timeout to 300s
- Increased timeout from 60s to 300s (5 minutes) for large file uploads
- Added max_requests, max_requests_jitter, and worker_connections settings
- Removed limits on request line and field sizes to handle large uploads
- Also updated Nginx configuration with optimized timeout settings for /upload endpoint

This resolves the 502 Bad Gateway errors that were occurring during large file uploads due to worker timeouts.
2025-07-27 09:00:41 +02:00
oib
f4f712031e Reorganize project structure
- Move development and test files to dev/ directory
- Update .gitignore to exclude development files
- Update paths in configuration files
- Add new audio-player.js for frontend
2025-07-27 07:54:24 +02:00
oib
f6c501030e RC2 2025-07-21 17:39:09 +02:00
67 changed files with 4458 additions and 5092 deletions

79
.gitignore vendored
View File

@@ -1,25 +1,80 @@
# Bytecode-Dateien
# Bytecode files
__pycache__/
*.py[cod]
# Virtuelle Umgebungen
# Virtual environments
.venv/
venv/
# Betriebssystem-Dateien
# System files
.DS_Store
Thumbs.db
# Logfiles und Dumps
# Logs and temporary files
*.log
*.bak
*.swp
*.tmp
# IDEs und Editoren
# Node.js dependencies
node_modules/
package.json
package-lock.json
yarn.lock
# Development documentation
PERFORMANCE-TESTING.md
# Build and distribution
dist/
build/
*.min.js
*.min.css
*.map
# Testing
coverage/
*.test.js
*.spec.js
.nyc_output/
# Environment variables
.env
.env.*
!.env.example
# Debug logs
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Local Database
dicta2stream.db
# Development directory
dev/
# Configuration files
alembic.ini
*.ini
*.conf
*.config
*.yaml
*.yml
*.toml
# IDEs and editors
.vscode/
.idea/
*.sublime-workspace
*.sublime-project
# Local development
.cache/
.temp/
.tmp/
# Project specific
data/*
!data/.gitignore
@@ -28,3 +83,17 @@ log/*
streams/*
!streams/.gitignore
# Test files
tests/**/*.js
!tests/*.test.js
!tests/*.spec.js
!tests/README.md
!tests/profile-auth.js
# Performance test results
performance-results/*
!performance-results/.gitkeep
# Legacy files
public_streams.txt

View File

@@ -3,7 +3,7 @@
from fastapi import APIRouter, Request, HTTPException, Depends
from fastapi.responses import JSONResponse
from sqlmodel import Session, select
from models import User, UserQuota, UploadLog, DBSession
from models import User, UserQuota, UploadLog, DBSession, PublicStream
from database import get_db
import os
from typing import Dict, Any
@@ -11,88 +11,126 @@ from typing import Dict, Any
router = APIRouter(prefix="/api", tags=["account"])
@router.post("/delete-account")
async def delete_account(data: Dict[str, Any], request: Request, db: Session = Depends(get_db)):
async def delete_account(data: Dict[str, Any], request: Request):
try:
# Get UID from request data
uid = data.get("uid")
if not uid:
print(f"[DELETE_ACCOUNT] Error: Missing UID in request data")
# Debug messages disabled
raise HTTPException(status_code=400, detail="Missing UID")
ip = request.client.host
print(f"[DELETE_ACCOUNT] Processing delete request for UID: {uid} from IP: {ip}")
# Debug messages disabled
# Verify user exists and IP matches
user = db.exec(select(User).where(User.username == uid)).first()
if not user:
print(f"[DELETE_ACCOUNT] Error: User {uid} not found")
raise HTTPException(status_code=404, detail="User not found")
# Use the database session context manager
with get_db() as db:
# Handle both email-based and username-based UIDs for backward compatibility
user = None
if user.ip != ip:
print(f"[DELETE_ACCOUNT] Error: IP mismatch. User IP: {user.ip}, Request IP: {ip}")
# First try to find by email (new UID format)
if '@' in uid:
user = db.query(User).filter(User.email == uid).first()
# Debug messages disabled
# If not found by email, try by username (legacy UID format)
if not user:
user = db.query(User).filter(User.username == uid).first()
# Debug messages disabled
if not user:
# Debug messages disabled
raise HTTPException(status_code=404, detail="User not found")
# Extract user attributes while the object is still bound to the session
actual_uid = user.email
user_ip = user.ip
username = user.username
# Debug messages disabled
if user_ip != ip:
# Debug messages disabled
raise HTTPException(status_code=403, detail="Unauthorized: IP address does not match")
# Start transaction
try:
# Delete user's upload logs
uploads = db.exec(select(UploadLog).where(UploadLog.uid == uid)).all()
for upload in uploads:
db.delete(upload)
print(f"[DELETE_ACCOUNT] Deleted {len(uploads)} upload logs for user {uid}")
# Use the database session context manager for all database operations
with get_db() as db:
try:
# Delete user's upload logs (use actual_uid which is always the email)
uploads = db.query(UploadLog).filter(UploadLog.uid == actual_uid).all()
for upload in uploads:
db.delete(upload)
# Debug messages disabled
# Delete user's quota
quota = db.get(UserQuota, uid)
if quota:
db.delete(quota)
print(f"[DELETE_ACCOUNT] Deleted quota for user {uid}")
# Delete user's public streams
streams = db.query(PublicStream).filter(PublicStream.uid == actual_uid).all()
for stream in streams:
db.delete(stream)
# Debug messages disabled
# Delete user's active sessions
sessions = db.exec(select(DBSession).where(DBSession.user_id == uid)).all()
for session in sessions:
db.delete(session)
print(f"[DELETE_ACCOUNT] Deleted {len(sessions)} active sessions for user {uid}")
# Delete user's quota
quota = db.get(UserQuota, actual_uid)
if quota:
db.delete(quota)
# Debug messages disabled
# Delete user account
user_obj = db.get(User, user.email)
if user_obj:
db.delete(user_obj)
print(f"[DELETE_ACCOUNT] Deleted user account {uid} ({user.email})")
# Delete user's active sessions (check both email and username as uid)
sessions_by_email = db.query(DBSession).filter(DBSession.uid == actual_uid).all()
sessions_by_username = db.query(DBSession).filter(DBSession.uid == username).all()
all_sessions = list(sessions_by_email) + list(sessions_by_username)
# Remove duplicates using token (primary key)
unique_sessions = {session.token: session for session in all_sessions}.values()
for session in unique_sessions:
db.delete(session)
# Debug messages disabled
db.commit()
print(f"[DELETE_ACCOUNT] Database changes committed for user {uid}")
# Delete user account
user_obj = db.get(User, actual_uid) # Use actual_uid which is the email
if user_obj:
db.delete(user_obj)
# Debug messages disabled
except Exception as e:
db.rollback()
print(f"[DELETE_ACCOUNT] Database error during account deletion: {str(e)}")
raise HTTPException(status_code=500, detail="Database error during account deletion")
db.commit()
# Debug messages disabled
except Exception as e:
db.rollback()
# Debug messages disabled
# Debug messages disabled
raise HTTPException(status_code=500, detail="Database error during account deletion")
# Delete user's files
try:
user_dir = os.path.join('data', user.username)
# Use the email (actual_uid) for the directory name, which matches how files are stored
user_dir = os.path.join('data', actual_uid)
real_user_dir = os.path.realpath(user_dir)
# Security check to prevent directory traversal
if not real_user_dir.startswith(os.path.realpath('data')):
print(f"[DELETE_ACCOUNT] Security alert: Invalid user directory path: {user_dir}")
# Debug messages disabled
raise HTTPException(status_code=400, detail="Invalid user directory")
if os.path.exists(real_user_dir):
import shutil
shutil.rmtree(real_user_dir, ignore_errors=True)
print(f"[DELETE_ACCOUNT] Deleted user directory: {real_user_dir}")
# Debug messages disabled
else:
print(f"[DELETE_ACCOUNT] User directory not found: {real_user_dir}")
# Debug messages disabled
pass
except Exception as e:
print(f"[DELETE_ACCOUNT] Error deleting user files: {str(e)}")
# Debug messages disabled
# Continue even if file deletion fails, as the account is already deleted from the DB
pass
print(f"[DELETE_ACCOUNT] Successfully deleted account for user {uid}")
# Debug messages disabled
return {"status": "success", "message": "Account and all associated data have been deleted"}
except HTTPException as he:
print(f"[DELETE_ACCOUNT] HTTP Error {he.status_code}: {he.detail}")
# Debug messages disabled
raise
except Exception as e:
print(f"[DELETE_ACCOUNT] Unexpected error: {str(e)}")
# Debug messages disabled
raise HTTPException(status_code=500, detail="An unexpected error occurred")

View File

@@ -5,7 +5,7 @@
# this is typically a path given in POSIX (e.g. forward slashes)
# format, relative to the token %(here)s which refers to the location of this
# ini file
script_location = %(here)s/alembic
script_location = alembic
# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time

View File

@@ -0,0 +1,85 @@
"""initial base schema
Revision ID: 5f0b37b50730
Revises:
Create Date: 2025-08-08 08:42:06.859256
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = '5f0b37b50730'
down_revision: Union[str, Sequence[str], None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Upgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('publicstream',
sa.Column('uid', sa.String(), nullable=False),
sa.Column('username', sa.String(), nullable=True),
sa.Column('storage_bytes', sa.Integer(), nullable=False),
sa.Column('mtime', sa.Integer(), nullable=False),
sa.Column('last_updated', sa.DateTime(), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.Column('updated_at', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('uid')
)
op.create_index(op.f('ix_publicstream_username'), 'publicstream', ['username'], unique=False)
op.create_table('uploadlog',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('uid', sa.String(), nullable=False),
sa.Column('ip', sa.String(), nullable=False),
sa.Column('filename', sa.String(), nullable=True),
sa.Column('processed_filename', sa.String(), nullable=True),
sa.Column('size_bytes', sa.Integer(), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id')
)
op.create_table('user',
sa.Column('token_created', sa.DateTime(), nullable=False),
sa.Column('email', sa.String(), nullable=False),
sa.Column('username', sa.String(), nullable=False),
sa.Column('token', sa.String(), nullable=False),
sa.Column('confirmed', sa.Boolean(), nullable=False),
sa.Column('ip', sa.String(), nullable=False),
sa.PrimaryKeyConstraint('email')
)
op.create_index(op.f('ix_user_username'), 'user', ['username'], unique=True)
op.create_table('userquota',
sa.Column('uid', sa.String(), nullable=False),
sa.Column('storage_bytes', sa.Integer(), nullable=False),
sa.PrimaryKeyConstraint('uid')
)
op.create_table('dbsession',
sa.Column('token', sa.String(), nullable=False),
sa.Column('uid', sa.String(), nullable=False),
sa.Column('ip_address', sa.String(), nullable=False),
sa.Column('user_agent', sa.String(), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.Column('expires_at', sa.DateTime(), nullable=False),
sa.Column('is_active', sa.Boolean(), nullable=False),
sa.Column('last_activity', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['uid'], ['user.email'], ),
sa.PrimaryKeyConstraint('token')
)
# ### end Alembic commands ###
def downgrade() -> None:
"""Downgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('dbsession')
op.drop_table('userquota')
op.drop_index(op.f('ix_user_username'), table_name='user')
op.drop_table('user')
op.drop_table('uploadlog')
op.drop_index(op.f('ix_publicstream_username'), table_name='publicstream')
op.drop_table('publicstream')
# ### end Alembic commands ###
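Not part of the diff — a sketch of applying this base schema through Alembic's Python API instead of the CLI, assuming alembic.ini sits in the working directory (its script_location hunk is shown above) and that env.py honours DATABASE_URL (env.py is not part of this compare).

from alembic import command
from alembic.config import Config

cfg = Config("alembic.ini")   # script_location = alembic, per the alembic.ini hunk above
command.upgrade(cfg, "head")  # applies revision 5f0b37b50730; equivalent to `alembic upgrade head`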

94
auth.py
View File

@@ -1,7 +1,7 @@
"""Authentication middleware and utilities for dicta2stream"""
from fastapi import Request, HTTPException, Depends, status
from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
from sqlmodel import Session
from sqlmodel import Session, select
from typing import Optional
from models import User, Session as DBSession, verify_session
@@ -11,40 +11,39 @@ security = HTTPBearer()
def get_current_user(
request: Request,
db: Session = Depends(get_db),
credentials: HTTPAuthorizationCredentials = Depends(security)
) -> User:
"""Dependency to get the current authenticated user"""
token = credentials.credentials
db_session = verify_session(db, token)
if not db_session:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Invalid or expired session",
headers={"WWW-Authenticate": "Bearer"},
)
# Get the user from the session
user = db.exec(
select(User).where(User.username == db_session.user_id)
).first()
if not user:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="User not found",
headers={"WWW-Authenticate": "Bearer"},
)
# Attach the session to the request state for later use
request.state.session = db_session
return user
# Use the database session context manager
with get_db() as db:
db_session = verify_session(db, token)
if not db_session:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Invalid or expired session",
headers={"WWW-Authenticate": "Bearer"},
)
# Get the user from the session using query interface
user = db.query(User).filter(User.email == db_session.uid).first()
if not user:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="User not found",
headers={"WWW-Authenticate": "Bearer"},
)
# Attach the session to the request state for later use
request.state.session = db_session
return user
def get_optional_user(
request: Request,
db: Session = Depends(get_db),
credentials: Optional[HTTPAuthorizationCredentials] = Depends(security, use_cache=False)
) -> Optional[User]:
"""Dependency that returns the current user if authenticated, None otherwise"""
@@ -52,22 +51,45 @@ def get_optional_user(
return None
try:
return get_current_user(request, db, credentials)
# get_current_user now handles its own database session
return get_current_user(request, credentials)
except HTTPException:
return None
def create_session(db: Session, user: User, request: Request) -> DBSession:
"""Create a new session for the user"""
user_agent = request.headers.get("user-agent")
def create_session(user: User, request: Request) -> DBSession:
"""Create a new session for the user (valid for 24 hours)"""
import secrets
from datetime import datetime, timedelta
user_agent = request.headers.get("user-agent", "")
ip_address = request.client.host if request.client else "0.0.0.0"
session = DBSession.create_for_user(
user_id=user.username,
# Create session token and set 24-hour expiry
session_token = secrets.token_urlsafe(32)
expires_at = datetime.utcnow() + timedelta(hours=24)
# Create the session object
session = DBSession(
token=session_token,
user_id=user.email,
ip_address=ip_address,
user_agent=user_agent
user_agent=user_agent,
expires_at=expires_at,
is_active=True
)
db.add(session)
db.commit()
return session
# Use the database session context manager
with get_db() as db:
try:
db.add(session)
db.commit()
db.refresh(session) # Ensure we have the latest data
return session
except Exception as e:
db.rollback()
# Debug messages disabled
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to create session"
)
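Not part of the diff — an illustrative client call against the HTTPBearer dependency above, using the requests library. The /api/me path is an assumption: it combines the "/me" route in the session router below with the prefix="/api" that main.py passes to include_router, and assumes that router carries no extra prefix of its own.

import requests

def whoami(base_url: str, session_token: str) -> dict:
    # get_current_user() reads the session token from an Authorization: Bearer header
    headers = {"Authorization": f"Bearer {session_token}"}
    resp = requests.get(f"{base_url}/api/me", headers=headers, timeout=10)
    resp.raise_for_status()
    return resp.json()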

View File

@@ -15,7 +15,6 @@ security = HTTPBearer()
async def logout(
request: Request,
response: Response,
db: Session = Depends(get_db),
credentials: HTTPAuthorizationCredentials = Depends(security)
):
"""Log out by invalidating the current session"""
@@ -26,25 +25,28 @@ async def logout(
if not token:
return {"message": "No session to invalidate"}
try:
# Find and invalidate the session
session = db.exec(
select(DBSession)
.where(DBSession.token == token)
.where(DBSession.is_active == True) # noqa: E712
).first()
if session:
try:
session.is_active = False
db.add(session)
db.commit()
except Exception:
db.rollback()
except Exception:
# Continue with logout even if session lookup fails
pass
# Use the database session context manager
with get_db() as db:
try:
# Find and invalidate the session using query interface
session = db.query(DBSession).filter(
DBSession.token == token,
DBSession.is_active == True # noqa: E712
).first()
if session:
try:
session.is_active = False
db.add(session)
db.commit()
except Exception as e:
db.rollback()
# Debug messages disabled
# Continue with logout even if session update fails
except Exception as e:
# Debug messages disabled
# Continue with logout even if session lookup fails
pass
# Clear the session cookie
response.delete_cookie(
@@ -56,7 +58,7 @@ async def logout(
)
# Clear any other auth-related cookies
for cookie_name in ["uid", "authToken", "isAuthenticated", "token"]:
for cookie_name in ["uid", "authToken", "username", "token"]:
response.delete_cookie(
key=cookie_name,
path="/",
@@ -71,15 +73,15 @@ async def logout(
except HTTPException:
# Re-raise HTTP exceptions
raise
except Exception:
except Exception as e:
# Debug messages disabled
# Don't expose internal errors to the client
return {"message": "Logout processed"}
@router.get("/me")
async def get_current_user_info(
current_user: User = Depends(get_current_user),
db: Session = Depends(get_db)
current_user: User = Depends(get_current_user)
):
"""Get current user information"""
return {
@@ -92,15 +94,16 @@ async def get_current_user_info(
@router.get("/sessions")
async def list_sessions(
current_user: User = Depends(get_current_user),
db: Session = Depends(get_db)
current_user: User = Depends(get_current_user)
):
"""List all active sessions for the current user"""
sessions = DBSession.get_active_sessions(db, current_user.username)
return [
{
"id": s.id,
"ip_address": s.ip_address,
# Use the database session context manager
with get_db() as db:
sessions = DBSession.get_active_sessions(db, current_user.username)
return [
{
"id": s.id,
"ip_address": s.ip_address,
"user_agent": s.user_agent,
"created_at": s.created_at.isoformat(),
"last_used_at": s.last_used_at.isoformat(),
@@ -113,26 +116,34 @@ async def list_sessions(
@router.post("/sessions/{session_id}/revoke")
async def revoke_session(
session_id: int,
current_user: User = Depends(get_current_user),
db: Session = Depends(get_db)
current_user: User = Depends(get_current_user)
):
"""Revoke a specific session"""
session = db.get(DBSession, session_id)
if not session or session.user_id != current_user.username:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Session not found"
)
if not session.is_active:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Session is already inactive"
)
session.is_active = False
db.add(session)
db.commit()
return {"message": "Session revoked"}
# Use the database session context manager
with get_db() as db:
session = db.get(DBSession, session_id)
if not session or session.uid != current_user.email:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Session not found"
)
if not session.is_active:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Session is already inactive"
)
try:
session.is_active = False
db.add(session)
db.commit()
return {"message": "Session revoked successfully"}
except Exception as e:
db.rollback()
# Debug messages disabled
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to revoke session"
)

View File

@@ -1,70 +0,0 @@
#!/usr/bin/env python3
"""
Create a silent OPUS audio file with 1 second of silence.
"""
import os
import opuslib
import numpy as np
import struct
# Configuration
SAMPLE_RATE = 48000
CHANNELS = 1
FRAME_SIZE = 960 # 20ms at 48kHz
SILENCE_DURATION = 1.0 # seconds
OUTPUT_FILE = "silent.opus"
# Calculate number of frames needed
num_frames = int((SAMPLE_RATE * SILENCE_DURATION) / (FRAME_SIZE * CHANNELS))
# Initialize Opus encoder
enc = opuslib.Encoder(SAMPLE_RATE, CHANNELS, 'voip')
# Create silent audio data (all zeros)
silent_frame = struct.pack('h' * FRAME_SIZE * CHANNELS, *([0] * FRAME_SIZE * CHANNELS))
# Create Ogg Opus file
with open(OUTPUT_FILE, 'wb') as f:
# Write Ogg header
f.write(b'OggS') # Magic number
f.write(b'\x00') # Version
f.write(b'\x00') # Header type (0 = normal)
f.write(b'\x00\x00\x00\x00\x00\x00\x00\x00') # Granule position
f.write(b'\x00\x00\x00\x00') # Bitstream serial number
f.write(b'\x00\x00\x00\x00') # Page sequence number
f.write(b'\x00\x00\x00\x00') # Checksum
f.write(b'\x01') # Number of segments
f.write(b'\x00') # Segment table (0 = 1 byte segment)
# Write Opus header
f.write(b'OpusHead') # Magic signature
f.write(b'\x01') # Version
f.write(chr(CHANNELS).encode('latin1')) # Channel count
f.write(struct.pack('<H', 80)) # Preskip (80 samples)
f.write(struct.pack('<I', SAMPLE_RATE)) # Input sample rate
f.write(b'\x00\x00') # Output gain
f.write(b'\x00') # Channel mapping family (0 = mono/stereo)
# Write comment header
f.write(b'OpusTags') # Magic signature
f.write(struct.pack('<I', 0)) # Vendor string length (0 for none)
f.write(struct.pack('<I', 0)) # Number of comments (0)
# Encode and write silent frames
for _ in range(num_frames):
# Encode the silent frame
encoded = enc.encode(silent_frame, FRAME_SIZE)
# Write Ogg page
f.write(b'OggS') # Magic number
f.write(b'\x00') # Version
f.write(b'\x00') # Header type (0 = normal)
f.write(struct.pack('<Q', (FRAME_SIZE * _) % (1 << 64))) # Granule position
f.write(b'\x00\x00\x00\x00') # Bitstream serial number
f.write(struct.pack('<I', _ + 2)) # Page sequence number
f.write(b'\x00\x00\x00\x00') # Checksum (0 for now)
f.write(b'\x01') # Number of segments
f.write(chr(len(encoded)).encode('latin1')) # Segment length
f.write(encoded) # The encoded data
print(f"Created silent OPUS file: {OUTPUT_FILE}")

View File

@@ -1,11 +1,33 @@
# database.py — SQLModel engine/session for PostgreSQL
from sqlmodel import create_engine, Session
from sqlmodel import create_engine, Session, SQLModel
from contextlib import contextmanager
import os
POSTGRES_URL = os.getenv("DATABASE_URL", "postgresql://d2s:kuTy4ZKs2VcjgDh6@localhost:5432/dictastream")
engine = create_engine(POSTGRES_URL, echo=False)
# Debug messages disabled
POSTGRES_URL = os.getenv("DATABASE_URL", "postgresql://d2s:kuTy4ZKs2VcjgDh6@localhost:5432/dictastream")
engine = create_engine(POSTGRES_URL, echo=False) # Disable echo for production
# SQLAlchemy Base class for models
Base = SQLModel
@contextmanager
def get_db():
with Session(engine) as session:
"""Session management context manager that ensures proper commit/rollback."""
session = Session(engine)
try:
# Debug messages disabled
yield session
session.commit()
# Debug messages disabled
except Exception as e:
# Debug messages disabled
session.rollback()
raise
finally:
# Debug messages disabled
session.close()
# For backward compatibility
get_db_deprecated = get_db
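Not part of the diff — the usage pattern this refactor standardises on: instead of taking db: Session = Depends(get_db), handlers open the session themselves via the context manager above, so commit, rollback, and close live in one place. PublicStream is the model imported elsewhere in this compare; the counting query is only an example.

from database import get_db
from models import PublicStream

def count_public_streams() -> int:
    with get_db() as db:   # commits on clean exit, rolls back on exception, always closes
        return db.query(PublicStream).count()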

View File

@@ -1,212 +0,0 @@
# deletefile.py — FastAPI route for file deletion
import os
import shutil
from typing import Optional, Dict, Any
from pathlib import Path
from fastapi import APIRouter, HTTPException, Request, Depends, status, Header
from sqlalchemy import select, delete, and_
from sqlalchemy.orm import Session
from database import get_db
from models import UploadLog, UserQuota, User, DBSession
router = APIRouter()
# Use absolute path for security
DATA_ROOT = Path(os.path.abspath("./data"))
def get_current_user(
authorization: str = Header(None, description="Bearer token for authentication"),
db: Session = Depends(get_db)
) -> User:
"""
Get current user from authorization token with enhanced security.
Args:
authorization: The Authorization header containing the Bearer token
db: Database session dependency
Returns:
User: The authenticated user
Raises:
HTTPException: If authentication fails or user not found
"""
if not authorization or not authorization.startswith("Bearer "):
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Authentication required"
)
token = authorization.split(" ")[1]
try:
with Session(db) as session:
# Check if session is valid
session_stmt = select(DBSession).where(
and_(
DBSession.token == token,
DBSession.is_active == True,
DBSession.expires_at > datetime.utcnow()
)
)
db_session = session.exec(session_stmt).first()
if not db_session:
print(f"[DELETE_FILE] Invalid or expired session token")
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Invalid or expired session"
)
# Get the user
user = session.get(User, db_session.user_id)
if not user:
print(f"[DELETE_FILE] User not found for session token")
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="User not found"
)
return user
except Exception as e:
print(f"[DELETE_FILE] Error during user authentication: {str(e)}")
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Error during authentication"
)
@router.delete("/delete/{filename}")
async def delete_file(
request: Request,
filename: str,
db: Session = Depends(get_db),
current_user: User = Depends(get_current_user)
) -> Dict[str, Any]:
"""
Delete a file for the authenticated user with enhanced security and error handling.
Args:
request: The HTTP request object
filename: The name of the file to delete
db: Database session
current_user: The authenticated user
Returns:
Dict: Status and message of the operation
Raises:
HTTPException: If file not found, permission denied, or other errors
"""
print(f"[DELETE_FILE] Processing delete request for file '{filename}' from user {current_user.username}")
try:
# Security: Validate filename to prevent directory traversal
if not filename or any(c in filename for c in ['..', '/', '\\']):
print(f"[DELETE_FILE] Security alert: Invalid filename '{filename}'")
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Invalid filename"
)
# Construct full path with security checks
user_dir = DATA_ROOT / current_user.username
file_path = (user_dir / filename).resolve()
# Security: Ensure the file is within the user's directory
if not file_path.is_relative_to(user_dir.resolve()):
print(f"[DELETE_FILE] Security alert: Attempted path traversal: {file_path}")
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN,
detail="Access denied"
)
# Verify file exists and is a file
if not file_path.exists() or not file_path.is_file():
print(f"[DELETE_FILE] File not found: {file_path}")
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="File not found"
)
# Get file size before deletion for quota update
file_size = file_path.stat().st_size
print(f"[DELETE_FILE] Deleting file: {file_path} (size: {file_size} bytes)")
# Start database transaction
with Session(db) as session:
try:
# Delete the file
try:
os.unlink(file_path)
print(f"[DELETE_FILE] Successfully deleted file: {file_path}")
except OSError as e:
print(f"[DELETE_FILE] Error deleting file: {str(e)}")
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to delete file"
)
# Clean up any associated raw files
raw_pattern = f"raw.*{filename}"
raw_files = list(file_path.parent.glob(raw_pattern))
for raw_file in raw_files:
try:
os.unlink(raw_file)
print(f"[DELETE_FILE] Deleted raw file: {raw_file}")
except OSError as e:
print(f"[DELETE_FILE] Warning: Could not delete raw file {raw_file}: {str(e)}")
# Delete the upload log entry
result = session.execute(
delete(UploadLog).where(
and_(
UploadLog.uid == current_user.username,
UploadLog.processed_filename == filename
)
)
)
if result.rowcount == 0:
print(f"[DELETE_FILE] Warning: No upload log entry found for {filename}")
else:
print(f"[DELETE_FILE] Deleted upload log entry for {filename}")
# Update user quota
quota = session.exec(
select(UserQuota)
.where(UserQuota.uid == current_user.username)
.with_for_update()
).first()
if quota:
new_quota = max(0, quota.storage_bytes - file_size)
print(f"[DELETE_FILE] Updating quota: {quota.storage_bytes} -> {new_quota}")
quota.storage_bytes = new_quota
session.add(quota)
session.commit()
print(f"[DELETE_FILE] Successfully updated database")
return {
"status": "success",
"message": "File deleted successfully",
"bytes_freed": file_size
}
except Exception as e:
session.rollback()
print(f"[DELETE_FILE] Database error: {str(e)}")
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Database error during file deletion"
)
except HTTPException as he:
print(f"[DELETE_FILE] HTTP Error {he.status_code}: {he.detail}")
raise
except Exception as e:
print(f"[DELETE_FILE] Unexpected error: {str(e)}")
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="An unexpected error occurred"
)

View File

@@ -1,40 +0,0 @@
# dev_user.py — Script to create and confirm a dev user for dicta2stream
import os
from sqlmodel import Session
from database import engine
from models import User, UserQuota
from datetime import datetime
import uuid
USERNAME = os.getenv("DEV_USERNAME", "devuser")
EMAIL = os.getenv("DEV_EMAIL", "devuser@localhost")
IP = os.getenv("DEV_IP", "127.0.0.1")
with Session(engine) as session:
user = session.get(User, EMAIL)
if not user:
token = str(uuid.uuid4())
user = User(
email=EMAIL,
username=USERNAME,
token=token,
confirmed=True,
ip=IP,
token_created=datetime.utcnow()
)
session.add(user)
print(f"[INFO] Created new dev user: {USERNAME} with email: {EMAIL}")
else:
user.confirmed = True
user.ip = IP
print(f"[INFO] Existing user found. Marked as confirmed: {USERNAME}")
quota = session.get(UserQuota, USERNAME)
if not quota:
quota = UserQuota(uid=USERNAME, storage_bytes=0)
session.add(quota)
print(f"[INFO] Created quota for user: {USERNAME}")
session.commit()
print(f"[INFO] Dev user ready: {USERNAME} ({EMAIL}) — confirmed, IP={IP}")
print(f"[INFO] To use: set localStorage uid and confirmed_uid to '{USERNAME}' in your browser.")

View File

@@ -1,10 +1,16 @@
bind = "0.0.0.0:8000"
bind = "0.0.0.0:8100"
workers = 2 # Tune based on available CPU cores
worker_class = "uvicorn.workers.UvicornWorker"
timeout = 60
timeout = 300 # Increased from 60 to 300 seconds (5 minutes)
keepalive = 30
loglevel = "info"
accesslog = "-"
errorlog = "-"
proxy_allow_ips = "*"
max_requests = 1000
max_requests_jitter = 50
worker_connections = 1000
limit_request_line = 0 # No limit on request line size
limit_request_field_size = 0 # No limit on field size
limit_request_fields = 100 # Limit number of header fields

View File

@@ -1,94 +0,0 @@
#!/usr/bin/env python3
"""
Script to import stream data from backup file into the publicstream table.
"""
import json
from datetime import datetime
from pathlib import Path
from sqlalchemy import create_engine, select
from sqlalchemy.orm import sessionmaker
from sqlmodel import Session
from models import PublicStream, User, UserQuota, DBSession, UploadLog
from database import engine
# Database connection URL - using the same as in database.py
DATABASE_URL = "postgresql://d2s:kuTy4ZKs2VcjgDh6@localhost:5432/dictastream"
def import_streams_from_backup(backup_file: str):
"""Import stream data from backup file into the database."""
# Set up database connection
SessionLocal = sessionmaker(bind=engine)
with Session(engine) as session:
try:
# Read the backup file
with open(backup_file, 'r') as f:
for line in f:
line = line.strip()
if not line:
continue
try:
# Parse the JSON data
stream_data = json.loads(line)
uid = stream_data.get('uid')
size = stream_data.get('size', 0)
mtime = stream_data.get('mtime', int(datetime.now().timestamp()))
if not uid:
print(f"Skipping invalid entry (missing uid): {line}")
continue
# Check if the stream already exists
existing = session.exec(
select(PublicStream).where(PublicStream.uid == uid)
).first()
now = datetime.utcnow()
if existing:
# Update existing record
existing.size = size
existing.mtime = mtime
existing.updated_at = now
session.add(existing)
print(f"Updated stream: {uid}")
else:
# Create new record
stream = PublicStream(
uid=uid,
size=size,
mtime=mtime,
created_at=now,
updated_at=now
)
session.add(stream)
print(f"Added stream: {uid}")
# Commit after each record to ensure data integrity
session.commit()
except json.JSONDecodeError as e:
print(f"Error parsing line: {line}")
print(f"Error: {e}")
session.rollback()
except Exception as e:
print(f"Error processing line: {line}")
print(f"Error: {e}")
session.rollback()
print("Import completed successfully!")
except Exception as e:
session.rollback()
print(f"Error during import: {e}")
raise
if __name__ == "__main__":
backup_file = "public_streams.txt.backup"
if not Path(backup_file).exists():
print(f"Error: Backup file '{backup_file}' not found.")
exit(1)
print(f"Starting import from {backup_file}...")
import_streams_from_backup(backup_file)

View File

@@ -1,36 +0,0 @@
#!/usr/bin/env python3
"""Initialize the database with required tables"""
import os
import sys
from sqlmodel import SQLModel, create_engine
from dotenv import load_dotenv
# Add the parent directory to the path so we can import our models
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from models import User, UserQuota, UploadLog, PublicStream, Session
def init_db():
"""Initialize the database with required tables"""
# Load environment variables
load_dotenv()
# Get database URL from environment or use default
database_url = os.getenv(
"DATABASE_URL",
"postgresql://postgres:postgres@localhost/dicta2stream"
)
print(f"Connecting to database: {database_url}")
# Create database engine
engine = create_engine(database_url)
# Create all tables
print("Creating database tables...")
SQLModel.metadata.create_all(engine)
print("Database initialized successfully!")
if __name__ == "__main__":
init_db()

View File

@@ -15,7 +15,7 @@ router = APIRouter()
DATA_ROOT = Path("./data")
@router.get("/streams-sse")
async def streams_sse(request: Request, db: Session = Depends(get_db)):
async def streams_sse(request: Request):
# Add CORS headers for SSE
origin = request.headers.get('origin', '')
allowed_origins = ["https://dicta2stream.net", "http://localhost:8000", "http://127.0.0.1:8000"]
@@ -43,15 +43,15 @@ async def streams_sse(request: Request, db: Session = Depends(get_db)):
return Response(status_code=204, headers=headers)
async def event_wrapper():
try:
async for event in list_streams_sse(db):
yield event
except Exception as e:
# Only log errors if DEBUG is enabled
if os.getenv("DEBUG") == "1":
import traceback
traceback.print_exc()
yield f"data: {json.dumps({'error': True, 'message': 'An error occurred'})}\n\n"
# Use the database session context manager
with get_db() as db:
try:
async for event in list_streams_sse(db):
yield event
except Exception as e:
# Only log errors if DEBUG is enabled
# Debug messages disabled
yield f"data: {json.dumps({'error': True, 'message': 'An error occurred'})}\n\n"
return StreamingResponse(
event_wrapper(),
@@ -66,16 +66,39 @@ async def list_streams_sse(db):
yield ":ping\n\n"
# Query all public streams from the database with required fields
stmt = select(PublicStream).order_by(PublicStream.mtime.desc())
result = db.execute(stmt)
streams = result.scalars().all()
# Also get all valid users to filter out orphaned streams
from models import User
# Use the query interface instead of execute
all_streams = db.query(PublicStream).order_by(PublicStream.mtime.desc()).all()
# Get all valid user UIDs (email and username)
all_users = db.query(User).all()
valid_uids = set()
for user in all_users:
valid_uids.add(user.email)
valid_uids.add(user.username)
# Filter out orphaned streams (streams without corresponding user accounts)
streams = []
orphaned_count = 0
for stream in all_streams:
if stream.uid in valid_uids:
streams.append(stream)
else:
orphaned_count += 1
print(f"[STREAMS] Filtering out orphaned stream: {stream.uid} (username: {stream.username})")
if orphaned_count > 0:
print(f"[STREAMS] Filtered out {orphaned_count} orphaned streams from public display")
if not streams:
print("No public streams found in the database")
yield f"data: {json.dumps({'end': True})}\n\n"
return
print(f"Found {len(streams)} public streams in the database")
# Debug messages disabled
# Send each stream as an SSE event
for stream in streams:
@@ -85,54 +108,49 @@ async def list_streams_sse(db):
'uid': stream.uid or '',
'size': stream.storage_bytes or 0,
'mtime': int(stream.mtime) if stream.mtime is not None else 0,
'username': stream.username or stream.uid or '',
'display_name': stream.display_name or stream.username or stream.uid or '',
'username': stream.username or '',
'created_at': stream.created_at.isoformat() if stream.created_at else None,
'updated_at': stream.updated_at.isoformat() if stream.updated_at else None
}
print(f"Sending stream data: {stream_data}")
# Debug messages disabled
yield f"data: {json.dumps(stream_data)}\n\n"
# Small delay to prevent overwhelming the client
await asyncio.sleep(0.1)
except Exception as e:
print(f"Error processing stream {stream.uid}: {str(e)}")
if os.getenv("DEBUG") == "1":
import traceback
traceback.print_exc()
# Debug messages disabled
continue
# Send end of stream marker
print("Finished sending all streams")
# Debug messages disabled
yield f"data: {json.dumps({'end': True})}\n\n"
except Exception as e:
print(f"Error in list_streams_sse: {str(e)}")
if os.getenv("DEBUG") == "1":
import traceback
traceback.print_exc()
# Debug messages disabled
yield f"data: {json.dumps({'error': True, 'message': str(e)})}\n\n"
def list_streams(db: Session = Depends(get_db)):
@router.get("/streams")
def list_streams():
"""List all public streams from the database"""
try:
stmt = select(PublicStream).order_by(PublicStream.mtime.desc())
result = db.execute(stmt)
streams = result.scalars().all()
return {
"streams": [
{
'uid': stream.uid,
'size': stream.size,
'mtime': stream.mtime,
'created_at': stream.created_at.isoformat() if stream.created_at else None,
'updated_at': stream.updated_at.isoformat() if stream.updated_at else None
}
for stream in streams
]
}
except Exception as e:
if os.getenv("DEBUG") == "1":
import traceback
traceback.print_exc()
return {"streams": []}
# Use the database session context manager
with get_db() as db:
try:
# Use the query interface instead of execute
streams = db.query(PublicStream).order_by(PublicStream.mtime.desc()).all()
return {
"streams": [
{
'uid': stream.uid,
'size': stream.size,
'mtime': stream.mtime,
'created_at': stream.created_at.isoformat() if stream.created_at else None,
'updated_at': stream.updated_at.isoformat() if stream.updated_at else None
}
for stream in streams
]
}
except Exception as e:
# Debug messages disabled
return {"streams": []}

View File

@@ -1,23 +0,0 @@
# list_user_files.py
from fastapi import APIRouter, Depends, HTTPException
from pathlib import Path
from models import User
from database import get_db
router = APIRouter()
@router.get("/user-files/{uid}")
def list_user_files(uid: str, db = Depends(get_db)):
# Check user exists and is confirmed
from sqlmodel import select
user = db.exec(select(User).where((User.username == uid) | (User.email == uid))).first()
if user is not None and not isinstance(user, User) and hasattr(user, "__getitem__"):
user = user[0]
if not user or not user.confirmed:
raise HTTPException(status_code=403, detail="Account not confirmed")
user_dir = Path("data") / uid
if not user_dir.exists() or not user_dir.is_dir():
return {"files": []}
files = [f.name for f in user_dir.iterdir() if f.is_file() and not f.name.startswith(".")]
files.sort()
return {"files": files}

3
log.py
View File

@@ -15,5 +15,6 @@ def log_violation(event: str, ip: str, uid: str, reason: str):
f.write(log_entry)
# If DEBUG mode, also print to stdout
if os.getenv("DEBUG", "0") in ("1", "true", "True"): # Set DEBUG=1 in .env to enable
print(f"[DEBUG] {log_entry.strip()}")
# Debug messages disabled
pass

137
magic.py
View File

@@ -12,58 +12,107 @@ import json
router = APIRouter()
@router.post("/magic-login")
async def magic_login(request: Request, response: Response, db: Session = Depends(get_db), token: str = Form(...)):
print(f"[magic-login] Received token: {token}")
user = db.exec(select(User).where(User.token == token)).first()
print(f"[magic-login] User lookup: {'found' if user else 'not found'}")
if not user:
print("[magic-login] Invalid or expired token")
return RedirectResponse(url="/?error=Invalid%20or%20expired%20token", status_code=302)
if datetime.utcnow() - user.token_created > timedelta(minutes=30):
print(f"[magic-login] Token expired for user: {user.username}")
return RedirectResponse(url="/?error=Token%20expired", status_code=302)
# Mark user as confirmed if not already
if not user.confirmed:
user.confirmed = True
user.ip = request.client.host
db.add(user)
print(f"[magic-login] User {user.username} confirmed.")
# Create a new session for the user (valid for 1 hour)
session_token = secrets.token_urlsafe(32)
expires_at = datetime.utcnow() + timedelta(hours=1)
async def magic_login(request: Request, response: Response, token: str = Form(...)):
# Debug messages disabled
# Create new session
session = DBSession(
token=session_token,
user_id=user.username,
ip_address=request.client.host or "",
user_agent=request.headers.get("user-agent", ""),
expires_at=expires_at,
is_active=True
# Use the database session context manager
with get_db() as db:
try:
# Look up user by token
user = db.query(User).filter(User.token == token).first()
# Debug messages disabled
if not user:
# Debug messages disabled
raise HTTPException(status_code=401, detail="Invalid or expired token")
if datetime.utcnow() - user.token_created > timedelta(minutes=30):
# Debug messages disabled
raise HTTPException(status_code=401, detail="Token expired")
# Mark user as confirmed if not already
if not user.confirmed:
user.confirmed = True
user.ip = request.client.host
db.add(user)
# Debug messages disabled
# Create a new session for the user (valid for 24 hours)
session_token = secrets.token_urlsafe(32)
expires_at = datetime.utcnow() + timedelta(hours=24)
# Create new session
session = DBSession(
token=session_token,
uid=user.email or user.username, # Use email as UID
ip_address=request.client.host or "",
user_agent=request.headers.get("user-agent", ""),
expires_at=expires_at,
is_active=True
)
db.add(session)
db.commit()
# Store user data for use after the session is committed
user_email = user.email or user.username
username = user.username
except Exception as e:
db.rollback()
# Debug messages disabled
# Debug messages disabled
raise HTTPException(status_code=500, detail="Database error during login")
# Determine if we're running in development (localhost) or production
is_localhost = request.url.hostname == "localhost"
# Prepare response data
response_data = {
"success": True,
"message": "Login successful",
"user": {
"email": user_email,
"username": username
},
"token": session_token # Include the token in the JSON response
}
# Create the response
response = JSONResponse(
content=response_data,
status_code=200
)
db.add(session)
db.commit()
# Set cookie with the session token (valid for 1 hour)
# Set cookies
response.set_cookie(
key="sessionid",
value=session_token,
httponly=True,
secure=not request.url.hostname == "localhost",
samesite="lax",
max_age=3600, # 1 hour
secure=not is_localhost,
samesite="lax" if is_localhost else "none",
max_age=86400, # 24 hours
path="/"
)
print(f"[magic-login] Session created for user: {user.username}")
# Redirect to success page
return RedirectResponse(
url=f"/?login=success&confirmed_uid={user.username}",
status_code=302,
headers=dict(response.headers)
response.set_cookie(
key="uid",
value=user_email,
samesite="lax" if is_localhost else "none",
secure=not is_localhost,
max_age=86400, # 24 hours
path="/"
)
response.set_cookie(
key="authToken",
value=session_token,
samesite="lax" if is_localhost else "none",
secure=not is_localhost,
max_age=86400, # 24 hours
path="/"
)
# Debug messages disabled
# Debug messages disabled
# Debug messages disabled
return response
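Not part of the diff — a client sketch of the reshaped magic-login flow above, using the requests library: the emailed token is posted as form data, and on success the JSON body now carries the session token alongside the sessionid/uid/authToken cookies. The base URL is an assumption; the /magic-login path matches the route decorator and the router is included without a prefix in main.py.

import requests

def magic_login(base_url: str, magic_token: str) -> dict:
    # /magic-login takes the token as a form field and returns JSON instead of a redirect
    resp = requests.post(f"{base_url}/magic-login", data={"token": magic_token}, timeout=15)
    resp.raise_for_status()
    payload = resp.json()            # {"success": True, "user": {...}, "token": "<session token>"}
    return {"session_token": payload["token"], "cookies": resp.cookies.get_dict()}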

674
main.py
View File

@@ -90,9 +90,32 @@ def get_current_user(request: Request, db: Session = Depends(get_db)):
from range_response import range_response
@app.get("/audio/{uid}/{filename}")
def get_audio(uid: str, filename: str, request: Request, db: Session = Depends(get_db)):
def get_audio(uid: str, filename: str, request: Request):
# Allow public access ONLY to stream.opus
user_dir = os.path.join("data", uid)
# Use the database session context manager
with get_db() as db:
try:
# Use email-based UID directly for file system access
# If UID contains @, it's an email - use it directly
if '@' in uid:
from models import User
user = db.query(User).filter(User.email == uid).first()
if not user:
raise HTTPException(status_code=404, detail="User not found")
filesystem_uid = uid # Use email directly for directory
else:
# Legacy support for username-based UIDs - convert to email
from models import User
user = db.query(User).filter(User.username == uid).first()
if not user:
raise HTTPException(status_code=404, detail="User not found")
filesystem_uid = user.email # Convert username to email for directory
except Exception as e:
db.rollback()
raise HTTPException(status_code=500, detail=f"Database error: {str(e)}")
user_dir = os.path.join("data", filesystem_uid)
file_path = os.path.join(user_dir, filename)
real_user_dir = os.path.realpath(user_dir)
real_file_path = os.path.realpath(file_path)
@@ -114,7 +137,8 @@ def get_audio(uid: str, filename: str, request: Request, db: Session = Depends(g
return FileResponse(real_file_path, media_type="audio/ogg")
if debug_mode:
print("[DEBUG] FastAPI running in debug mode.")
# Debug messages disabled
pass
# Global error handler to always return JSON
from slowapi.errors import RateLimitExceeded
@@ -166,7 +190,7 @@ from register import router as register_router
from magic import router as magic_router
from upload import router as upload_router
from streams import router as streams_router
from list_user_files import router as list_user_files_router
from auth_router import router as auth_router
app.include_router(streams_router)
@@ -175,14 +199,100 @@ from list_streams import router as list_streams_router
from account_router import router as account_router
# Include all routers
app.include_router(auth_router)
app.include_router(auth_router, prefix="/api")
app.include_router(account_router)
app.include_router(register_router)
app.include_router(magic_router)
app.include_router(upload_router)
app.include_router(list_user_files_router)
app.include_router(list_streams_router)
@app.get("/user-files/{uid}")
async def list_user_files(uid: str):
from pathlib import Path
# Get the user's directory and check for files first
user_dir = Path("data") / uid
if not user_dir.exists() or not user_dir.is_dir():
return {"files": []}
# Get all files that actually exist on disk
existing_files = {f.name for f in user_dir.iterdir() if f.is_file()}
# Use the database session context manager for all database operations
with get_db() as db:
# Verify the user exists
user_check = db.query(User).filter((User.username == uid) | (User.email == uid)).first()
if not user_check:
raise HTTPException(status_code=404, detail="User not found")
# Query the UploadLog table for this user
all_upload_logs = db.query(UploadLog).filter(
UploadLog.uid == uid
).order_by(UploadLog.created_at.desc()).all()
# Track processed files to avoid duplicates
processed_files = set()
files_metadata = []
for log in all_upload_logs:
# Skip if no processed filename
if not log.processed_filename:
continue
# Skip if we've already processed this file
if log.processed_filename in processed_files:
continue
# Skip stream.opus from uploads list (it's a special file)
if log.processed_filename == 'stream.opus':
continue
# Skip if file doesn't exist on disk
# Files are stored with the pattern: {upload_id}_{processed_filename}
expected_filename = f"{log.id}_{log.processed_filename}"
if expected_filename not in existing_files:
# Only delete records older than 5 minutes to avoid race conditions
from datetime import datetime, timedelta
cutoff_time = datetime.utcnow() - timedelta(minutes=5)
if log.created_at < cutoff_time:
print(f"[CLEANUP] Removing orphaned DB record (older than 5min): {expected_filename}")
db.delete(log)
continue
# Add to processed files to avoid duplicates
processed_files.add(log.processed_filename)
# Always use the original filename if present
display_name = log.filename if log.filename else log.processed_filename
# Only include files that exist on disk
# Files are stored with the pattern: {upload_id}_{processed_filename}
stored_filename = f"{log.id}_{log.processed_filename}"
file_path = user_dir / stored_filename
if file_path.exists() and file_path.is_file():
try:
# Get the actual file size in case it changed
actual_size = file_path.stat().st_size
files_metadata.append({
"original_name": display_name,
"stored_name": log.processed_filename,
"size": actual_size
})
except OSError:
# If we can't access the file, skip it
continue
# Commit any database changes (deletions of non-existent files)
try:
db.commit()
except Exception as e:
print(f"[ERROR] Failed to commit database changes: {e}")
db.rollback()
return {"files": files_metadata}
# Serve static files
app.mount("/static", StaticFiles(directory="static"), name="static")
@@ -245,9 +355,9 @@ def serve_me():
@app.get("/admin/stats")
def admin_stats(request: Request, db: Session = Depends(get_db)):
from sqlmodel import select
users = db.exec(select(User)).all()
users = db.query(User).all()
users_count = len(users)
total_quota = db.exec(select(UserQuota)).all()
total_quota = db.query(UserQuota).all()
total_quota_sum = sum(q.storage_bytes for q in total_quota)
violations_log = 0
try:
@@ -279,10 +389,224 @@ def debug(request: Request):
MAX_QUOTA_BYTES = 100 * 1024 * 1024
# Delete account endpoint has been moved to account_router.py
# Delete account endpoint - fallback implementation since account_router.py has loading issues
@app.post("/api/delete-account")
async def delete_account_fallback(request: Request, db: Session = Depends(get_db)):
try:
# Get request data
data = await request.json()
uid = data.get("uid")
if not uid:
raise HTTPException(status_code=400, detail="Missing UID")
ip = request.client.host
# Debug messages disabled
# Find user by email or username
user = None
if '@' in uid:
user = db.exec(select(User).where(User.email == uid)).first()
if not user:
user = db.exec(select(User).where(User.username == uid)).first()
# If still not found, check if this UID exists in upload logs and try to find the associated user
if not user:
# Look for upload logs with this UID to find the real user
upload_log = db.exec(select(UploadLog).where(UploadLog.uid == uid)).first()
if upload_log:
# Try to find a user that might be associated with this UID
# Check if there's a user with the same IP or similar identifier
all_users = db.exec(select(User)).all()
for potential_user in all_users:
# Use the first confirmed user as fallback (for orphaned UIDs)
if potential_user.confirmed:
user = potential_user
# Debug messages disabled
break
if not user:
# Debug messages disabled
raise HTTPException(status_code=404, detail="User not found")
if user.ip != ip:
raise HTTPException(status_code=403, detail="Unauthorized: IP address does not match")
# Delete user data from database using the original UID
# The original UID is what's stored in the database records
# Delete upload logs for all possible UIDs (original UID, email, username)
upload_logs_to_delete = []
# Check for upload logs with original UID
upload_logs_original = db.query(UploadLog).filter(UploadLog.uid == uid).all()
if upload_logs_original:
# Debug messages disabled
upload_logs_to_delete.extend(upload_logs_original)
# Check for upload logs with user email
upload_logs_email = db.query(UploadLog).filter(UploadLog.uid == user.email).all()
if upload_logs_email:
# Debug messages disabled
upload_logs_to_delete.extend(upload_logs_email)
# Check for upload logs with username
upload_logs_username = db.query(UploadLog).filter(UploadLog.uid == user.username).all()
if upload_logs_username:
# Debug messages disabled
upload_logs_to_delete.extend(upload_logs_username)
# Delete all found upload log records
for log in upload_logs_to_delete:
try:
db.delete(log)
except Exception as e:
# Debug messages disabled
pass
# Debug messages disabled
# Delete user quota for both the original UID and user email (to cover all cases)
quota_original = db.get(UserQuota, uid)
if quota_original:
# Debug messages disabled
db.delete(quota_original)
quota_email = db.get(UserQuota, user.email)
if quota_email:
# Debug messages disabled
db.delete(quota_email)
# Delete user sessions
sessions = db.query(DBSession).filter(DBSession.user_id == user.username).all()
# Debug messages disabled
for session in sessions:
db.delete(session)
# Delete public stream entries for all possible UIDs
# Use select() instead of get() to find all matching records
public_streams_to_delete = []
# Check for public stream with original UID
public_stream_original = db.query(PublicStream).filter(PublicStream.uid == uid).first()
if public_stream_original:
# Debug messages disabled
public_streams_to_delete.append(public_stream_original)
# Check for public stream with user email
public_stream_email = db.query(PublicStream).filter(PublicStream.uid == user.email).first()
if public_stream_email:
# Debug messages disabled
public_streams_to_delete.append(public_stream_email)
# Check for public stream with username
public_stream_username = db.query(PublicStream).filter(PublicStream.uid == user.username).first()
if public_stream_username:
# Debug messages disabled
public_streams_to_delete.append(public_stream_username)
# Delete all found public stream records
for ps in public_streams_to_delete:
try:
# Debug messages disabled
db.delete(ps)
except Exception as e:
# Debug messages disabled
pass
# Debug messages disabled
# Delete user directory BEFORE deleting user record - check all possible locations
import shutil
# Try to delete directory with UID (email) - current standard
uid_dir = os.path.join('data', uid)
if os.path.exists(uid_dir):
# Debug messages disabled
shutil.rmtree(uid_dir, ignore_errors=True)
# Also try to delete directory with email (in case of different UID formats)
email_dir = os.path.join('data', user.email)
if os.path.exists(email_dir) and email_dir != uid_dir:
# Debug messages disabled
shutil.rmtree(email_dir, ignore_errors=True)
# Also try to delete directory with username (legacy format)
username_dir = os.path.join('data', user.username)
if os.path.exists(username_dir) and username_dir != uid_dir and username_dir != email_dir:
# Debug messages disabled
shutil.rmtree(username_dir, ignore_errors=True)
# Delete user account AFTER directory cleanup
db.delete(user)
db.commit()
# Debug messages disabled
return {"status": "success", "message": "Account deleted successfully"}
except HTTPException:
raise
except Exception as e:
# Debug messages disabled
db.rollback()
raise HTTPException(status_code=500, detail=f"Failed to delete account: {str(e)}")
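For reference, a hedged client-side sketch of exercising this fallback endpoint (the base URL and email are placeholders; the server only honors the request when the caller's IP matches the one stored for the user):
import requests  # assumed to be available in the caller's environment

resp = requests.post(
    "https://dicta2stream.net/api/delete-account",
    json={"uid": "user@example.com"},  # email-based UID
)
# Expected on success: 200 with {"status": "success", "message": "Account deleted successfully"}
print(resp.status_code, resp.json())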
# Cleanup endpoint for orphaned public streams
@app.post("/api/cleanup-streams")
async def cleanup_orphaned_streams(request: Request, db: Session = Depends(get_db)):
try:
# Get request data
data = await request.json()
admin_secret = data.get("admin_secret")
# Verify admin access
if admin_secret != ADMIN_SECRET:
raise HTTPException(status_code=403, detail="Unauthorized")
# Find orphaned public streams (streams without corresponding user accounts)
all_streams = db.query(PublicStream).all()
all_users = db.query(User).all()
# Create sets of valid UIDs from user accounts
valid_uids = set()
for user in all_users:
valid_uids.add(user.email)
valid_uids.add(user.username)
orphaned_streams = []
for stream in all_streams:
if stream.uid not in valid_uids:
orphaned_streams.append(stream)
# Delete orphaned streams
deleted_count = 0
for stream in orphaned_streams:
try:
print(f"[CLEANUP] Deleting orphaned stream: {stream.uid} (username: {stream.username})")
db.delete(stream)
deleted_count += 1
except Exception as e:
print(f"[CLEANUP] Error deleting stream {stream.uid}: {e}")
db.commit()
print(f"[CLEANUP] Deleted {deleted_count} orphaned public streams")
return {
"status": "success",
"message": f"Deleted {deleted_count} orphaned public streams",
"deleted_streams": [s.uid for s in orphaned_streams]
}
except HTTPException:
raise
except Exception as e:
print(f"[CLEANUP] Error: {str(e)}")
db.rollback()
raise HTTPException(status_code=500, detail=f"Cleanup failed: {str(e)}")
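A similar hedged sketch for the admin-only cleanup endpoint; the secret shown here is a placeholder for whatever ADMIN_SECRET is configured to on the server:
import requests

resp = requests.post(
    "https://dicta2stream.net/api/cleanup-streams",
    json={"admin_secret": "replace-with-real-secret"},  # must equal ADMIN_SECRET on the server
)
# Expected shape: {"status": "success", "message": "...", "deleted_streams": ["..."]}
print(resp.json())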
# Original delete account endpoint has been moved to account_router.py
@app.delete("/uploads/{uid}/{filename}")
async def delete_file(uid: str, filename: str, request: Request, db: Session = Depends(get_db)):
async def delete_file(uid: str, filename: str, request: Request):
"""
Delete a file for a specific user.
@@ -306,26 +630,84 @@ async def delete_file(uid: str, filename: str, request: Request, db: Session = D
if user.ip != ip:
raise HTTPException(status_code=403, detail="Device/IP mismatch. Please log in again.")
# Set up user directory and validate paths
user_dir = os.path.join('data', user.username)
# Set up user directory using email (matching upload logic)
user_dir = os.path.join('data', user.email)
os.makedirs(user_dir, exist_ok=True)
# Decode URL-encoded filename
from urllib.parse import unquote
filename = unquote(filename)
# Debug: Print the user info and filename being used
# Debug messages disabled
# Debug messages disabled
# Debug messages disabled
# Debug messages disabled
if os.path.exists(user_dir):
# Debug messages disabled
pass
# Construct and validate target path
target_path = os.path.join(user_dir, filename)
real_target_path = os.path.realpath(target_path)
real_user_dir = os.path.realpath(user_dir)
# Debug: Print the constructed paths
# Debug messages disabled
# Debug messages disabled
# Debug messages disabled
# Security check: Ensure the target path is inside the user's directory
if not real_target_path.startswith(real_user_dir + os.sep):
# Debug messages disabled
raise HTTPException(status_code=403, detail="Invalid file path")
# Check if file exists
if not os.path.isfile(real_target_path):
raise HTTPException(status_code=404, detail=f"File not found: {filename}")
# Debug: List files in the directory to help diagnose the issue
try:
# Debug messages disabled
# Debug messages disabled
# Debug messages disabled
if os.path.exists(real_user_dir):
files_in_dir = os.listdir(real_user_dir)
# Debug messages disabled
# Print detailed file info
for f in files_in_dir:
full_path = os.path.join(real_user_dir, f)
try:
# Debug messages disabled
pass
except Exception as e:
# Debug messages disabled
pass
# Debug messages disabled
# Debug messages disabled
# Debug messages disabled
# Try to find a matching file (case-insensitive, partial match)
matching_files = [f for f in files_in_dir if filename.lower() in f.lower()]
if matching_files:
# Debug messages disabled
# Use the first matching file
real_target_path = os.path.join(real_user_dir, matching_files[0])
# Debug messages disabled
# Debug messages disabled
else:
# Debug messages disabled
raise HTTPException(status_code=404, detail=f"File not found: {filename}")
else:
# Debug messages disabled
raise HTTPException(status_code=404, detail=f"User directory not found")
except HTTPException:
raise
except Exception as e:
# Debug messages disabled
raise HTTPException(status_code=404, detail=f"File not found: {filename}")
# Delete both the target file and its UUID-only variant
deleted_files = []
@@ -364,20 +746,23 @@ async def delete_file(uid: str, filename: str, request: Request, db: Session = D
# Clean up the database record for this file
try:
# Find and delete the upload log entry
log_entry = db.exec(
select(UploadLog)
.where(UploadLog.uid == uid)
.where(UploadLog.processed_filename == filename)
).first()
if log_entry:
db.delete(log_entry)
db.commit()
log_violation("DB_CLEANUP", ip, uid, f"Removed DB record for {filename}")
with get_db() as db:
try:
# Find and delete the upload log entry
log_entry = db.query(UploadLog).filter(
UploadLog.uid == uid,
UploadLog.processed_filename == filename
).first()
if log_entry:
db.delete(log_entry)
db.commit()
log_violation("DB_CLEANUP", ip, uid, f"Removed DB record for {filename}")
except Exception as e:
db.rollback()
raise e
except Exception as e:
log_violation("DB_CLEANUP_ERROR", ip, uid, f"Failed to clean up DB record: {str(e)}")
db.rollback()
# Regenerate stream.opus after file deletion
try:
@@ -392,14 +777,17 @@ async def delete_file(uid: str, filename: str, request: Request, db: Session = D
# Update user quota in a separate try-except to not fail the entire operation
try:
# Use verify_and_fix_quota to ensure consistency between disk and DB
total_size = verify_and_fix_quota(db, user.username, user_dir)
log_violation("QUOTA_UPDATE", ip, uid,
f"Updated quota: {total_size} bytes")
with get_db() as db:
try:
# Use verify_and_fix_quota to ensure consistency between disk and DB
total_size = verify_and_fix_quota(db, user.username, user_dir)
log_violation("QUOTA_UPDATE", ip, uid,
f"Updated quota: {total_size} bytes")
except Exception as e:
db.rollback()
raise e
except Exception as e:
log_violation("QUOTA_ERROR", ip, uid, f"Quota update failed: {str(e)}")
db.rollback()
return {"status": "deleted"}
@@ -431,11 +819,13 @@ def verify_and_fix_quota(db: Session, uid: str, user_dir: str) -> int:
if os.path.isfile(stream_opus_path):
try:
total_size = os.path.getsize(stream_opus_path)
print(f"[QUOTA] Stream.opus size for {uid}: {total_size} bytes")
# Debug messages disabled
except (OSError, FileNotFoundError) as e:
print(f"[QUOTA] Error getting size for stream.opus: {e}")
# Debug messages disabled
pass
else:
print(f"[QUOTA] stream.opus not found in {user_dir}")
# Debug messages disabled
pass
# Update quota in database
q = db.get(UserQuota, uid) or UserQuota(uid=uid, storage_bytes=0)
@@ -443,123 +833,143 @@ def verify_and_fix_quota(db: Session, uid: str, user_dir: str) -> int:
db.add(q)
# Clean up any database records for files that don't exist
uploads = db.exec(select(UploadLog).where(UploadLog.uid == uid)).all()
# BUT only for records older than 5 minutes to avoid race conditions with recent uploads
from datetime import datetime, timedelta
cutoff_time = datetime.utcnow() - timedelta(minutes=5)
uploads = db.query(UploadLog).filter(
UploadLog.uid == uid,
UploadLog.created_at < cutoff_time # Only check older records
).all()
for upload in uploads:
if upload.processed_filename: # Only check if processed_filename exists
stored_filename = f"{upload.id}_{upload.processed_filename}"
file_path = os.path.join(user_dir, stored_filename)
if not os.path.isfile(file_path):
print(f"[QUOTA] Removing orphaned DB record: {stored_filename}")
# Debug messages disabled
db.delete(upload)
try:
db.commit()
print(f"[QUOTA] Updated quota for {uid}: {total_size} bytes")
# Debug messages disabled
except Exception as e:
print(f"[QUOTA] Error committing quota update: {e}")
# Debug messages disabled
db.rollback()
raise
return total_size
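A hedged call-site sketch for the helper above, assuming the module's get_db() context manager and MAX_QUOTA_BYTES constant are in scope; the email is a placeholder:
uid = "user@example.com"
user_dir = os.path.join("data", uid)
with get_db() as db:
    used_bytes = verify_and_fix_quota(db, uid, user_dir)
print(f"{uid}: {used_bytes / (1024 * 1024):.2f} MB of {MAX_QUOTA_BYTES / (1024 * 1024):.0f} MB used")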
@app.get("/me/{uid}")
def get_me(uid: str, request: Request, response: Response, db: Session = Depends(get_db)):
def get_me(uid: str, request: Request, response: Response):
# Add headers to prevent caching
response.headers["Cache-Control"] = "no-cache, no-store, must-revalidate"
response.headers["Pragma"] = "no-cache"
response.headers["Expires"] = "0"
print(f"[DEBUG] GET /me/{uid} - Client IP: {request.client.host}")
try:
# Get user info
user = get_user_by_uid(uid)
if not user:
print(f"[ERROR] User with UID {uid} not found")
raise HTTPException(status_code=404, detail="User not found")
# Only enforce IP check in production
if not debug_mode:
if user.ip != request.client.host:
print(f"[WARNING] IP mismatch for UID {uid}: {request.client.host} != {user.ip}")
# In production, we might want to be more strict
# But for now, we'll just log a warning in development
if not debug_mode:
raise HTTPException(status_code=403, detail="IP address mismatch")
# Get user directory
user_dir = os.path.join('data', uid)
os.makedirs(user_dir, exist_ok=True)
# Get all upload logs for this user
upload_logs = db.exec(
select(UploadLog)
.where(UploadLog.uid == uid)
.order_by(UploadLog.created_at.desc())
).all()
print(f"[DEBUG] Found {len(upload_logs)} upload logs for UID {uid}")
# Build file list from database records, checking if files exist on disk
files = []
seen_files = set() # Track seen files to avoid duplicates
print(f"[DEBUG] Processing {len(upload_logs)} upload logs for UID {uid}")
for i, log in enumerate(upload_logs):
if not log.filename or not log.processed_filename:
print(f"[DEBUG] Skipping log entry {i}: missing filename or processed_filename")
continue
# The actual filename on disk has the log ID prepended
stored_filename = f"{log.id}_{log.processed_filename}"
file_path = os.path.join(user_dir, stored_filename)
# Debug messages disabled
# Use the database session context manager for all database operations
with get_db() as db:
try:
# Get user info
user = db.query(User).filter((User.username == uid) | (User.email == uid)).first()
if not user:
print(f"[ERROR] User with UID {uid} not found")
raise HTTPException(status_code=404, detail="User not found")
# Skip if we've already seen this file
if stored_filename in seen_files:
print(f"[DEBUG] Skipping duplicate file: {stored_filename}")
continue
seen_files.add(stored_filename)
# Only include the file if it exists on disk and is not stream.opus
if os.path.isfile(file_path) and stored_filename != 'stream.opus':
try:
# Get the actual file size in case it changed
file_size = os.path.getsize(file_path)
file_info = {
"name": stored_filename,
"original_name": log.filename,
"size": file_size
}
files.append(file_info)
print(f"[DEBUG] Added file {len(files)}: {log.filename} (stored as {stored_filename}, {file_size} bytes)")
except OSError as e:
print(f"[WARNING] Could not access file {stored_filename}: {e}")
else:
print(f"[DEBUG] File not found on disk or is stream.opus: {stored_filename}")
# Log all files being returned
print("[DEBUG] All files being returned:")
for i, file_info in enumerate(files, 1):
print(f" {i}. {file_info['name']} (original: {file_info['original_name']}, size: {file_info['size']} bytes)")
# Verify and fix quota based on actual files on disk
total_size = verify_and_fix_quota(db, uid, user_dir)
quota_mb = round(total_size / (1024 * 1024), 2)
print(f"[DEBUG] Verified quota for UID {uid}: {quota_mb} MB")
# Only enforce IP check in production
if not debug_mode:
if user.ip != request.client.host:
print(f"[WARNING] IP mismatch for UID {uid}: {request.client.host} != {user.ip}")
# In production, we might want to be more strict
if not debug_mode:
raise HTTPException(status_code=403, detail="IP address mismatch")
response_data = {
"files": files,
"quota": quota_mb
}
print(f"[DEBUG] Returning {len(files)} files and quota info")
return response_data
except HTTPException:
# Re-raise HTTP exceptions as they are
raise
except Exception as e:
# Log the full traceback for debugging
import traceback
error_trace = traceback.format_exc()
print(f"[ERROR] Error in /me/{uid} endpoint: {str(e)}\n{error_trace}")
# Return a 500 error with a generic message
raise HTTPException(status_code=500, detail="Internal server error")
# Get user directory
user_dir = os.path.join('data', uid)
os.makedirs(user_dir, exist_ok=True)
# Get all upload logs for this user using the query interface
upload_logs = db.query(UploadLog).filter(
UploadLog.uid == uid
).order_by(UploadLog.created_at.desc()).all()
# Debug messages disabled
# Build file list from database records, checking if files exist on disk
files = []
seen_files = set() # Track seen files to avoid duplicates
# Debug messages disabled
for i, log in enumerate(upload_logs):
if not log.filename or not log.processed_filename:
# Debug messages disabled
continue
# The actual filename on disk has the log ID prepended
stored_filename = f"{log.id}_{log.processed_filename}"
file_path = os.path.join(user_dir, stored_filename)
# Skip if we've already seen this file
if stored_filename in seen_files:
# Debug messages disabled
continue
seen_files.add(stored_filename)
# Only include the file if it exists on disk and is not stream.opus
if os.path.isfile(file_path) and stored_filename != 'stream.opus':
try:
# Get the actual file size in case it changed
file_size = os.path.getsize(file_path)
file_info = {
"name": stored_filename,
"original_name": log.filename,
"size": file_size
}
files.append(file_info)
# Debug messages disabled
except OSError as e:
print(f"[WARNING] Could not access file {stored_filename}: {e}")
else:
# Debug messages disabled
pass
# Log all files being returned
# Debug messages disabled
# for i, file_info in enumerate(files, 1):
# print(f" {i}. {file_info['name']} (original: {file_info['original_name']}, size: {file_info['size']} bytes)")
# Verify and fix quota based on actual files on disk
total_size = verify_and_fix_quota(db, uid, user_dir)
quota_mb = round(total_size / (1024 * 1024), 2)
max_quota_mb = round(MAX_QUOTA_BYTES / (1024 * 1024), 2)
# Debug messages disabled
response_data = {
"files": files,
"quota": {
"used": quota_mb,
"max": max_quota_mb,
"used_bytes": total_size,
"max_bytes": MAX_QUOTA_BYTES,
"percentage": round((total_size / MAX_QUOTA_BYTES) * 100, 2) if MAX_QUOTA_BYTES > 0 else 0
}
}
# Debug messages disabled
return response_data
except HTTPException:
# Re-raise HTTP exceptions as they are
raise
except Exception as e:
# Log the full traceback for debugging
import traceback
error_trace = traceback.format_exc()
print(f"[ERROR] Error in /me/{uid} endpoint: {str(e)}\n{error_trace}")
# Rollback any database changes in case of error
db.rollback()
# Return a 500 error with a generic message
raise HTTPException(status_code=500, detail="Internal server error")
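For clarity, an illustrative (invented) response for GET /me/{uid}, with the quota fields computed the same way as in the handler above under the 100 MiB cap:
example_response = {
    "files": [
        {"name": "42_voice-note.opus", "original_name": "voice-note.opus", "size": 1_048_576},
    ],
    "quota": {
        "used": 1.0,              # MB, rounded to two decimals
        "max": 100.0,             # MAX_QUOTA_BYTES / 1024 / 1024
        "used_bytes": 1_048_576,
        "max_bytes": 104_857_600,
        "percentage": 1.0,        # round(used_bytes / max_bytes * 100, 2)
    },
}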


@@ -1,67 +0,0 @@
"""Add session and public_stream tables
Revision ID: 0002
Revises: 0001
Create Date: 2023-04-01 00:00:00.000000
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
# revision identifiers, used by Alembic.
revision = '0002'
down_revision = '0001'
branch_labels = None
depends_on = None
def upgrade():
# Create public_stream table
op.create_table(
'public_stream',
sa.Column('uid', sa.String(), nullable=False, comment='User ID of the stream owner'),
sa.Column('filename', sa.String(), nullable=False, comment='Name of the audio file'),
sa.Column('size', sa.BigInteger(), nullable=False, comment='File size in bytes'),
sa.Column('mtime', sa.Float(), nullable=False, comment='Last modified time as Unix timestamp'),
sa.Column('created_at', sa.DateTime(), server_default=sa.text('now()'), nullable=False),
sa.Column('updated_at', sa.DateTime(), server_default=sa.text('now()'), nullable=False, onupdate=sa.text('now()')),
sa.PrimaryKeyConstraint('uid', 'filename')
)
# Create session table
op.create_table(
'session',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('user_id', sa.String(), nullable=False, index=True, comment='Reference to user.username'),
sa.Column('token', sa.Text(), nullable=False, index=True, comment='Random session token'),
sa.Column('ip_address', sa.String(45), nullable=False, comment='IP address of the client'),
sa.Column('user_agent', sa.Text(), nullable=True, comment='User-Agent header from the client'),
sa.Column('created_at', sa.DateTime(), server_default=sa.text('now()'), nullable=False),
sa.Column('expires_at', sa.DateTime(), nullable=False, comment='When the session expires'),
sa.Column('last_used_at', sa.DateTime(), server_default=sa.text('now()'), nullable=False, onupdate=sa.text('now()')),
sa.Column('is_active', sa.Boolean(), server_default=sa.text('true'), nullable=False, comment='Whether the session is active'),
sa.ForeignKeyConstraint(['user_id'], ['user.username'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('id')
)
# Create indexes
op.create_index('ix_session_user_id', 'session', ['user_id'], unique=False)
op.create_index('ix_session_token', 'session', ['token'], unique=True)
op.create_index('ix_session_expires_at', 'session', ['expires_at'], unique=False)
op.create_index('ix_session_last_used_at', 'session', ['last_used_at'], unique=False)
op.create_index('ix_public_stream_uid', 'public_stream', ['uid'], unique=False)
op.create_index('ix_public_stream_updated_at', 'public_stream', ['updated_at'], unique=False)
def downgrade():
# Drop indexes first
op.drop_index('ix_session_user_id', table_name='session')
op.drop_index('ix_session_token', table_name='session')
op.drop_index('ix_session_expires_at', table_name='session')
op.drop_index('ix_session_last_used_at', table_name='session')
op.drop_index('ix_public_stream_uid', table_name='public_stream')
op.drop_index('ix_public_stream_updated_at', table_name='public_stream')
# Drop tables
op.drop_table('session')
op.drop_table('public_stream')


@@ -1,24 +0,0 @@
"""Add processed_filename to UploadLog
Revision ID: add_processed_filename_to_uploadlog
Revises:
Create Date: 2025-06-28 13:20:00.000000
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'add_processed_filename_to_uploadlog'
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
# Add the processed_filename column to the uploadlog table
op.add_column('uploadlog',
sa.Column('processed_filename', sa.String(), nullable=True))
def downgrade():
# Remove the processed_filename column if rolling back
op.drop_column('uploadlog', 'processed_filename')
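Revisions like the two above are normally applied through Alembic. A minimal sketch using its Python API, where the script location and database URL are assumptions for illustration:
from alembic import command
from alembic.config import Config

cfg = Config()
cfg.set_main_option("script_location", "alembic")
cfg.set_main_option("sqlalchemy.url", "postgresql://postgres:postgres@localhost/dicta2stream")
command.upgrade(cfg, "head")    # apply all pending revisions
# command.downgrade(cfg, "-1")  # roll back the most recent revision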


@@ -9,7 +9,6 @@ class User(SQLModel, table=True):
token_created: datetime = Field(default_factory=datetime.utcnow)
email: str = Field(primary_key=True)
username: str = Field(unique=True, index=True)
display_name: str = Field(default="", nullable=True)
token: str
confirmed: bool = False
ip: str = Field(default="")
@@ -32,7 +31,7 @@ class UploadLog(SQLModel, table=True):
class DBSession(SQLModel, table=True):
token: str = Field(primary_key=True)
user_id: str = Field(foreign_key="user.username")
uid: str = Field(foreign_key="user.email") # This references User.email (primary key)
ip_address: str
user_agent: str
created_at: datetime = Field(default_factory=datetime.utcnow)
@@ -45,7 +44,6 @@ class PublicStream(SQLModel, table=True):
"""Stores public stream metadata for all users"""
uid: str = Field(primary_key=True)
username: Optional[str] = Field(default=None, index=True)
display_name: Optional[str] = Field(default=None)
storage_bytes: int = 0
mtime: int = Field(default_factory=lambda: int(datetime.utcnow().timestamp()))
last_updated: Optional[datetime] = Field(default_factory=datetime.utcnow)
@@ -55,26 +53,26 @@ class PublicStream(SQLModel, table=True):
def get_user_by_uid(uid: str) -> Optional[User]:
"""
Retrieve a user by their UID (username).
Retrieve a user by their UID (email).
Note: In this application, the User model uses email as primary key,
but we're using username as UID for API routes. This function looks up
users by username.
Note: In this application, UIDs are consistently email-based.
The User model uses email as primary key, and all user references
throughout the system use email format.
Args:
uid: The username to look up
uid: The email to look up
Returns:
User object if found, None otherwise
"""
with Session(engine) as session:
# First try to find by username (which is what we're using as UID)
statement = select(User).where(User.username == uid)
# Primary lookup by email (which is what we're using as UID)
statement = select(User).where(User.email == uid)
user = session.exec(statement).first()
# If not found by username, try by email (for backward compatibility)
if not user and '@' in uid:
statement = select(User).where(User.email == uid)
# Fallback: try by username for legacy compatibility
if not user and '@' not in uid:
statement = select(User).where(User.username == uid)
user = session.exec(statement).first()
return user
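A hedged usage sketch of the lookup above (placeholder identifiers): email UIDs are the primary path, usernames only a legacy fallback.
user = get_user_by_uid("user@example.com")   # primary: email-based UID
legacy = get_user_by_uid("devuser")          # fallback: legacy username UID
if user:
    print(user.username, user.email, user.confirmed)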
@@ -85,11 +83,10 @@ def verify_session(db: Session, token: str) -> DBSession:
from datetime import datetime
# Find the session
session = db.exec(
select(DBSession)
.where(DBSession.token == token)
.where(DBSession.is_active == True) # noqa: E712
.where(DBSession.expires_at > datetime.utcnow())
session = db.query(DBSession).filter(
DBSession.token == token,
DBSession.is_active == True, # noqa: E712
DBSession.expires_at > datetime.utcnow()
).first()
if not session:


@@ -1,4 +0,0 @@
INFO: Will watch for changes in these directories: ['/home/oib/games/dicta2stream']
ERROR: [Errno 98] Address already in use
INFO: Will watch for changes in these directories: ['/home/oib/games/dicta2stream']
ERROR: [Errno 98] Address already in use


@@ -1,2 +0,0 @@
{"uid":"oibchello","size":3371119,"mtime":1752994076}
{"uid":"orangeicebear","size":1734396,"mtime":1748767975}


@@ -1,3 +0,0 @@
{"uid":"devuser","size":90059327,"mtime":1752911461}
{"uid":"oibchello","size":16262818,"mtime":1752911899}
{"uid":"orangeicebear","size":1734396,"mtime":1748767975}


@@ -16,27 +16,27 @@ MAGIC_FROM = "noreply@dicta2stream.net"
MAGIC_DOMAIN = "https://dicta2stream.net"
DATA_ROOT = Path("./data")
def initialize_user_directory(username: str):
def initialize_user_directory(uid: str):
"""Initialize user directory with a silent stream.opus file"""
try:
user_dir = DATA_ROOT / username
user_dir = DATA_ROOT / uid
default_stream_path = DATA_ROOT / "stream.opus"
print(f"[DEBUG] Initializing user directory: {user_dir.absolute()}")
# Debug messages disabled
# Create the directory if it doesn't exist
user_dir.mkdir(parents=True, exist_ok=True)
print(f"[DEBUG] Directory created or already exists: {user_dir.exists()}")
# Debug messages disabled
# Create stream.opus by copying the default stream.opus file
user_stream_path = user_dir / "stream.opus"
print(f"[DEBUG] Creating stream.opus at: {user_stream_path.absolute()}")
# Debug messages disabled
if not user_stream_path.exists():
if default_stream_path.exists():
import shutil
shutil.copy2(default_stream_path, user_stream_path)
print(f"[DEBUG] Copied default stream.opus to {user_stream_path}")
# Debug messages disabled
else:
print(f"[ERROR] Default stream.opus not found at {default_stream_path}")
# Fallback: create an empty file to prevent errors
@@ -45,71 +45,108 @@ def initialize_user_directory(username: str):
return True
except Exception as e:
print(f"Error initializing user directory for {username}: {str(e)}")
print(f"Error initializing user directory for {uid}: {str(e)}")
return False
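A hedged usage sketch for the directory bootstrap above; the email is a placeholder and DATA_ROOT is assumed to already contain the default silent stream.opus:
if initialize_user_directory("user@example.com"):
    # Expected layout afterwards:
    #   data/user@example.com/stream.opus   (copied from data/stream.opus)
    print("user directory ready")
else:
    print("directory initialization failed")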
@router.post("/register")
def register(request: Request, email: str = Form(...), user: str = Form(...), db: Session = Depends(get_db)):
def register(request: Request, email: str = Form(...), user: str = Form(...)):
from sqlalchemy.exc import IntegrityError
# Try to find user by email or username
existing_user = db.get(User, email)
if not existing_user:
# Try by username (since username is not primary key, need to query)
stmt = select(User).where(User.username == user)
existing_user = db.exec(stmt).first()
token = str(uuid.uuid4())
if existing_user:
# Update token, timestamp, and ip, set confirmed False
from datetime import datetime
existing_user.token = token
existing_user.token_created = datetime.utcnow()
existing_user.confirmed = False
existing_user.ip = request.client.host
db.add(existing_user)
from datetime import datetime
# Use the database session context manager
with get_db() as db:
try:
db.commit()
except Exception as e:
db.rollback()
raise HTTPException(status_code=500, detail=f"Database error: {e}")
else:
# Register new user
db.add(User(email=email, username=user, token=token, confirmed=False, ip=request.client.host))
db.add(UserQuota(uid=user))
try:
# First commit the user to the database
db.commit()
# Check if user exists by email
existing_user_by_email = db.get(User, email)
# Check if user exists by username
existing_user_by_username = db.query(User).filter(User.username == user).first()
token = str(uuid.uuid4())
action = None
# Case 1: Email and username match in db - it's a login
if existing_user_by_email and existing_user_by_username and existing_user_by_email.email == existing_user_by_username.email:
# Update token for existing user (login)
existing_user_by_email.token = token
existing_user_by_email.token_created = datetime.utcnow()
existing_user_by_email.confirmed = False
existing_user_by_email.ip = request.client.host
db.add(existing_user_by_email)
db.commit()
action = "login"
# Case 2: Email matches but username does not - only one account per email
elif existing_user_by_email and (not existing_user_by_username or existing_user_by_email.email != existing_user_by_username.email):
raise HTTPException(status_code=409, detail="📧 This email is already registered with a different username.\nOnly one account per email is allowed.")
# Case 3: Email does not match but username is in db - username already taken
elif not existing_user_by_email and existing_user_by_username:
raise HTTPException(status_code=409, detail="👤 This username is already taken.\nPlease choose a different username.")
# Case 4: Neither email nor username exist - create new user
elif not existing_user_by_email and not existing_user_by_username:
# Register new user
new_user = User(email=email, username=user, token=token, confirmed=False, ip=request.client.host)
new_quota = UserQuota(uid=email) # Use email as UID for quota tracking
db.add(new_user)
db.add(new_quota)
db.commit()
action = "register"
# Initialize user directory after successful registration
if not initialize_user_directory(email):
print(f"[WARNING] Failed to initialize user directory for {email}")
# If we get here, we've either logged in or registered successfully
if action not in ["login", "register"]:
raise HTTPException(status_code=400, detail="Invalid registration request")
# Store the email for use after the session is committed
user_email = email
# Only after successful commit, initialize the user directory
initialize_user_directory(user)
initialize_user_directory(email)
except Exception as e:
db.rollback()
if isinstance(e, IntegrityError):
# Race condition: user created after our check
# Try again as login
stmt = select(User).where((User.email == email) | (User.username == user))
existing_user = db.exec(stmt).first()
if existing_user:
existing_user.token = token
existing_user.confirmed = False
existing_user.ip = request.client.host
db.add(existing_user)
db.commit()
# Check which constraint was violated to provide specific feedback
error_str = str(e).lower()
if 'username' in error_str or 'user_username_key' in error_str:
raise HTTPException(status_code=409, detail="👤 This username is already taken.\nPlease choose a different username.")
elif 'email' in error_str or 'user_pkey' in error_str:
raise HTTPException(status_code=409, detail="📧 This email is already registered with a different username.\nOnly one account per email is allowed.")
else:
raise HTTPException(status_code=409, detail="Username or email already exists.")
# Generic fallback if we can't determine the specific constraint
raise HTTPException(status_code=409, detail="⚠️ Registration failed due to a conflict.\nPlease try again with different credentials.")
else:
raise HTTPException(status_code=500, detail=f"Database error: {e}")
# Send magic link
msg = EmailMessage()
msg["From"] = MAGIC_FROM
msg["To"] = email
msg["Subject"] = "Your magic login link"
msg.set_content(
f"Hello {user},\n\nClick to confirm your account:\n{MAGIC_DOMAIN}/?token={token}\n\nThis link is valid for one-time login."
)
# Send magic link with appropriate message based on action
msg = EmailMessage()
msg["From"] = MAGIC_FROM
msg["To"] = email
if action == "login":
msg["Subject"] = "Your magic login link"
msg.set_content(
f"Hello {user},\n\nClick to log in to your account:\n{MAGIC_DOMAIN}/?token={token}\n\nThis link is valid for one-time login."
)
response_message = "📧 Check your email for a magic login link!"
else: # registration
msg["Subject"] = "Welcome to dicta2stream - Confirm your account"
msg.set_content(
f"Hello {user},\n\nWelcome to dicta2stream! Click to confirm your new account:\n{MAGIC_DOMAIN}/?token={token}\n\nThis link is valid for one-time confirmation."
)
response_message = "🎉 Account created! Check your email for a magic login link!"
try:
with smtplib.SMTP("localhost") as smtp:
smtp.send_message(msg)
except Exception as e:
raise HTTPException(status_code=500, detail=f"Email failed: {e}")
return { "message": "Confirmation sent" }
return {"message": response_message, "action": action}


@@ -12,3 +12,5 @@ uvicorn==0.34.2
uvloop==0.21.0
watchfiles==1.0.5
websockets==15.0.1
alembic
gunicorn


@@ -4,6 +4,8 @@
#
# pip-compile requirements.in
#
alembic==1.16.4
# via -r requirements.in
annotated-types==0.6.0
# via pydantic
anyio==4.2.0
@@ -18,6 +20,8 @@ fastapi==0.115.12
# via -r requirements.in
greenlet==3.2.1
# via sqlalchemy
gunicorn==23.0.0
# via -r requirements.in
h11==0.14.0
# via uvicorn
httptools==0.6.4
@@ -26,8 +30,14 @@ idna==3.4
# via anyio
limits==3.2.0
# via slowapi
mako==1.3.10
# via alembic
markupsafe==3.0.2
# via mako
packaging==23.0
# via limits
# via
# gunicorn
# limits
psycopg2-binary==2.9.10
# via -r requirements.in
pydantic==2.6.0
@@ -47,13 +57,16 @@ slowapi==0.1.9
sniffio==1.3.0
# via anyio
sqlalchemy==2.0.40
# via sqlmodel
# via
# alembic
# sqlmodel
sqlmodel==0.0.24
# via -r requirements.in
starlette==0.46.1
# via fastapi
typing-extensions==4.13.2
# via
# alembic
# fastapi
# limits
# pydantic


@@ -1,29 +0,0 @@
#!/usr/bin/env python3
"""Run database migrations"""
import os
import sys
from alembic.config import Config
from alembic import command
from dotenv import load_dotenv
# Load environment variables
load_dotenv()
def run_migrations():
# Get database URL from environment or use default
database_url = os.getenv(
"DATABASE_URL",
"postgresql://postgres:postgres@localhost/dicta2stream"
)
# Set up Alembic config
alembic_cfg = Config()
alembic_cfg.set_main_option("script_location", "migrations")
alembic_cfg.set_main_option("sqlalchemy.url", database_url)
# Run migrations
command.upgrade(alembic_cfg, "head")
print("Database migrations completed successfully.")
if __name__ == "__main__":
run_migrations()

File diff suppressed because it is too large

static/audio-player.js (new file, 636 lines)

@@ -0,0 +1,636 @@
/**
* Audio Player Module
* A shared audio player implementation based on the working "Your Stream" player
*/
import { globalAudioManager } from './global-audio-manager.js';
export class AudioPlayer {
constructor() {
// Audio state
this.audioElement = null;
this.currentUid = null;
this.isPlaying = false;
this.currentButton = null;
this.audioUrl = '';
this.lastPlayTime = 0;
this.isLoading = false;
this.loadTimeout = null; // For tracking loading timeouts
this.retryCount = 0;
this.maxRetries = 3;
this.retryDelay = 3000; // 3 seconds
this.buffering = false;
this.bufferRetryTimeout = null;
this.lastLoadTime = 0;
this.minLoadInterval = 2000; // 2 seconds between loads
this.pendingLoad = false;
// Create a single audio element that we'll reuse
this.audioElement = new Audio();
this.audioElement.preload = 'none';
this.audioElement.crossOrigin = 'anonymous';
// Bind methods
this.loadAndPlay = this.loadAndPlay.bind(this);
this.stop = this.stop.bind(this);
this.cleanup = this.cleanup.bind(this);
this.handlePlayError = this.handlePlayError.bind(this);
this.handleStalled = this.handleStalled.bind(this);
this.handleWaiting = this.handleWaiting.bind(this);
this.handlePlaying = this.handlePlaying.bind(this);
this.handleEnded = this.handleEnded.bind(this);
// Set up event listeners
this.setupEventListeners();
// Register with global audio manager to handle stop requests from other players
globalAudioManager.addListener('personal', () => {
console.log('[audio-player] Received stop request from global audio manager');
this.stop();
});
}
/**
* Load and play audio for a specific UID
* @param {string} uid - The user ID for the audio stream
* @param {HTMLElement} button - The play/pause button element
*/
/**
* Validates that a UID is in the correct UUID format
* @param {string} uid - The UID to validate
* @returns {boolean} True if valid, false otherwise
*/
isValidUuid(uid) {
// UUID v4 format: xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx
const uuidRegex = /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;
return uuidRegex.test(uid);
}
/**
* Logs an error and updates the button state
* @param {HTMLElement} button - The button to update
* @param {string} message - Error message to log
*/
handleError(button, message) {
console.error(message);
if (button) {
this.updateButtonState(button, 'error');
}
}
async loadAndPlay(uid, button) {
const now = Date.now();
// Prevent rapid successive load attempts
if (this.pendingLoad || (now - this.lastLoadTime < this.minLoadInterval)) {
console.log('[AudioPlayer] Skipping duplicate load request');
return;
}
// Validate UID exists and is in correct format
if (!uid) {
this.handleError(button, 'No UID provided for audio playback');
return;
}
// For logging purposes
const requestId = Math.random().toString(36).substr(2, 8);
console.log(`[AudioPlayer] Load request ${requestId} for UID: ${uid}`);
this.pendingLoad = true;
this.lastLoadTime = now;
// If we're in the middle of loading, check if it's for the same UID
if (this.isLoading) {
// If same UID, ignore duplicate request
if (this.currentUid === uid) {
console.log(`[AudioPlayer] Already loading this UID, ignoring duplicate request: ${uid}`);
this.pendingLoad = false;
return;
}
// If different UID, queue the new request
console.log(`[AudioPlayer] Already loading, queuing request for UID: ${uid}`);
setTimeout(() => {
this.pendingLoad = false;
this.loadAndPlay(uid, button);
}, 500);
return;
}
// If we're in the middle of loading, check if it's for the same UID
if (this.isLoading) {
// If same UID, ignore duplicate request
if (this.currentUid === uid) {
console.log('Already loading this UID, ignoring duplicate request:', uid);
return;
}
// If different UID, queue the new request
console.log('Already loading, queuing request for UID:', uid);
setTimeout(() => this.loadAndPlay(uid, button), 500);
return;
}
// If already playing this stream, just toggle pause/play
if (this.currentUid === uid && this.audioElement) {
try {
if (this.isPlaying) {
console.log('Pausing current playback');
try {
this.audioElement.pause();
this.lastPlayTime = this.audioElement.currentTime;
this.isPlaying = false;
this.updateButtonState(button, 'paused');
} catch (pauseError) {
console.warn('Error pausing audio, continuing with state update:', pauseError);
this.isPlaying = false;
this.updateButtonState(button, 'paused');
}
} else {
console.log('Resuming playback from time:', this.lastPlayTime);
try {
// If we have a last play time, seek to it
if (this.lastPlayTime > 0) {
this.audioElement.currentTime = this.lastPlayTime;
}
await this.audioElement.play();
this.isPlaying = true;
this.updateButtonState(button, 'playing');
} catch (playError) {
console.error('Error resuming playback, reloading source:', playError);
// If resume fails, try reloading the source
this.currentUid = null; // Force reload of the source
return this.loadAndPlay(uid, button);
}
}
return; // Exit after handling pause/resume
} catch (error) {
console.error('Error toggling playback:', error);
this.updateButtonState(button, 'error');
return;
}
}
// If we get here, we're loading a new stream
this.isLoading = true;
this.currentUid = uid;
this.currentButton = button;
this.isPlaying = true;
this.updateButtonState(button, 'loading');
// Notify global audio manager that personal player is starting
globalAudioManager.startPlayback('personal', uid);
try {
// Only clean up if switching streams
if (this.currentUid !== uid) {
this.cleanup();
}
// Store the current button reference
this.currentButton = button;
this.currentUid = uid;
// Create a new audio element if we don't have one
if (!this.audioElement) {
this.audioElement = new Audio();
} else if (this.audioElement.readyState > 0) {
// If we already have a loaded source, just play it
try {
await this.audioElement.play();
this.isPlaying = true;
this.updateButtonState(button, 'playing');
return;
} catch (playError) {
console.warn('Error playing existing source, will reload:', playError);
// Continue to load a new source
}
}
// Clear any existing sources
while (this.audioElement.firstChild) {
this.audioElement.removeChild(this.audioElement.firstChild);
}
// Set the source URL with proper encoding and cache-busting timestamp
// Using the format: /audio/{uid}/stream.opus?t={timestamp}
// Only update timestamp if we're loading a different UID or after a retry
const timestamp = this.retryCount > 0 ? new Date().getTime() : this.lastLoadTime;
this.audioUrl = `/audio/${encodeURIComponent(uid)}/stream.opus?t=${timestamp}`;
console.log(`[AudioPlayer] Loading audio from URL: ${this.audioUrl} (attempt ${this.retryCount + 1}/${this.maxRetries})`);
console.log('Loading audio from URL:', this.audioUrl);
this.audioElement.src = this.audioUrl;
// Load the new source (don't await, let canplay handle it)
try {
this.audioElement.load();
// If load() doesn't throw, we'll wait for canplay event
} catch (e) {
// Ignore abort errors as they're expected during rapid toggling
if (e.name !== 'AbortError') {
console.error('Error loading audio source:', e);
this.isLoading = false;
this.updateButtonState(button, 'error');
}
}
// Reset the current time when loading a new source
this.audioElement.currentTime = 0;
this.lastPlayTime = 0;
// Set up error handling
this.audioElement.onerror = (e) => {
console.error('Audio element error:', e, this.audioElement.error);
this.isLoading = false;
this.updateButtonState(button, 'error');
};
// Handle when audio is ready to play
const onCanPlay = () => {
this.audioElement.removeEventListener('canplay', onCanPlay);
this.isLoading = false;
if (this.lastPlayTime > 0) {
this.audioElement.currentTime = this.lastPlayTime;
}
this.audioElement.play().then(() => {
this.isPlaying = true;
this.updateButtonState(button, 'playing');
}).catch(e => {
console.error('Error playing after load:', e);
this.updateButtonState(button, 'error');
});
};
// Define the error handler
const errorHandler = (e) => {
console.error('Audio element error:', e, this.audioElement.error);
this.isLoading = false;
this.updateButtonState(button, 'error');
};
// Define the play handler
const playHandler = () => {
// Clear any pending timeouts
if (this.loadTimeout) {
clearTimeout(this.loadTimeout);
this.loadTimeout = null;
}
this.audioElement.removeEventListener('canplay', playHandler);
this.isLoading = false;
if (this.lastPlayTime > 0) {
this.audioElement.currentTime = this.lastPlayTime;
}
this.audioElement.play().then(() => {
this.isPlaying = true;
this.updateButtonState(button, 'playing');
}).catch(e => {
console.error('Error playing after load:', e);
this.isPlaying = false;
this.updateButtonState(button, 'error');
});
};
// Add event listeners
this.audioElement.addEventListener('error', errorHandler, { once: true });
this.audioElement.addEventListener('canplay', playHandler, { once: true });
// Load and play the new source
try {
await this.audioElement.load();
// Don't await play() here, let the canplay handler handle it
// Set a timeout to handle cases where canplay doesn't fire
this.loadTimeout = setTimeout(() => {
if (this.isLoading) {
console.warn('Audio loading timed out for UID:', uid);
this.isLoading = false;
this.updateButtonState(button, 'error');
}
}, 10000); // 10 second timeout
} catch (e) {
console.error('Error loading audio:', e);
this.isLoading = false;
this.updateButtonState(button, 'error');
// Clear any pending timeouts
if (this.loadTimeout) {
clearTimeout(this.loadTimeout);
this.loadTimeout = null;
}
}
} catch (error) {
console.error('Error in loadAndPlay:', error);
// Only cleanup and show error if we're still on the same track
if (this.currentUid === uid) {
this.cleanup();
this.updateButtonState(button, 'error');
}
}
}
/**
* Stop playback and clean up resources
*/
stop() {
try {
if (this.audioElement) {
console.log('Stopping audio playback');
this.audioElement.pause();
this.lastPlayTime = this.audioElement.currentTime;
this.isPlaying = false;
// Notify global audio manager that personal player has stopped
globalAudioManager.stopPlayback('personal');
if (this.currentButton) {
this.updateButtonState(this.currentButton, 'paused');
}
}
} catch (error) {
console.error('Error stopping audio:', error);
// Don't throw, just log the error
}
}
/**
* Set up event listeners for the audio element
*/
setupEventListeners() {
if (!this.audioElement) return;
// Remove any existing listeners to prevent duplicates
this.audioElement.removeEventListener('error', this.handlePlayError);
this.audioElement.removeEventListener('stalled', this.handleStalled);
this.audioElement.removeEventListener('waiting', this.handleWaiting);
this.audioElement.removeEventListener('playing', this.handlePlaying);
this.audioElement.removeEventListener('ended', this.handleEnded);
// Add new listeners
this.audioElement.addEventListener('error', this.handlePlayError);
this.audioElement.addEventListener('stalled', this.handleStalled);
this.audioElement.addEventListener('waiting', this.handleWaiting);
this.audioElement.addEventListener('playing', this.handlePlaying);
this.audioElement.addEventListener('ended', this.handleEnded);
}
/**
* Handle play errors
*/
handlePlayError(event) {
console.error('[AudioPlayer] Playback error:', {
event: event.type,
error: this.audioElement.error,
currentTime: this.audioElement.currentTime,
readyState: this.audioElement.readyState,
networkState: this.audioElement.networkState,
src: this.audioElement.src
});
this.isPlaying = false;
this.buffering = false;
this.pendingLoad = false;
if (this.currentButton) {
this.updateButtonState(this.currentButton, 'error');
}
// Auto-retry logic
if (this.retryCount < this.maxRetries) {
this.retryCount++;
console.log(`Retrying playback (attempt ${this.retryCount}/${this.maxRetries})...`);
setTimeout(() => {
if (this.currentUid && this.currentButton) {
this.loadAndPlay(this.currentUid, this.currentButton);
}
}, this.retryDelay);
} else {
console.error('Max retry attempts reached');
this.retryCount = 0; // Reset for next time
}
}
/**
* Handle stalled audio (buffering issues)
*/
handleStalled() {
console.log('[AudioPlayer] Playback stalled, attempting to recover...');
this.buffering = true;
if (this.bufferRetryTimeout) {
clearTimeout(this.bufferRetryTimeout);
}
this.bufferRetryTimeout = setTimeout(() => {
if (this.buffering) {
console.log('[AudioPlayer] Buffer recovery timeout, attempting to reload...');
if (this.currentUid && this.currentButton) {
// Only retry if we're still supposed to be playing
if (this.isPlaying) {
this.retryCount++;
if (this.retryCount <= this.maxRetries) {
console.log(`[AudioPlayer] Retry ${this.retryCount}/${this.maxRetries} for UID: ${this.currentUid}`);
this.loadAndPlay(this.currentUid, this.currentButton);
} else {
console.error('[AudioPlayer] Max retry attempts reached');
this.retryCount = 0;
this.updateButtonState(this.currentButton, 'error');
}
}
}
}
}, 5000); // 5 second buffer recovery timeout
}
/**
* Handle waiting event (buffering)
*/
handleWaiting() {
console.log('Audio waiting for data...');
this.buffering = true;
if (this.currentButton) {
this.updateButtonState(this.currentButton, 'loading');
}
}
/**
* Handle playing event (playback started/resumed)
*/
handlePlaying() {
console.log('Audio playback started/resumed');
this.buffering = false;
this.retryCount = 0; // Reset retry counter on successful playback
if (this.bufferRetryTimeout) {
clearTimeout(this.bufferRetryTimeout);
this.bufferRetryTimeout = null;
}
if (this.currentButton) {
this.updateButtonState(this.currentButton, 'playing');
}
}
/**
* Handle ended event (playback completed)
*/
handleEnded() {
console.log('Audio playback ended');
this.isPlaying = false;
this.buffering = false;
if (this.currentButton) {
this.updateButtonState(this.currentButton, 'paused');
}
}
/**
* Clean up resources
*/
cleanup() {
// Clear any pending timeouts
if (this.loadTimeout) {
clearTimeout(this.loadTimeout);
this.loadTimeout = null;
}
if (this.bufferRetryTimeout) {
clearTimeout(this.bufferRetryTimeout);
this.bufferRetryTimeout = null;
}
// Update button state if we have a reference to the current button
if (this.currentButton) {
this.updateButtonState(this.currentButton, 'paused');
}
// Pause the audio and store the current time
if (this.audioElement) {
try {
// Remove event listeners to prevent memory leaks
this.audioElement.removeEventListener('error', this.handlePlayError);
this.audioElement.removeEventListener('stalled', this.handleStalled);
this.audioElement.removeEventListener('waiting', this.handleWaiting);
this.audioElement.removeEventListener('playing', this.handlePlaying);
this.audioElement.removeEventListener('ended', this.handleEnded);
try {
this.audioElement.pause();
this.lastPlayTime = this.audioElement.currentTime;
} catch (e) {
console.warn('Error pausing audio during cleanup:', e);
}
try {
// Clear any existing sources
while (this.audioElement.firstChild) {
this.audioElement.removeChild(this.audioElement.firstChild);
}
// Clear the source and reset the audio element
this.audioElement.removeAttribute('src');
try {
this.audioElement.load();
} catch (e) {
console.warn('Error in audio load during cleanup:', e);
}
} catch (e) {
console.warn('Error cleaning up audio sources:', e);
}
} catch (e) {
console.warn('Error during audio cleanup:', e);
}
}
// Reset state
this.currentUid = null;
this.currentButton = null;
this.audioUrl = '';
this.isPlaying = false;
this.buffering = false;
this.retryCount = 0;
// Notify global audio manager that personal player has stopped
globalAudioManager.stopPlayback('personal');
}
/**
* Update the state of a play/pause button
* @param {HTMLElement} button - The button to update
* @param {string} state - The state to set ('playing', 'paused', 'loading', 'error')
*/
updateButtonState(button, state) {
if (!button) return;
// Only update the current button's state
if (state === 'playing') {
// If this button is now playing, update all buttons
document.querySelectorAll('.play-pause-btn').forEach(btn => {
btn.classList.remove('playing', 'paused', 'loading', 'error');
if (btn === button) {
btn.classList.add('playing');
} else {
btn.classList.add('paused');
}
});
} else {
// For other states, just update the target button
button.classList.remove('playing', 'paused', 'loading', 'error');
if (state) {
button.classList.add(state);
}
}
// Update button icon and aria-label for the target button
const icon = button.querySelector('i');
if (icon) {
if (state === 'playing') {
icon.className = 'fas fa-pause';
button.setAttribute('aria-label', 'Pause');
} else {
icon.className = 'fas fa-play';
button.setAttribute('aria-label', 'Play');
}
}
}
}
// Create a singleton instance
export const audioPlayer = new AudioPlayer();
// Export utility functions for direct use
export function initAudioPlayer(container = document) {
// Set up event delegation for play/pause buttons
container.addEventListener('click', (e) => {
const playButton = e.target.closest('.play-pause-btn');
if (!playButton) return;
e.preventDefault();
e.stopPropagation();
const uid = playButton.dataset.uid;
if (!uid) return;
audioPlayer.loadAndPlay(uid, playButton);
});
// Set up event delegation for stop buttons if they exist
container.addEventListener('click', (e) => {
const stopButton = e.target.closest('.stop-btn');
if (!stopButton) return;
e.preventDefault();
e.stopPropagation();
audioPlayer.stop();
});
}
// Auto-initialize if this is the main module
if (typeof document !== 'undefined') {
document.addEventListener('DOMContentLoaded', () => {
initAudioPlayer();
});
}

static/auth-manager.js (new file, 688 lines)

@@ -0,0 +1,688 @@
/**
* Centralized Authentication Manager
*
* This module consolidates all authentication logic from auth.js, magic-login.js,
* and cleanup-auth.js into a single, maintainable module.
*/
import { showToast } from './toast.js';
class AuthManager {
constructor() {
this.DEBUG_AUTH_STATE = false;
this.AUTH_CHECK_DEBOUNCE = 1000; // 1 second
this.AUTH_CHECK_INTERVAL = 30000; // 30 seconds
this.CACHE_TTL = 5000; // 5 seconds
// Authentication state cache
this.authStateCache = {
timestamp: 0,
value: null,
ttl: this.CACHE_TTL
};
// Track auth check calls
this.lastAuthCheckTime = 0;
this.authCheckCounter = 0;
this.wasAuthenticated = null;
// Bind all methods that will be used as event handlers
this.checkAuthState = this.checkAuthState.bind(this);
this.handleMagicLoginRedirect = this.handleMagicLoginRedirect.bind(this);
this.logout = this.logout.bind(this);
this.deleteAccount = this.deleteAccount.bind(this);
this.handleStorageEvent = this.handleStorageEvent.bind(this);
this.handleVisibilityChange = this.handleVisibilityChange.bind(this);
// Initialize
this.initialize = this.initialize.bind(this);
}
/**
* Validate UID format - must be a valid email address
*/
validateUidFormat(uid) {
if (!uid || typeof uid !== 'string') {
// Debug messages disabled
return false;
}
// Email regex pattern - RFC 5322 compliant basic validation
const emailRegex = /^[a-zA-Z0-9.!#$%&'*+/=?^_`{|}~-]+@[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*$/;
const isValid = emailRegex.test(uid);
if (!isValid) {
// Debug messages disabled
} else {
// Debug messages disabled
}
return isValid;
}
/**
* Sanitize and validate UID - ensures consistent format
*/
sanitizeUid(uid) {
if (!uid || typeof uid !== 'string') {
// Debug messages disabled
return null;
}
// Trim whitespace and convert to lowercase
const sanitized = uid.trim().toLowerCase();
// Validate the sanitized UID
if (!this.validateUidFormat(sanitized)) {
// Debug messages disabled
return null;
}
// Debug messages disabled
return sanitized;
}
/**
* Check if current stored UID is valid and fix if needed
*/
validateStoredUid() {
const storedUid = localStorage.getItem('uid');
if (!storedUid) {
// Debug messages disabled
return null;
}
const sanitizedUid = this.sanitizeUid(storedUid);
if (!sanitizedUid) {
// Debug messages disabled
this.clearAuthState();
return null;
}
// Update stored UID if sanitization changed it
if (sanitizedUid !== storedUid) {
// Debug messages disabled
localStorage.setItem('uid', sanitizedUid);
// Update cookies as well
document.cookie = `uid=${sanitizedUid}; path=/; SameSite=Lax; Secure`;
}
return sanitizedUid;
}
/**
* Get cookie value by name
*/
getCookieValue(name) {
const value = `; ${document.cookie}`;
const parts = value.split(`; ${name}=`);
if (parts.length === 2) {
return parts.pop().split(';').shift();
}
return null;
}
/**
* Initialize the authentication manager
*/
async initialize() {
// Debug messages disabled
// Validate stored UID format and fix if needed
const validUid = this.validateStoredUid();
if (validUid) {
// Debug messages disabled
} else {
// Debug messages disabled
}
// Handle magic link login if present
await this.handleMagicLoginRedirect();
// Setup authentication state polling
this.setupAuthStatePolling();
// Setup event listeners
document.addEventListener('visibilitychange', this.handleVisibilityChange);
this.setupEventListeners();
// Debug messages disabled
}
/**
* Fetch user information from the server
*/
async fetchUserInfo() {
try {
// Get the auth token from cookies
const authToken = this.getCookieValue('authToken') || localStorage.getItem('authToken');
// Debug messages disabled
const headers = {
'Accept': 'application/json',
'Content-Type': 'application/json'
};
// Add Authorization header if we have a token
if (authToken) {
headers['Authorization'] = `Bearer ${authToken}`;
// Debug messages disabled
} else {
// Debug messages disabled
}
// Debug messages disabled
const response = await fetch('/api/me', {
method: 'GET',
credentials: 'include',
headers: headers
});
// Debug messages disabled
if (response.ok) {
const contentType = response.headers.get('content-type');
// Debug messages disabled
if (contentType && contentType.includes('application/json')) {
const userInfo = await response.json();
// Debug messages disabled
return userInfo;
} else {
const text = await response.text();
// Debug messages disabled
}
} else {
const errorText = await response.text();
// Debug messages disabled
}
return null;
} catch (error) {
// Debug messages disabled
return null;
}
}
/**
* Set authentication state in localStorage and cookies
*/
setAuthState(userEmail, username, authToken = null) {
// Debug messages disabled
// Validate and sanitize the UID (email)
const sanitizedUid = this.sanitizeUid(userEmail);
if (!sanitizedUid) {
// Debug messages disabled
throw new Error(`Invalid UID format: ${userEmail}. UID must be a valid email address.`);
}
// Validate username (basic check)
if (!username || typeof username !== 'string' || username.trim().length === 0) {
// Debug messages disabled
throw new Error(`Invalid username: ${username}. Username cannot be empty.`);
}
const sanitizedUsername = username.trim();
// Generate auth token if not provided
if (!authToken) {
authToken = 'token-' + Math.random().toString(36).substring(2, 15);
}
// Debug messages disabled
// Set localStorage for client-side access (not sent to server)
localStorage.setItem('uid', sanitizedUid); // Primary UID is email
localStorage.setItem('username', sanitizedUsername); // Username for display
localStorage.setItem('uid_time', Date.now().toString());
// Set cookies for server authentication (sent with requests)
document.cookie = `uid=${encodeURIComponent(sanitizedUid)}; path=/; SameSite=Lax`;
document.cookie = `authToken=${authToken}; path=/; SameSite=Lax; Secure`;
// Note: isAuthenticated is determined by presence of valid authToken, no need to duplicate
// Clear cache to force refresh
this.authStateCache.timestamp = 0;
}
/**
* Clear authentication state
*/
clearAuthState() {
// Debug messages disabled
// Clear localStorage (client-side data only)
const authKeys = ['uid', 'username', 'uid_time'];
authKeys.forEach(key => localStorage.removeItem(key));
// Clear cookies
document.cookie.split(';').forEach(cookie => {
const eqPos = cookie.indexOf('=');
const name = eqPos > -1 ? cookie.substr(0, eqPos).trim() : cookie.trim();
document.cookie = `${name}=;expires=Thu, 01 Jan 1970 00:00:00 GMT;path=/; SameSite=Lax`;
});
// Clear cache
this.authStateCache.timestamp = 0;
}
/**
* Check if user is currently authenticated
*/
isAuthenticated() {
const now = Date.now();
// Use cached value if still valid
if (this.authStateCache.timestamp > 0 &&
(now - this.authStateCache.timestamp) < this.authStateCache.ttl) {
return this.authStateCache.value;
}
// Check authentication state - simplified approach
const hasUid = !!(document.cookie.includes('uid=') || localStorage.getItem('uid'));
const hasAuthToken = !!document.cookie.includes('authToken=');
const isAuth = hasUid && hasAuthToken;
// Update cache
this.authStateCache.timestamp = now;
this.authStateCache.value = isAuth;
return isAuth;
}
/**
* Get current user data
*/
getCurrentUser() {
if (!this.isAuthenticated()) {
return null;
}
return {
uid: localStorage.getItem('uid'),
email: localStorage.getItem('uid'), // uid is the email
username: localStorage.getItem('username'),
authToken: this.getCookieValue('authToken') // authToken is in cookies
};
}
/**
* Handle magic link login redirect
*/
async handleMagicLoginRedirect() {
const params = new URLSearchParams(window.location.search);
// Handle secure token-based magic login only
const token = params.get('token');
if (token) {
// Debug messages disabled
// Clean up URL immediately
const url = new URL(window.location.href);
url.searchParams.delete('token');
window.history.replaceState({}, document.title, url.pathname + url.search);
await this.processTokenLogin(token);
return true;
}
return false;
}
/**
* Process token-based login
*/
async processTokenLogin(token) {
try {
// Debug messages disabled
const formData = new FormData();
formData.append('token', token);
// Debug messages disabled
const response = await fetch('/magic-login', {
method: 'POST',
body: formData,
});
// Debug messages disabled
// Handle successful token login response
const contentType = response.headers.get('content-type');
// Debug messages disabled
if (contentType && contentType.includes('application/json')) {
const data = await response.json();
// Debug messages disabled
if (data && data.success && data.user) {
// Debug messages disabled
// Use the user data and token from the response
const { email, username } = data.user;
const authToken = data.token; // Get token from JSON response
// Debug messages disabled
// Set auth state with the token from the response
this.setAuthState(email, username, authToken);
this.updateUIState(true);
await this.initializeUserSession(username, email);
showToast('✅ Login successful!');
this.navigateToProfile();
return;
} else {
// Debug messages disabled
throw new Error('Invalid user data received from server');
}
} else {
const text = await response.text();
// Debug messages disabled
throw new Error(`Unexpected response format: ${text || 'No details available'}`);
}
} catch (error) {
// Debug messages disabled
showToast(`Login failed: ${error.message}`, 'error');
}
}
/**
* Initialize user session after login
*/
async initializeUserSession(username, userEmail) {
// Initialize dashboard
if (window.initDashboard) {
await window.initDashboard(username);
} else {
// Debug messages disabled
}
// Fetch and display file list
if (window.fetchAndDisplayFiles) {
// Debug messages disabled
await window.fetchAndDisplayFiles(userEmail);
} else {
// Debug messages disabled
}
}
/**
* Navigate to user profile
*/
navigateToProfile() {
if (window.showOnly) {
// Debug messages disabled
window.showOnly('me-page');
} else if (window.location.hash !== '#me-page') {
window.location.hash = '#me-page';
}
}
/**
* Update UI state based on authentication
*/
updateUIState(isAuthenticated) {
if (isAuthenticated) {
document.body.classList.add('authenticated');
document.body.classList.remove('guest');
// Note: Removed auto-loading of profile stream to prevent auto-play on page load
// Profile stream will only play when user clicks the play button
} else {
document.body.classList.remove('authenticated');
document.body.classList.add('guest');
}
this.updateAccountDeletionVisibility(isAuthenticated);
// Force reflow
void document.body.offsetHeight;
}
/**
* Update account deletion section visibility
*/
updateAccountDeletionVisibility(isAuthenticated) {
const accountDeletionSection = document.getElementById('account-deletion-section');
const deleteAccountFromPrivacy = document.getElementById('delete-account-from-privacy');
if (isAuthenticated) {
this.showElement(accountDeletionSection);
this.showElement(deleteAccountFromPrivacy);
} else {
this.hideElement(accountDeletionSection);
this.hideElement(deleteAccountFromPrivacy);
}
}
showElement(element) {
if (element) {
element.style.display = 'block';
element.style.visibility = 'visible';
}
}
hideElement(element) {
if (element) {
element.style.display = 'none';
}
}
/**
* Check authentication state with caching and debouncing
*/
checkAuthState(force = false) {
const now = Date.now();
// Debounce frequent calls
if (!force && (now - this.lastAuthCheckTime) < this.AUTH_CHECK_DEBOUNCE) {
return this.authStateCache.value;
}
this.lastAuthCheckTime = now;
this.authCheckCounter++;
if (this.DEBUG_AUTH_STATE) {
// Debug messages disabled
}
const isAuthenticated = this.isAuthenticated();
// Only update UI if state changed or forced
if (force || this.wasAuthenticated !== isAuthenticated) {
if (this.DEBUG_AUTH_STATE) {
// Debug messages disabled
}
// Handle logout detection
if (this.wasAuthenticated === true && isAuthenticated === false) {
// Debug messages disabled
this.logout();
return false;
}
this.updateUIState(isAuthenticated);
this.wasAuthenticated = isAuthenticated;
}
return isAuthenticated;
}
/**
* Setup authentication state polling
*/
setupAuthStatePolling() {
// Initial check
this.checkAuthState(true);
// Periodic checks
setInterval(() => {
this.checkAuthState(!document.hidden);
}, this.AUTH_CHECK_INTERVAL);
// Storage event listener
window.addEventListener('storage', this.handleStorageEvent);
// Visibility change listener
document.addEventListener('visibilitychange', this.handleVisibilityChange);
}
/**
* Handle storage events
*/
handleStorageEvent(e) {
if (['isAuthenticated', 'authToken', 'uid'].includes(e.key)) {
this.checkAuthState(true);
}
}
/**
* Handle visibility change events
*/
handleVisibilityChange() {
if (!document.hidden) {
this.checkAuthState(true);
}
}
/**
* Setup event listeners
*/
setupEventListeners() {
document.addEventListener('click', (e) => {
// Delete account buttons
if (e.target.closest('#delete-account') || e.target.closest('#delete-account-from-privacy')) {
this.deleteAccount(e);
return;
}
});
}
/**
* Delete user account
*/
async deleteAccount(e) {
if (e) e.preventDefault();
if (this.deleteAccount.inProgress) return;
if (!confirm('Are you sure you want to delete your account?\nThis action is permanent.')) {
return;
}
this.deleteAccount.inProgress = true;
const deleteBtn = e?.target.closest('button');
const originalText = deleteBtn?.textContent;
if (deleteBtn) {
deleteBtn.disabled = true;
deleteBtn.textContent = 'Deleting...';
}
try {
const response = await fetch('/api/delete-account', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
credentials: 'include',
body: JSON.stringify({ uid: localStorage.getItem('uid') })
});
if (!response.ok) {
const errorData = await response.json().catch(() => ({ detail: 'Failed to delete account.' }));
throw new Error(errorData.detail);
}
showToast('Account deleted successfully.', 'success');
this.logout();
} catch (error) {
// Debug messages disabled
showToast(error.message, 'error');
} finally {
this.deleteAccount.inProgress = false;
if (deleteBtn) {
deleteBtn.disabled = false;
deleteBtn.textContent = originalText;
}
}
}
/**
* Logout user
*/
logout() {
// Debug messages disabled
this.clearAuthState();
window.location.href = '/';
}
/**
* Cleanup authentication state (for migration/debugging)
*/
async cleanupAuthState(manualEmail = null) {
// Debug messages disabled
let userEmail = manualEmail;
// Try to get email from server if not provided
if (!userEmail) {
const userInfo = await this.fetchUserInfo();
userEmail = userInfo?.email;
if (!userEmail) {
userEmail = prompt('Please enter your email address (e.g., oib@chello.at):');
if (!userEmail || !userEmail.includes('@')) {
// Debug messages disabled
return { success: false, error: 'Invalid email' };
}
}
}
if (!userEmail) {
// Debug messages disabled
return { success: false, error: 'No email available' };
}
// Get current username for reference
const currentUsername = localStorage.getItem('username') || localStorage.getItem('uid');
// Clear and reset authentication state
this.clearAuthState();
this.setAuthState(userEmail, currentUsername || userEmail);
// Debug messages disabled
// Debug messages disabled
// Refresh if on profile page
if (window.location.hash === '#me-page') {
window.location.reload();
}
return {
email: userEmail,
username: currentUsername,
success: true
};
}
/**
* Destroy the authentication manager
*/
destroy() {
window.removeEventListener('storage', this.handleStorageEvent);
document.removeEventListener('visibilitychange', this.handleVisibilityChange);
}
}
// Create and export singleton instance
const authManager = new AuthManager();
// Export for global access
window.authManager = authManager;
export default authManager;
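Usage sketch (illustrative, not part of the repository): another module can consume the singleton directly; whoAmI is a hypothetical helper built only on the isAuthenticated() and getCurrentUser() methods shown above.

import authManager from './auth-manager.js';

function whoAmI() {
  // Bail out early when no uid/authToken pair is present.
  if (!authManager.isAuthenticated()) return null;
  const user = authManager.getCurrentUser(); // { uid, email, username, authToken }
  console.log(`Logged in as ${user.username} <${user.email}>`);
  return user;
}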

View File

@ -1,5 +1,5 @@
// static/auth-ui.js — navigation link and back-button handlers
import { showOnly } from './router.js';
import { showSection } from './nav.js';
// Data-target navigation (e.g., at #links)
export function initNavLinks() {
@ -10,7 +10,7 @@ export function initNavLinks() {
if (!a || !linksContainer.contains(a)) return;
e.preventDefault();
const target = a.dataset.target;
if (target) showOnly(target);
if (target) showSection(target);
const burger = document.getElementById('burger-toggle');
if (burger && burger.checked) burger.checked = false;
});
@ -22,7 +22,7 @@ export function initBackButtons() {
btn.addEventListener('click', e => {
e.preventDefault();
const target = btn.dataset.back;
if (target) showOnly(target);
if (target) showSection(target);
});
});
}

31
static/auth.js Normal file
View File

@ -0,0 +1,31 @@
/**
* Simplified Authentication Module
*
* This file now uses the centralized AuthManager for all authentication logic.
* Legacy code has been replaced with the new consolidated approach.
*/
import authManager from './auth-manager.js';
import { loadProfileStream } from './personal-player.js';
// Initialize authentication manager when DOM is ready
document.addEventListener('DOMContentLoaded', async () => {
// Debug messages disabled
// Initialize the centralized auth manager
await authManager.initialize();
// Make loadProfileStream available globally for auth manager
window.loadProfileStream = loadProfileStream;
// Debug messages disabled
});
// Export auth manager for other modules to use
export { authManager };
// Legacy compatibility - expose some functions globally
window.getCurrentUser = () => authManager.getCurrentUser();
window.isAuthenticated = () => authManager.isAuthenticated();
window.logout = () => authManager.logout();
window.cleanupAuthState = (email) => authManager.cleanupAuthState(email);

38
static/cleanup-auth.js Normal file
View File

@ -0,0 +1,38 @@
/**
* Simplified Authentication Cleanup Module
*
* This file now uses the centralized AuthManager for authentication cleanup.
* The cleanup logic has been moved to the AuthManager.
*/
import authManager from './auth-manager.js';
/**
* Clean up authentication state - now delegated to AuthManager
* This function is kept for backward compatibility.
*/
async function cleanupAuthState(manualEmail = null) {
console.log('[CLEANUP] Starting authentication state cleanup via AuthManager...');
// Delegate to the centralized AuthManager
return await authManager.cleanupAuthState(manualEmail);
}
// Auto-run cleanup if this script is loaded directly
if (typeof window !== 'undefined') {
// Export function for manual use
window.cleanupAuthState = cleanupAuthState;
// Auto-run if URL contains cleanup parameter
const urlParams = new URLSearchParams(window.location.search);
if (urlParams.get('cleanup') === 'auth') {
cleanupAuthState().then(result => {
if (result && result.success) {
console.log('[CLEANUP] Auto-cleanup completed successfully');
}
});
}
}
// Export for ES6 modules
export { cleanupAuthState };
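Usage sketch (illustrative): besides the ?cleanup=auth URL trigger above, the helper can be invoked manually, for example from the browser console; the email address below is a placeholder.

window.cleanupAuthState('user@example.com').then(result => {
  if (result.success) {
    console.log(`Auth state reset for ${result.email}`);
  } else {
    console.warn('Cleanup failed:', result.error);
  }
});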

View File

@ -34,8 +34,7 @@
#file-list li {
display: flex;
justify-content: space-between;
align-items: center;
flex-direction: column;
padding: 0.75rem 1rem;
margin: 0.5rem 0;
background-color: var(--surface);
@ -97,36 +96,58 @@
.file-info {
display: flex;
align-items: center;
align-items: flex-start;
flex: 1;
min-width: 0; /* Allows text truncation */
min-width: 0;
flex-direction: column;
gap: 0.25rem;
}
.file-icon {
margin-right: 0.75rem;
font-size: 1.2em;
flex-shrink: 0;
.file-header {
display: flex;
align-items: flex-start;
justify-content: space-between;
width: 100%;
gap: 0.75rem;
}
.file-name {
color: var(--primary);
text-decoration: none;
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
margin-right: 0.5rem;
}
.file-name:hover {
text-decoration: underline;
color: var(--text-color);
word-break: break-word;
overflow-wrap: break-word;
line-height: 1.3;
flex: 1;
font-size: 0.95em;
}
.file-size {
color: var(--text-muted);
font-size: 0.85em;
margin-left: 0.5rem;
font-size: 0.8em;
white-space: nowrap;
flex-shrink: 0;
font-style: italic;
align-self: flex-start;
}
.delete-file {
align-self: center;
background: none;
border: none;
font-size: 1.1em;
cursor: pointer;
padding: 0.3rem 0.5rem;
border-radius: 4px;
transition: all 0.2s ease;
color: var(--text-muted);
margin-top: 0.2rem;
}
.delete-file:hover {
background-color: var(--error);
color: white;
transform: scale(1.1);
}
.file-actions {

File diff suppressed because it is too large.

220
static/file-display.js Normal file
View File

@ -0,0 +1,220 @@
// This function is responsible for rendering the list of files to the DOM.
// It is globally accessible via window.displayUserFiles.
window.displayUserFiles = function(uid, files) {
const fileList = document.getElementById('file-list');
if (!fileList) {
// Debug messages disabled
return;
}
if (!files || files.length === 0) {
fileList.innerHTML = '<li>You have no uploaded files yet.</li>';
return;
}
const fragment = document.createDocumentFragment();
const displayedFiles = new Set();
files.forEach(file => {
// Use original_name for display, stored_name for operations.
let displayName = file.original_name || file.stored_name || 'Unnamed File';
const storedFileName = file.stored_name || file.original_name;
// No UUID pattern replacement: always show the original_name from backend.
// Skip if no valid identifier is found or if it's a duplicate.
if (!storedFileName || displayedFiles.has(storedFileName)) {
return;
}
displayedFiles.add(storedFileName);
const listItem = document.createElement('li');
const fileUrl = `/user-uploads/${uid}/${encodeURIComponent(storedFileName)}`;
const fileSize = file.size ? (file.size / 1024 / 1024).toFixed(2) + ' MB' : 'N/A';
let fileIcon = '🎵'; // Default icon
const fileExt = displayName.split('.').pop().toLowerCase();
if (['mp3', 'wav', 'ogg', 'flac', 'm4a'].includes(fileExt)) {
fileIcon = '🎵';
} else if (['jpg', 'jpeg', 'png', 'gif', 'svg'].includes(fileExt)) {
fileIcon = '🖼️';
} else if (['pdf', 'doc', 'docx', 'txt'].includes(fileExt)) {
fileIcon = '📄';
}
listItem.innerHTML = `
<div class="file-info">
<div class="file-header">
<span class="file-name">${displayName}</span>
<span class="file-size">${fileSize}</span>
</div>
</div>
<button class="delete-file" title="Delete file" data-filename="${storedFileName}" data-display-name="${displayName}">🗑️</button>
`;
fragment.appendChild(listItem);
});
fileList.appendChild(fragment);
};
// Function to handle file deletion
async function deleteFile(uid, fileName, listItem, displayName = '') {
const fileToDelete = displayName || fileName;
if (!confirm(`Are you sure you want to delete "${fileToDelete}"?`)) {
return;
}
// Show loading state
if (listItem) {
listItem.style.opacity = '0.6';
listItem.style.pointerEvents = 'none';
const deleteButton = listItem.querySelector('.delete-file');
if (deleteButton) {
deleteButton.disabled = true;
deleteButton.textContent = '⏳';
}
}
try {
if (!uid) {
throw new Error('User not authenticated. Please log in again.');
}
// Debug messages disabled
const authToken = localStorage.getItem('authToken');
const headers = { 'Content-Type': 'application/json' };
if (authToken) {
headers['Authorization'] = `Bearer ${authToken}`;
}
// Get the email from localStorage (it's the UID)
const email = localStorage.getItem('uid');
if (!email) {
throw new Error('User not authenticated');
}
// The backend expects the full email as the UID in the path
// We need to ensure it's properly encoded for the URL
const username = email;
// Debug messages disabled
// Check if the filename is just a UUID (without log ID prefix)
const uuidPattern = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}\.\w+$/i;
let fileToDelete = fileName;
// If the filename is just a UUID, try to find the actual file with log ID prefix
if (uuidPattern.test(fileName)) {
// Debug messages disabled
try {
// First try to get the list of files to find the one with the matching UUID
const filesResponse = await fetch(`/user-files/${uid}`, {
method: 'GET',
headers: headers,
credentials: 'include'
});
if (filesResponse.ok) {
const filesData = await filesResponse.json();
if (filesData.files && Array.isArray(filesData.files)) {
// Look for a file that contains our UUID in its name
const matchingFile = filesData.files.find(f =>
f.stored_name && f.stored_name.includes(fileName)
);
if (matchingFile && matchingFile.stored_name) {
// Debug messages disabled
fileToDelete = matchingFile.stored_name;
}
}
}
} catch (e) {
// Debug messages disabled
// Continue with the original filename if there's an error
}
}
// Use the username in the URL with the correct filename
// Debug messages disabled
const response = await fetch(`/uploads/${username}/${encodeURIComponent(fileToDelete)}`, {
method: 'DELETE',
headers: headers,
credentials: 'include'
});
if (!response.ok) {
const errorData = await response.json().catch(() => ({}));
throw new Error(errorData.detail || `HTTP error! status: ${response.status}`);
}
// Remove the file from the UI immediately
if (listItem && listItem.parentNode) {
listItem.parentNode.removeChild(listItem);
}
// Show success message
window.showToast(`Successfully deleted "${fileToDelete}"`, 'success');
// If the file list is now empty, show a message
const fileList = document.getElementById('file-list');
if (fileList && fileList.children.length === 0) {
fileList.innerHTML = '<li class="no-files">No files uploaded yet.</li>';
}
// Refresh the file list and stream
const uid_current = localStorage.getItem('uid');
if (window.fetchAndDisplayFiles) {
// Use email-based UID for file operations if available, fallback to uid_current
const fileOperationUid = localStorage.getItem('uid') || uid_current; // uid is now email-based
// Debug messages disabled
await window.fetchAndDisplayFiles(fileOperationUid);
}
if (window.loadProfileStream) {
await window.loadProfileStream(uid_current);
}
} catch (error) {
// Debug messages disabled
window.showToast(`Error deleting "${fileToDelete}": ${error.message}`, 'error');
// Reset the button state if there was an error
if (listItem) {
listItem.style.opacity = '';
listItem.style.pointerEvents = '';
const deleteButton = listItem.querySelector('.delete-file');
if (deleteButton) {
deleteButton.disabled = false;
deleteButton.textContent = '🗑️';
}
}
}
}
// Add event delegation for delete buttons
document.addEventListener('DOMContentLoaded', () => {
const fileList = document.getElementById('file-list');
if (fileList) {
fileList.addEventListener('click', (e) => {
const deleteButton = e.target.closest('.delete-file');
if (deleteButton) {
e.preventDefault();
e.stopPropagation();
const listItem = deleteButton.closest('li');
if (!listItem) return;
const uid = localStorage.getItem('uid');
if (!uid) {
window.showToast('You need to be logged in to delete files', 'error');
// Debug messages disabled
return;
}
const fileName = deleteButton.getAttribute('data-filename');
const displayName = deleteButton.getAttribute('data-display-name') || fileName;
deleteFile(uid, fileName, listItem, displayName);
}
});
}
});
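For context, a minimal sketch of how a caller could drive window.displayUserFiles; refreshFileList is a hypothetical helper, and it assumes the /user-files/<uid> endpoint (the same one queried inside deleteFile above) returns { files: [{ original_name, stored_name, size }, ...] }.

async function refreshFileList(uid) {
  // Fetch the current file list for this user and re-render it.
  const response = await fetch(`/user-files/${encodeURIComponent(uid)}`, { credentials: 'include' });
  if (!response.ok) {
    window.showToast('Could not load your files', 'error');
    return;
  }
  const data = await response.json();
  window.displayUserFiles(uid, data.files || []);
}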

View File

@ -1,134 +0,0 @@
// Force hide guest navigation for authenticated users
function fixMobileNavigation() {
console.log('[FIX-NAV] Running navigation fix...');
// Check if user is authenticated
const hasAuthCookie = document.cookie.includes('isAuthenticated=true');
const hasUidCookie = document.cookie.includes('uid=');
const hasLocalStorageAuth = localStorage.getItem('isAuthenticated') === 'true';
const hasAuthToken = localStorage.getItem('authToken') !== null;
const isAuthenticated = hasAuthCookie || hasUidCookie || hasLocalStorageAuth || hasAuthToken;
console.log('[FIX-NAV] Authentication state:', {
isAuthenticated,
hasAuthCookie,
hasUidCookie,
hasLocalStorageAuth,
hasAuthToken
});
if (isAuthenticated) {
// Force hide guest navigation with !important styles
const guestNav = document.getElementById('guest-dashboard');
if (guestNav) {
console.log('[FIX-NAV] Hiding guest navigation');
guestNav.style.cssText = `
display: none !important;
visibility: hidden !important;
opacity: 0 !important;
height: 0 !important;
width: 0 !important;
padding: 0 !important;
margin: 0 !important;
border: none !important;
position: absolute !important;
overflow: hidden !important;
clip: rect(0, 0, 0, 0) !important;
pointer-events: none !important;
`;
guestNav.classList.add('force-hidden');
}
// Ensure user navigation is visible with !important styles
const userNav = document.getElementById('user-dashboard');
if (userNav) {
console.log('[FIX-NAV] Showing user navigation');
userNav.style.cssText = `
display: flex !important;
visibility: visible !important;
opacity: 1 !important;
height: auto !important;
position: relative !important;
clip: auto !important;
pointer-events: auto !important;
`;
userNav.classList.add('force-visible');
}
// Add authenticated class to body
document.body.classList.add('authenticated');
document.body.classList.remove('guest-mode');
// Prevent default behavior of nav links that might cause page reloads
document.querySelectorAll('a[href^="#"]').forEach(link => {
link.addEventListener('click', (e) => {
e.preventDefault();
const targetId = link.getAttribute('href');
if (targetId && targetId !== '#') {
// Use history API to update URL without full page reload
history.pushState(null, '', targetId);
// Dispatch a custom event that other scripts can listen for
window.dispatchEvent(new CustomEvent('hashchange'));
// Force re-apply our navigation fix
setTimeout(fixMobileNavigation, 0);
}
});
});
} else {
// User is not authenticated - ensure guest nav is visible
const guestNav = document.getElementById('guest-dashboard');
if (guestNav) {
guestNav.style.cssText = ''; // Reset any inline styles
}
document.body.classList.remove('authenticated');
document.body.classList.add('guest-mode');
}
}
// Run on page load
document.addEventListener('DOMContentLoaded', fixMobileNavigation);
// Also run after a short delay to catch any dynamic content
setTimeout(fixMobileNavigation, 100);
setTimeout(fixMobileNavigation, 300);
setTimeout(fixMobileNavigation, 1000);
// Listen for hash changes (navigation)
window.addEventListener('hashchange', fixMobileNavigation);
// Listen for pushState/replaceState (SPA navigation)
const originalPushState = history.pushState;
const originalReplaceState = history.replaceState;
history.pushState = function() {
originalPushState.apply(this, arguments);
setTimeout(fixMobileNavigation, 0);
};
history.replaceState = function() {
originalReplaceState.apply(this, arguments);
setTimeout(fixMobileNavigation, 0);
};
// Run on any DOM mutations (for dynamically loaded content)
const observer = new MutationObserver((mutations) => {
let shouldFix = false;
mutations.forEach((mutation) => {
if (mutation.addedNodes.length || mutation.removedNodes.length) {
shouldFix = true;
}
});
if (shouldFix) {
setTimeout(fixMobileNavigation, 0);
}
});
observer.observe(document.body, {
childList: true,
subtree: true,
attributes: true,
attributeFilter: ['class', 'style', 'id']
});
// Export for debugging
window.fixMobileNavigation = fixMobileNavigation;

View File

@ -8,5 +8,7 @@
<a href="#" data-target="privacy-page">Privacy</a>
<span class="separator"></span>
<a href="#" data-target="imprint-page">Imprint</a>
<span class="separator auth-only" style="display: none;"></span>
<a href="#" data-target="your-stream" class="auth-only" style="display: none;">Your Stream</a>
</div>
</footer>

View File

@ -1,13 +0,0 @@
#!/bin/bash
# Create a 1-second silent audio file in Opus format
ffmpeg -f lavfi -i anullsrc=r=48000:cl=mono -t 1 -c:a libopus -b:a 60k /home/oib/games/dicta2stream/static/test-audio.opus
# Verify the file was created
if [ -f "/home/oib/games/dicta2stream/static/test-audio.opus" ]; then
echo "Test audio file created successfully at /home/oib/games/dicta2stream/static/test-audio.opus"
echo "File size: $(du -h /home/oib/games/dicta2stream/static/test-audio.opus | cut -f1)"
else
echo "Failed to create test audio file"
exit 1
fi

View File

@ -0,0 +1,126 @@
/**
* Global Audio Manager
* Coordinates audio playback between different components so that only one audio source plays at a time
*/
class GlobalAudioManager {
constructor() {
this.currentPlayer = null; // 'streams' or 'personal' or null
this.currentUid = null;
this.listeners = new Set();
// Bind methods
this.startPlayback = this.startPlayback.bind(this);
this.stopPlayback = this.stopPlayback.bind(this);
this.addListener = this.addListener.bind(this);
this.removeListener = this.removeListener.bind(this);
}
/**
* Register a player that wants to start playback
* @param {string} playerType - 'streams' or 'personal'
* @param {string} uid - The UID being played
* @param {Object} playerInstance - Reference to the player instance
*/
startPlayback(playerType, uid, playerInstance = null) {
// Debug messages disabled
// If the same player is already playing the same UID, allow it
if (this.currentPlayer === playerType && this.currentUid === uid) {
return true;
}
// Stop any currently playing audio
if (this.currentPlayer && this.currentPlayer !== playerType) {
this.notifyStop(this.currentPlayer);
}
// Update current state
this.currentPlayer = playerType;
this.currentUid = uid;
// Debug messages disabled
return true;
}
/**
* Notify that playback has stopped
* @param {string} playerType - 'streams' or 'personal'
*/
stopPlayback(playerType) {
if (this.currentPlayer === playerType) {
// Debug messages disabled
this.currentPlayer = null;
this.currentUid = null;
}
}
/**
* Get current playback state
*/
getCurrentState() {
return {
player: this.currentPlayer,
uid: this.currentUid
};
}
/**
* Check if a specific player is currently active
*/
isPlayerActive(playerType) {
return this.currentPlayer === playerType;
}
/**
* Add a listener for stop events
* @param {string} playerType - 'streams' or 'personal'
* @param {Function} callback - Function to call when this player should stop
*/
addListener(playerType, callback) {
const listener = { playerType, callback };
this.listeners.add(listener);
return listener;
}
/**
* Remove a listener
*/
removeListener(listener) {
this.listeners.delete(listener);
}
/**
* Notify a specific player type to stop
*/
notifyStop(playerType) {
// Debug messages disabled
this.listeners.forEach(listener => {
if (listener.playerType === playerType) {
try {
listener.callback();
} catch (error) {
console.error(`Error calling stop callback for ${playerType}:`, error);
}
}
});
}
/**
* Force stop all playback
*/
stopAll() {
if (this.currentPlayer) {
this.notifyStop(this.currentPlayer);
this.currentPlayer = null;
this.currentUid = null;
}
}
}
// Create singleton instance
export const globalAudioManager = new GlobalAudioManager();
// Make it available globally for debugging
if (typeof window !== 'undefined') {
window.globalAudioManager = globalAudioManager;
}
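Usage sketch (illustrative): how a player module such as streams-ui.js could coordinate through the manager; the 'streams' player type matches the values documented above, while the uid and the stopStreamsAudio callback are placeholders.

import { globalAudioManager } from './global-audio-manager.js';

function stopStreamsAudio() {
  // Placeholder: pause/reset this player's own <audio> element here.
}

const listener = globalAudioManager.addListener('streams', stopStreamsAudio);
if (globalAudioManager.startPlayback('streams', 'user@example.com')) {
  // Safe to start this player's <audio> element; any other active player was told to stop.
}
// When this player finishes or the user pauses it:
globalAudioManager.stopPlayback('streams');
globalAudioManager.removeListener(listener);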

View File

@ -21,8 +21,11 @@
}
</style>
<link rel="modulepreload" href="/static/sound.js" />
<script src="/static/streams-ui.js" type="module"></script>
<script src="/static/app.js" type="module"></script>
<script src="/static/file-display.js?v=3"></script>
<script type="module" src="/static/dashboard.js?v=7"></script>
<script src="/static/streams-ui.js?v=3" type="module"></script>
<script src="/static/auth.js?v=5" type="module"></script>
<script src="/static/app.js?v=6" type="module"></script>
</head>
<body>
<header>
@ -36,7 +39,7 @@
<nav id="guest-dashboard" class="dashboard-nav guest-only">
<a href="#welcome-page" id="guest-welcome">Welcome</a>
<a href="#stream-page" id="guest-streams">Streams</a>
<a href="#account" id="guest-login">Account</a>
<a href="#register-page" id="guest-login">Account</a>
</nav>
<!-- User Dashboard -->
@ -47,7 +50,7 @@
</nav>
<section id="me-page" class="auth-only">
<div>
<h2>Your Stream</h2>
<h2 id="your-stream-heading">Your Stream</h2>
</div>
<article>
<p>This is your personal stream. Only you can upload to it.</p>
@ -65,12 +68,12 @@
<button id="logout-button" class="button">🚪 Log Out</button>
</article>
<section id="quota-meter" class="auth-only">
<p class="quota-meter">Quota: <progress id="quota-bar" value="0" max="100"></progress> <span id="quota-text">0 MB</span></p>
<h4>Uploaded Files</h4>
<section id="uploaded-files" class="auth-only">
<h3>Uploaded Files</h3>
<ul id="file-list" class="file-list">
<li>Loading files...</li>
</ul>
<p class="quota-meter">Quota: <progress id="quota-bar" value="0" max="100"></progress> <span id="quota-text">0 MB</span></p>
</section>
<!-- Account Deletion Section -->
@ -187,27 +190,17 @@
<footer>
<p class="footer-links">
<a href="#" id="footer-terms" data-target="terms-page">Terms</a> |
<a href="#" id="footer-privacy" data-target="privacy-page">Privacy</a> |
<a href="#" id="footer-imprint" data-target="imprint-page">Imprint</a>
<a href="#terms-page" id="footer-terms">Terms</a> |
<a href="#privacy-page" id="footer-privacy">Privacy</a> |
<a href="#imprint-page" id="footer-imprint">Imprint</a>
</p>
</footer>
<script type="module" src="/static/dashboard.js"></script>
<script type="module" src="/static/app.js"></script>
<!-- Load public streams UI logic -->
<script type="module" src="/static/streams-ui.js"></script>
<script type="module" src="/static/streams-ui.js?v=3"></script>
<!-- Load upload functionality -->
<script type="module" src="/static/upload.js"></script>
<script type="module">
import "/static/nav.js";
window.addEventListener("pageshow", () => {
const dz = document.querySelector("#user-upload-area");
if (dz) dz.classList.remove("uploading");
const spinner = document.querySelector("#spinner");
if (spinner) spinner.style.display = "none";
});
</script>
<script type="module">
import { initMagicLogin } from '/static/magic-login.js';
const params = new URLSearchParams(window.location.search);
@ -220,7 +213,7 @@
}
</script>
<script type="module" src="/static/init-personal-stream.js"></script>
<!-- Temporary fix for mobile navigation -->
<script src="/static/fix-nav.js"></script>
<script type="module" src="/static/personal-player.js"></script>
</body>
</html>

View File

@ -1,184 +0,0 @@
// inject-nav.js - Handles dynamic injection and management of navigation elements
import { showOnly } from './router.js';
// Function to set up guest navigation links
function setupGuestNav() {
const guestDashboard = document.getElementById('guest-dashboard');
if (!guestDashboard) return;
const links = guestDashboard.querySelectorAll('a');
links.forEach(link => {
link.addEventListener('click', (e) => {
e.preventDefault();
const target = link.getAttribute('href')?.substring(1); // Remove '#'
if (target) {
window.location.hash = target;
if (window.router && typeof window.router.showOnly === 'function') {
window.router.showOnly(target);
}
}
});
});
}
// Function to set up user navigation links
function setupUserNav() {
const userDashboard = document.getElementById('user-dashboard');
if (!userDashboard) return;
const links = userDashboard.querySelectorAll('a');
links.forEach(link => {
// Handle logout specially
if (link.getAttribute('href') === '#logout') {
link.addEventListener('click', (e) => {
e.preventDefault();
if (window.handleLogout) {
window.handleLogout();
}
});
} else {
// Handle regular navigation
link.addEventListener('click', (e) => {
e.preventDefault();
const target = link.getAttribute('href')?.substring(1); // Remove '#'
if (target) {
window.location.hash = target;
if (window.router && typeof window.router.showOnly === 'function') {
window.router.showOnly(target);
}
}
});
}
});
}
function createUserNav() {
const nav = document.createElement('div');
nav.className = 'dashboard-nav';
nav.setAttribute('role', 'navigation');
nav.setAttribute('aria-label', 'User navigation');
const navList = document.createElement('ul');
navList.className = 'nav-list';
const links = [
{ id: 'user-stream', target: 'your-stream', text: 'Your Stream' },
{ id: 'nav-streams', target: 'streams', text: 'Streams' },
{ id: 'nav-welcome', target: 'welcome', text: 'Welcome' },
{ id: 'user-logout', target: 'logout', text: 'Logout' }
];
// Create and append links
links.forEach((link) => {
const li = document.createElement('li');
li.className = 'nav-item';
const a = document.createElement('a');
a.id = link.id;
a.href = '#';
a.className = 'nav-link';
a.setAttribute('data-target', link.target);
a.textContent = link.text;
a.addEventListener('click', (e) => {
e.preventDefault();
const target = e.currentTarget.getAttribute('data-target');
if (target === 'logout') {
if (window.handleLogout) {
window.handleLogout();
}
} else if (target) {
window.location.hash = target;
if (window.router && typeof window.router.showOnly === 'function') {
window.router.showOnly(target);
}
}
});
li.appendChild(a);
navList.appendChild(li);
});
nav.appendChild(navList);
return nav;
}
// Navigation injection function
export function injectNavigation(isAuthenticated = false) {
// Get the appropriate dashboard element based on auth state
const guestDashboard = document.getElementById('guest-dashboard');
const userDashboard = document.getElementById('user-dashboard');
if (isAuthenticated) {
// Show user dashboard, hide guest dashboard
if (guestDashboard) guestDashboard.style.display = 'none';
if (userDashboard) userDashboard.style.display = 'block';
document.body.classList.add('authenticated');
document.body.classList.remove('guest-mode');
} else {
// Show guest dashboard, hide user dashboard
if (guestDashboard) guestDashboard.style.display = 'block';
if (userDashboard) userDashboard.style.display = 'none';
document.body.classList.add('guest-mode');
document.body.classList.remove('authenticated');
}
// Set up menu links and active state
setupMenuLinks();
updateActiveNav();
return isAuthenticated ? userDashboard : guestDashboard;
}
// Set up menu links with click handlers
function setupMenuLinks() {
// Set up guest and user navigation links
setupGuestNav();
setupUserNav();
// Handle hash changes for SPA navigation
window.addEventListener('hashchange', updateActiveNav);
}
// Update active navigation link
function updateActiveNav() {
const currentHash = window.location.hash.substring(1) || 'welcome';
// Remove active class from all links in both dashboards
document.querySelectorAll('#guest-dashboard a, #user-dashboard a').forEach(link => {
link.classList.remove('active');
// Check if this link's href matches the current hash
const linkTarget = link.getAttribute('href')?.substring(1); // Remove '#'
if (linkTarget === currentHash) {
link.classList.add('active');
}
});
}
// Initialize when DOM is loaded
document.addEventListener('DOMContentLoaded', () => {
// Check authentication state and initialize navigation
const isAuthenticated = document.cookie.includes('sessionid=') ||
localStorage.getItem('isAuthenticated') === 'true';
// Initialize navigation based on authentication state
injectNavigation(isAuthenticated);
// Set up menu links and active navigation
setupMenuLinks();
updateActiveNav();
// Update body classes based on authentication state
if (isAuthenticated) {
document.body.classList.add('authenticated');
document.body.classList.remove('guest-mode');
} else {
document.body.classList.add('guest-mode');
document.body.classList.remove('authenticated');
}
console.log('[NAV] Navigation initialized', { isAuthenticated });
});
// Make the function available globally for debugging
window.injectNavigation = injectNavigation;

6
static/logger.js Normal file
View File

@ -0,0 +1,6 @@
export function logToServer(msg) {
const xhr = new XMLHttpRequest();
xhr.open("POST", "/log", true);
xhr.setRequestHeader("Content-Type", "application/json");
xhr.send(JSON.stringify({ msg }));
}
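A minimal usage sketch, assuming a module wants to forward unhandled client-side errors to the /log endpoint; the message format is illustrative.

import { logToServer } from './logger.js';

window.addEventListener('error', (e) => {
  logToServer(`Unhandled error: ${e.message} at ${e.filename}:${e.lineno}`);
});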

View File

@ -1,90 +1,43 @@
// static/magic-login.js — handles magiclink token UI
import { showOnly } from './router.js';
/**
* Simplified Magic Login Module
*
* This file now uses the centralized AuthManager for authentication logic.
* The token-based magic login is handled by the AuthManager.
*/
import authManager from './auth-manager.js';
import { showSection } from './nav.js';
let magicLoginSubmitted = false;
/**
* Initialize magic login - now delegated to AuthManager
* This function is kept for backward compatibility but the actual
* magic login logic is handled by the AuthManager during initialization.
*/
export async function initMagicLogin() {
console.debug('[magic-login] initMagicLogin called');
// Debug messages disabled
// The AuthManager handles both URL-based and token-based magic login
// during its initialization, so we just need to ensure it's initialized
if (!window.authManager) {
// Debug messages disabled
await authManager.initialize();
}
// Check if there was a magic login processed
const params = new URLSearchParams(location.search);
const token = params.get('token');
if (!token) {
console.debug('[magic-login] No token in URL');
return;
}
// Remove token from URL immediately to prevent loops
const url = new URL(window.location.href);
url.searchParams.delete('token');
window.history.replaceState({}, document.title, url.pathname + url.search);
try {
const formData = new FormData();
formData.append('token', token);
const res = await fetch('/magic-login', {
method: 'POST',
body: formData,
});
if (res.redirected) {
// If redirected, backend should set cookie; but set localStorage for SPA
const url = new URL(res.url);
const confirmedUid = url.searchParams.get('confirmed_uid');
if (confirmedUid) {
// Generate a simple auth token (in a real app, this would come from the server)
const authToken = 'token-' + Math.random().toString(36).substring(2, 15);
// Set cookies and localStorage for SPA session logic
document.cookie = `uid=${encodeURIComponent(confirmedUid)}; path=/; SameSite=Lax`;
document.cookie = `authToken=${authToken}; path=/; SameSite=Lax; Secure`;
// Store in localStorage for client-side access
localStorage.setItem('uid', confirmedUid);
localStorage.setItem('confirmed_uid', confirmedUid);
localStorage.setItem('authToken', authToken);
localStorage.setItem('uid_time', Date.now().toString());
}
window.location.href = res.url;
return;
}
// If not redirected, show error (shouldn't happen in normal flow)
let data;
const contentType = res.headers.get('content-type');
if (contentType && contentType.includes('application/json')) {
data = await res.json();
if (data && data.confirmed_uid) {
// Generate a simple auth token (in a real app, this would come from the server)
const authToken = 'token-' + Math.random().toString(36).substring(2, 15);
// Set cookies and localStorage for SPA session logic
document.cookie = `uid=${encodeURIComponent(data.confirmed_uid)}; path=/; SameSite=Lax`;
document.cookie = `authToken=${authToken}; path=/; SameSite=Lax; Secure`;
// Store in localStorage for client-side access
localStorage.setItem('uid', data.confirmed_uid);
localStorage.setItem('confirmed_uid', data.confirmed_uid);
localStorage.setItem('authToken', authToken);
localStorage.setItem('uid_time', Date.now().toString());
import('./toast.js').then(({ showToast }) => {
showToast('✅ Login successful!');
// Update UI state after login
const guestDashboard = document.getElementById('guest-dashboard');
const userDashboard = document.getElementById('user-dashboard');
const registerPage = document.getElementById('register-page');
if (guestDashboard) guestDashboard.style.display = 'none';
if (userDashboard) userDashboard.style.display = 'block';
if (registerPage) registerPage.style.display = 'none';
// Show the user's stream page
if (window.showOnly) {
window.showOnly('me-page');
}
});
return;
}
alert(data.detail || 'Login failed.');
} else {
const text = await res.text();
alert(text || 'Login failed.');
}
} catch (err) {
alert('Network error: ' + err);
if (token) {
// Debug messages disabled
} else {
// Debug messages disabled
}
}
// Export for backward compatibility
export { magicLoginSubmitted };
// Make showSection available globally for AuthManager
window.showSection = showSection;

View File

@ -7,447 +7,97 @@ function getCookie(name) {
return null;
}
document.addEventListener("DOMContentLoaded", () => {
// Check authentication status
const isLoggedIn = !!getCookie('uid');
// Update body class for CSS-based visibility
document.body.classList.toggle('logged-in', isLoggedIn);
// Get all main content sections
const mainSections = Array.from(document.querySelectorAll('main > section'));
// Show/hide sections with smooth transitions
const showSection = (sectionId) => {
// Update body class to indicate current page
document.body.className = '';
if (sectionId) {
document.body.classList.add(`page-${sectionId}`);
// Determines the correct section to show based on auth status and requested section
function getValidSection(sectionId) {
const isLoggedIn = !!getCookie('uid');
const protectedSections = ['me-page', 'account-page'];
const guestOnlySections = ['login-page', 'register-page', 'magic-login-page'];
if (isLoggedIn) {
// If logged in, guest-only sections are invalid, redirect to 'me-page'
if (guestOnlySections.includes(sectionId)) {
return 'me-page';
}
} else {
document.body.classList.add('page-welcome');
// If not logged in, protected sections are invalid, redirect to 'welcome-page'
if (protectedSections.includes(sectionId)) {
return 'welcome-page';
}
}
// If the section doesn't exist in the DOM, default to welcome page
if (!document.getElementById(sectionId)) {
return 'welcome-page';
}
return sectionId;
}
// Main function to show/hide sections
export function showSection(sectionId) {
const mainSections = Array.from(document.querySelectorAll('main > section'));
// Update body class for page-specific CSS
document.body.className = document.body.className.replace(/page-\S+/g, '');
document.body.classList.add(`page-${sectionId || 'welcome-page'}`);
// Update active state of navigation links
document.querySelectorAll('.dashboard-nav a').forEach(link => {
link.classList.remove('active');
if ((!sectionId && link.getAttribute('href') === '#welcome-page') ||
(sectionId && link.getAttribute('href') === `#${sectionId}`)) {
link.classList.add('active');
}
link.classList.remove('active');
if (link.getAttribute('href') === `#${sectionId}`) {
link.classList.add('active');
}
});
mainSections.forEach(section => {
// Skip navigation sections
if (section.id === 'guest-dashboard' || section.id === 'user-dashboard') {
return;
}
const isTarget = section.id === sectionId;
const isLegalPage = ['terms-page', 'privacy-page', 'imprint-page'].includes(sectionId);
const isWelcomePage = !sectionId || sectionId === 'welcome-page';
if (isTarget || (isLegalPage && section.id === sectionId)) {
// Show the target section or legal page
section.classList.add('active');
section.hidden = false;
// Focus the section for accessibility with a small delay
// Only focus if the section is focusable and in the viewport
const focusSection = () => {
try {
if (section && typeof section.focus === 'function' &&
section.offsetParent !== null && // Check if element is visible
section.getBoundingClientRect().top < window.innerHeight &&
section.getBoundingClientRect().bottom > 0) {
section.focus({ preventScroll: true });
}
} catch (e) {
// Silently fail if focusing isn't possible
if (window.DEBUG_NAV || (window.location.hostname === 'localhost' || window.location.hostname === '127.0.0.1')) {
console.debug('Could not focus section:', e);
}
}
};
// Use requestAnimationFrame for better performance
requestAnimationFrame(() => {
// Only set the timeout in debug mode or local development
if (window.DEBUG_NAV || (window.location.hostname === 'localhost' || window.location.hostname === '127.0.0.1')) {
setTimeout(focusSection, 50);
} else {
focusSection();
}
});
} else if (isWelcomePage && section.id === 'welcome-page') {
// Special handling for welcome page
section.classList.add('active');
section.hidden = false;
} else {
// Hide other sections
section.classList.remove('active');
section.hidden = true;
}
section.hidden = section.id !== sectionId;
});
// Update URL hash without page scroll
if (sectionId && !['terms-page', 'privacy-page', 'imprint-page'].includes(sectionId)) {
if (sectionId === 'welcome-page') {
history.replaceState(null, '', window.location.pathname);
} else {
history.replaceState(null, '', `#${sectionId}`);
}
}
};
// Handle initial page load
const getValidSection = (sectionId) => {
const protectedSections = ['me-page', 'register-page'];
// If not logged in and trying to access protected section
if (!isLoggedIn && protectedSections.includes(sectionId)) {
return 'welcome-page';
}
// If section doesn't exist, default to welcome page
if (!document.getElementById(sectionId)) {
return 'welcome-page';
}
return sectionId;
};
// Process initial page load
const initialPage = window.location.hash.substring(1) || 'welcome-page';
const validSection = getValidSection(initialPage);
// Update URL if needed
if (validSection !== initialPage) {
window.location.hash = validSection;
}
// Show the appropriate section
showSection(validSection);
const Router = {
sections: Array.from(document.querySelectorAll("main > section")),
showOnly(id) {
// Validate the section ID
const validId = getValidSection(id);
// Update URL if needed
if (validId !== id) {
window.location.hash = validId;
return;
}
// Show the requested section
showSection(validId);
// Handle the quota meter visibility - only show with 'me-page'
const quotaMeter = document.getElementById('quota-meter');
if (quotaMeter) {
quotaMeter.hidden = validId !== 'me-page';
quotaMeter.tabIndex = validId === 'me-page' ? 0 : -1;
}
// Update navigation active states
this.updateActiveNav(validId);
},
updateActiveNav(activeId) {
// Update active states for navigation links
document.querySelectorAll('.dashboard-nav a').forEach(link => {
const target = link.getAttribute('href').substring(1);
if (target === activeId) {
link.setAttribute('aria-current', 'page');
link.classList.add('active');
// Update the URL hash without causing a page scroll; this is for direct calls to showSection
// Normal navigation is handled by the hashchange listener
const currentHash = `#${sectionId}`;
if (window.location.hash !== currentHash) {
if (history.pushState) {
if (sectionId && sectionId !== 'welcome-page') {
history.pushState(null, null, currentHash);
} else {
history.pushState(null, null, window.location.pathname + window.location.search);
}
}
}
}
document.addEventListener("DOMContentLoaded", () => {
const isLoggedIn = !!getCookie('uid');
document.body.classList.toggle('authenticated', isLoggedIn);
// Unified click handler for SPA navigation
document.body.addEventListener('click', (e) => {
const link = e.target.closest('a[href^="#"]');
// Ensure the link is not inside a component that handles its own navigation
if (!link || link.closest('.no-global-nav')) return;
e.preventDefault();
const newHash = link.getAttribute('href');
if (window.location.hash !== newHash) {
window.location.hash = newHash;
}
});
// Main routing logic on hash change
const handleNavigation = () => {
const sectionId = window.location.hash.substring(1) || 'welcome-page';
const validSectionId = getValidSection(sectionId);
if (sectionId !== validSectionId) {
window.location.hash = validSectionId; // This will re-trigger handleNavigation
} else {
link.removeAttribute('aria-current');
link.classList.remove('active');
showSection(validSectionId);
}
});
}
};
// Initialize the router
const router = Router;
// Handle section visibility based on authentication
const updateSectionVisibility = (sectionId) => {
const section = document.getElementById(sectionId);
if (!section) return;
// Skip navigation sections and quota meter
if (['guest-dashboard', 'user-dashboard', 'quota-meter'].includes(sectionId)) {
return;
}
const currentHash = window.location.hash.substring(1);
const isLegalPage = ['terms-page', 'privacy-page', 'imprint-page'].includes(sectionId);
// Special handling for legal pages - always show when in hash
if (isLegalPage) {
const isActive = sectionId === currentHash;
section.hidden = !isActive;
section.tabIndex = isActive ? 0 : -1;
if (isActive) section.focus();
return;
}
// Special handling for me-page - only show to authenticated users
if (sectionId === 'me-page') {
section.hidden = !isLoggedIn || currentHash !== 'me-page';
section.tabIndex = (isLoggedIn && currentHash === 'me-page') ? 0 : -1;
return;
}
// Special handling for register page - only show to guests
if (sectionId === 'register-page') {
section.hidden = isLoggedIn || currentHash !== 'register-page';
section.tabIndex = (!isLoggedIn && currentHash === 'register-page') ? 0 : -1;
return;
}
// For other sections, show if they match the current section ID
const isActive = sectionId === currentHash;
section.hidden = !isActive;
section.tabIndex = isActive ? 0 : -1;
if (isActive) {
section.focus();
}
};
// Initialize the router
router.init = function() {
// Update visibility for all sections
this.sections.forEach(section => {
updateSectionVisibility(section.id);
});
// Show user-upload-area only when me-page is shown and user is logged in
const userUpload = document.getElementById("user-upload-area");
if (userUpload) {
const uid = getCookie("uid");
userUpload.style.display = (window.location.hash === '#me-page' && uid) ? '' : 'none';
}
// Store the current page
localStorage.setItem("last_page", window.location.hash.substring(1));
// Initialize navigation
initNavLinks();
initBackButtons();
initStreamLinks();
// Ensure proper focus management for accessibility
const currentSection = document.querySelector('main > section:not([hidden])');
if (currentSection) {
currentSection.setAttribute('tabindex', '0');
currentSection.focus();
}
};
// Initialize the router
router.init();
// Handle footer links
document.querySelectorAll('.footer-links a').forEach(link => {
link.addEventListener('click', (e) => {
e.preventDefault();
const target = link.dataset.target;
if (target) {
// Show the target section without updating URL hash
showSection(target);
}
});
});
// Export the showOnly function for global access
window.showOnly = router.showOnly.bind(router);
// Make router available globally for debugging
window.appRouter = router;
};
// Highlight active profile link on browser back/forward navigation
function highlightActiveProfileLink() {
const params = new URLSearchParams(window.location.search);
const profileUid = params.get('profile');
const ul = document.getElementById('stream-list');
if (!ul) return;
ul.querySelectorAll('a.profile-link').forEach(link => {
const url = new URL(link.href, window.location.origin);
const uidParam = url.searchParams.get('profile');
link.classList.toggle('active', uidParam === profileUid);
});
}
window.addEventListener('popstate', () => {
const params = new URLSearchParams(window.location.search);
const profileUid = params.get('profile');
const currentPage = window.location.hash.substring(1) || 'welcome-page';
// Prevent unauthorized access to me-page
if ((currentPage === 'me-page' || profileUid) && !getCookie('uid')) {
history.replaceState(null, '', '#welcome-page');
showOnly('welcome-page');
return;
}
if (profileUid) {
showOnly('me-page');
if (typeof window.showProfilePlayerFromUrl === 'function') {
window.showProfilePlayerFromUrl();
}
} else {
highlightActiveProfileLink();
}
});
window.addEventListener('hashchange', handleNavigation);
/* restore last page (unless magiclink token present) */
const params = new URLSearchParams(location.search);
const token = params.get("token");
if (!token) {
const last = localStorage.getItem("last_page");
if (last && document.getElementById(last)) {
showOnly(last);
} else if (document.getElementById("welcome-page")) {
// Show Welcome page by default for all new/guest users
showOnly("welcome-page");
}
// Highlight active link on initial load
highlightActiveProfileLink();
}
/* token → show magiclogin page */
if (token) {
document.getElementById("magic-token").value = token;
showOnly("magic-login-page");
const err = params.get("error");
if (err) {
const box = document.getElementById("magic-error");
box.textContent = decodeURIComponent(err);
box.style.display = "block";
}
}
function renderStreamList(streams) {
const ul = document.getElementById("stream-list");
if (!ul) return;
if (streams.length) {
streams.sort();
ul.innerHTML = streams.map(uid => `
<li><a href="/?profile=${encodeURIComponent(uid)}" class="profile-link">▶ ${uid}</a></li>
`).join("");
} else {
ul.innerHTML = "<li>No active streams.</li>";
}
// Ensure correct link is active after rendering
highlightActiveProfileLink();
}
// Initialize navigation listeners
function initNavLinks() {
const navIds = ["links", "user-dashboard", "guest-dashboard"];
navIds.forEach(id => {
const nav = document.getElementById(id);
if (!nav) return;
nav.addEventListener("click", e => {
const a = e.target.closest("a[data-target]");
if (!a || !nav.contains(a)) return;
e.preventDefault();
// Save audio state before navigation
const audio = document.getElementById('me-audio');
const wasPlaying = audio && !audio.paused;
const currentTime = audio ? audio.currentTime : 0;
const target = a.dataset.target;
if (target) showOnly(target);
// Handle stream page specifically
if (target === "stream-page" && typeof window.maybeLoadStreamsOnShow === "function") {
window.maybeLoadStreamsOnShow();
}
// Handle me-page specifically
else if (target === "me-page" && audio) {
// Restore audio state if it was playing
if (wasPlaying) {
audio.currentTime = currentTime;
audio.play().catch(e => console.error('Play failed:', e));
}
}
});
});
// Add click handlers for footer links with audio state saving
document.querySelectorAll(".footer-links a").forEach(link => {
link.addEventListener("click", (e) => {
e.preventDefault();
const target = link.dataset.target;
if (!target) return;
// Save audio state before navigation
const audio = document.getElementById('me-audio');
const wasPlaying = audio && !audio.paused;
const currentTime = audio ? audio.currentTime : 0;
showOnly(target);
// Handle me-page specifically
if (target === "me-page" && audio) {
// Restore audio state if it was playing
if (wasPlaying) {
audio.currentTime = currentTime;
audio.play().catch(e => console.error('Play failed:', e));
}
}
});
});
}
function initBackButtons() {
document.querySelectorAll('a[data-back]').forEach(btn => {
btn.addEventListener("click", e => {
e.preventDefault();
const target = btn.dataset.back;
if (target) showOnly(target);
// Ensure streams load instantly when stream-page is shown
if (target === "stream-page" && typeof window.maybeLoadStreamsOnShow === "function") {
window.maybeLoadStreamsOnShow();
}
});
});
}
function initStreamLinks() {
const ul = document.getElementById("stream-list");
if (!ul) return;
ul.addEventListener("click", e => {
const a = e.target.closest("a.profile-link");
if (!a || !ul.contains(a)) return;
e.preventDefault();
const url = new URL(a.href, window.location.origin);
const profileUid = url.searchParams.get("profile");
if (profileUid && window.location.search !== `?profile=${encodeURIComponent(profileUid)}`) {
window.profileNavigationTriggered = true;
window.history.pushState({}, '', `/?profile=${encodeURIComponent(profileUid)}`);
window.dispatchEvent(new Event("popstate"));
}
});
}
// Initialize Router
document.addEventListener('visibilitychange', () => {
// Re-check authentication when tab becomes visible again
if (!document.hidden && window.location.hash === '#me-page' && !getCookie('uid')) {
window.location.hash = 'welcome-page';
showOnly('welcome-page');
}
});
Router.init();
// Initial page load
handleNavigation();
});

85
static/personal-player.js Normal file
View File

@ -0,0 +1,85 @@
import { showToast } from "./toast.js";
import { SharedAudioPlayer } from './shared-audio-player.js';
function getPersonalStreamUrl(uid) {
return `/audio/${encodeURIComponent(uid)}/stream.opus`;
}
function updatePlayPauseButton(button, isPlaying) {
if (button) button.textContent = isPlaying ? '⏸️' : '▶️';
// Optionally, update other UI elements here
}
const personalPlayer = new SharedAudioPlayer({
playerType: 'personal',
getStreamUrl: getPersonalStreamUrl,
onUpdateButton: updatePlayPauseButton
});
/**
 * Cleans up the personal stream's audio element and its listeners.
 * SharedAudioPlayer.stop() already removes the event handlers, resets the
 * source and detaches the element from the DOM, so cleanup delegates to it.
 */
function cleanupPersonalAudio() {
personalPlayer.stop();
}
// Use the shared player for loading and playing the personal stream
export function loadProfileStream(uid, playPauseBtn) {
if (!uid) {
showToast('No UID provided for profile stream', 'error');
return;
}
personalPlayer.play(uid, playPauseBtn);
}
/**
* Initializes the personal audio player, setting up event listeners.
*/
export function initPersonalPlayer() {
const mePageSection = document.getElementById('me-page');
if (!mePageSection) return;
// Use a delegated event listener for the play button
mePageSection.addEventListener('click', (e) => {
const playPauseBtn = e.target.closest('.play-pause-btn');
if (!playPauseBtn) return;
e.stopPropagation();
const uid = localStorage.getItem('uid');
if (!uid) {
showToast('Please log in to play audio.', 'error');
return;
}
// Toggle play/pause
if (personalPlayer.audioElement && !personalPlayer.audioElement.paused && !personalPlayer.audioElement.ended) {
personalPlayer.pause();
} else {
loadProfileStream(uid, playPauseBtn);
}
});
// Make loadProfileStream globally accessible for upload.js
window.loadProfileStream = loadProfileStream;
}
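initPersonalPlayer is exported but never called inside this file; wiring it up is left to whichever module bootstraps the dashboard. A minimal sketch of that call site, assuming a plain DOMContentLoaded hook (the importing module is an assumption, not shown in this diff):

// Hypothetical bootstrap: initialize the personal player once the DOM is ready.
import { initPersonalPlayer } from './personal-player.js';

document.addEventListener('DOMContentLoaded', () => {
    initPersonalPlayer();
});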

View File

@ -0,0 +1,70 @@
/**
* Cleanup Script: Remove Redundant confirmed_uid from localStorage
*
* This script removes the redundant confirmed_uid field from localStorage
* for users who might have it stored from the old authentication system.
*/
(function() {
'use strict';
console.log('[CONFIRMED_UID_CLEANUP] Starting cleanup of redundant confirmed_uid field...');
// Check if confirmed_uid exists in localStorage
const confirmedUid = localStorage.getItem('confirmed_uid');
const currentUid = localStorage.getItem('uid');
if (confirmedUid) {
console.log(`[CONFIRMED_UID_CLEANUP] Found confirmed_uid: ${confirmedUid}`);
console.log(`[CONFIRMED_UID_CLEANUP] Current uid: ${currentUid}`);
// Verify that uid exists and is properly set
if (!currentUid) {
console.warn('[CONFIRMED_UID_CLEANUP] No uid found, setting uid from confirmed_uid');
localStorage.setItem('uid', confirmedUid);
} else if (currentUid !== confirmedUid) {
console.warn(`[CONFIRMED_UID_CLEANUP] UID mismatch - uid: ${currentUid}, confirmed_uid: ${confirmedUid}`);
console.log('[CONFIRMED_UID_CLEANUP] Keeping current uid value');
}
// Remove the redundant confirmed_uid
localStorage.removeItem('confirmed_uid');
console.log('[CONFIRMED_UID_CLEANUP] Removed redundant confirmed_uid from localStorage');
// Log the cleanup action
console.log('[CONFIRMED_UID_CLEANUP] Cleanup completed successfully');
} else {
console.log('[CONFIRMED_UID_CLEANUP] No confirmed_uid found, no cleanup needed');
}
// Also check for any other potential redundant fields
const redundantFields = [
'confirmed_uid', // Main target
'confirmedUid', // Camel case variant
'confirmed-uid' // Hyphenated variant
];
let removedCount = 0;
redundantFields.forEach(field => {
if (localStorage.getItem(field)) {
localStorage.removeItem(field);
removedCount++;
console.log(`[CONFIRMED_UID_CLEANUP] Removed redundant field: ${field}`);
}
});
if (removedCount > 0) {
console.log(`[CONFIRMED_UID_CLEANUP] Removed ${removedCount} redundant authentication fields`);
}
console.log('[CONFIRMED_UID_CLEANUP] Cleanup process completed');
})();
// Export for manual execution if needed
if (typeof window !== 'undefined') {
window.removeConfirmedUidCleanup = function() {
const script = document.createElement('script');
script.src = '/static/remove-confirmed-uid.js';
document.head.appendChild(script);
};
}

View File

@ -1,168 +0,0 @@
// static/router.js — core routing for SPA navigation
export const Router = {
sections: [],
// Map URL hashes to section IDs
sectionMap: {
'welcome': 'welcome-page',
'streams': 'stream-page',
'account': 'register-page',
'login': 'login-page',
'me': 'me-page',
'your-stream': 'me-page' // Map 'your-stream' to 'me-page'
},
init() {
this.sections = Array.from(document.querySelectorAll("main > section"));
// Set up hash change handler
window.addEventListener('hashchange', this.handleHashChange.bind(this));
// Initial route
this.handleHashChange();
},
handleHashChange() {
let hash = window.location.hash.substring(1) || 'welcome';
// First check if the hash matches any direct section ID
const directSection = this.sections.find(sec => sec.id === hash);
if (directSection) {
// If it's a direct section ID match, show it directly
this.showOnly(hash);
} else {
// Otherwise, use the section map
const sectionId = this.sectionMap[hash] || hash;
this.showOnly(sectionId);
}
},
showOnly(id) {
if (!id) return;
// Update URL hash without triggering hashchange
if (window.location.hash !== `#${id}`) {
window.history.pushState(null, '', `#${id}`);
}
const isAuthenticated = document.body.classList.contains('authenticated');
const isMePage = id === 'me-page' || id === 'your-stream';
// Helper function to update section visibility
const updateSection = (sec) => {
const isTarget = sec.id === id;
const isGuestOnly = sec.classList.contains('guest-only');
const isAuthOnly = sec.classList.contains('auth-only');
const isAlwaysVisible = sec.classList.contains('always-visible');
const isQuotaMeter = sec.id === 'quota-meter';
const isUserUploadArea = sec.id === 'user-upload-area';
const isLogOut = sec.id === 'log-out';
// Determine if section should be visible
let shouldShow = isTarget;
// Always show sections with always-visible class
if (isAlwaysVisible) {
shouldShow = true;
}
// Handle guest-only sections
if (isGuestOnly && isAuthenticated) {
shouldShow = false;
}
// Handle auth-only sections
if (isAuthOnly && !isAuthenticated) {
shouldShow = false;
}
// Special case for me-page and its children
const isChildOfMePage = sec.closest('#me-page') !== null;
const shouldBeActive = isTarget ||
(isQuotaMeter && isMePage) ||
(isUserUploadArea && isMePage) ||
(isLogOut && isMePage) ||
(isChildOfMePage && isMePage);
// Update visibility and tab index
sec.hidden = !shouldShow;
sec.tabIndex = shouldShow ? 0 : -1;
// Update active state and ARIA attributes
if (shouldBeActive) {
sec.setAttribute('aria-current', 'page');
sec.classList.add('active');
// Ensure target section is visible
if (sec.hidden) {
sec.style.display = 'block';
sec.hidden = false;
}
// Show all children of the active section
if (isTarget) {
sec.focus();
// Make sure all auth-only children are visible
const authChildren = sec.querySelectorAll('.auth-only');
authChildren.forEach(child => {
if (isAuthenticated) {
child.style.display = '';
child.hidden = false;
}
});
}
} else {
sec.removeAttribute('aria-current');
sec.classList.remove('active');
// Reset display property for sections when not active
if (shouldShow && !isAlwaysVisible) {
sec.style.display = ''; // Reset to default from CSS
}
}
};
// Update all sections
this.sections.forEach(updateSection);
// Update active nav links
document.querySelectorAll('[data-target], [href^="#"]').forEach(link => {
let target = link.getAttribute('data-target');
const href = link.getAttribute('href');
// If no data-target, try to get from href
if (!target && href) {
// Remove any query parameters and # from the href
const hash = href.split('?')[0].substring(1);
// Use mapped section ID or the hash as is
target = this.sectionMap[hash] || hash;
}
// Check if this link points to the current section or its mapped equivalent
const linkId = this.sectionMap[target] || target;
const currentId = this.sectionMap[id] || id;
if (linkId === currentId) {
link.setAttribute('aria-current', 'page');
link.classList.add('active');
} else {
link.removeAttribute('aria-current');
link.classList.remove('active');
}
});
// Close mobile menu if open
const menuToggle = document.querySelector('.menu-toggle');
if (menuToggle && menuToggle.getAttribute('aria-expanded') === 'true') {
menuToggle.setAttribute('aria-expanded', 'false');
document.body.classList.remove('menu-open');
}
localStorage.setItem("last_page", id);
}
};
// Initialize router when DOM is loaded
document.addEventListener('DOMContentLoaded', () => {
Router.init();
});
export const showOnly = Router.showOnly.bind(Router);

View File

@ -0,0 +1,162 @@
// shared-audio-player.js
// Unified audio player logic for both streams and personal player
import { globalAudioManager } from './global-audio-manager.js';
export class SharedAudioPlayer {
constructor({ playerType, getStreamUrl, onUpdateButton }) {
this.playerType = playerType; // 'streams' or 'personal'
this.getStreamUrl = getStreamUrl; // function(uid) => url
this.onUpdateButton = onUpdateButton; // function(button, isPlaying)
this.audioElement = null;
this.currentUid = null;
this.isPlaying = false;
this.currentButton = null;
this._eventHandlers = {};
// Register stop listener
globalAudioManager.addListener(playerType, () => {
this.stop();
});
}
pause() {
if (this.audioElement && !this.audioElement.paused && !this.audioElement.ended) {
this.audioElement.pause();
this.isPlaying = false;
if (this.onUpdateButton && this.currentButton) {
this.onUpdateButton(this.currentButton, false);
}
}
}
async play(uid, button) {
const ctx = `[SharedAudioPlayer][${this.playerType}]${uid ? `[${uid}]` : ''}`;
const isSameUid = this.currentUid === uid;
const isActive = this.audioElement && !this.audioElement.paused && !this.audioElement.ended;
// Guard: If already playing the requested UID and not paused/ended, do nothing
if (isSameUid && isActive) {
if (this.onUpdateButton) this.onUpdateButton(button || this.currentButton, true);
return;
}
// If same UID but paused, resume
if (isSameUid && this.audioElement && this.audioElement.paused && !this.audioElement.ended) {
try {
await this.audioElement.play();
this.isPlaying = true;
if (this.onUpdateButton) this.onUpdateButton(button || this.currentButton, true);
globalAudioManager.startPlayback(this.playerType, uid);
} catch (err) {
this.isPlaying = false;
if (this.onUpdateButton) this.onUpdateButton(button || this.currentButton, false);
console.error(`${ctx} play() resume failed:`, err);
}
return;
}
// Otherwise, stop current and start new
this.stop();
this.currentUid = uid;
this.currentButton = button;
const url = this.getStreamUrl(uid);
this.audioElement = new Audio(url);
this.audioElement.preload = 'auto';
this.audioElement.crossOrigin = 'anonymous';
this.audioElement.style.display = 'none';
document.body.appendChild(this.audioElement);
this._attachEventHandlers();
try {
await this.audioElement.play();
this.isPlaying = true;
if (this.onUpdateButton) this.onUpdateButton(button, true);
globalAudioManager.startPlayback(this.playerType, uid);
} catch (err) {
this.isPlaying = false;
if (this.onUpdateButton) this.onUpdateButton(button, false);
console.error(`${ctx} play() failed:`, err);
}
}
stop() {
if (this.audioElement) {
this._removeEventHandlers();
try {
this.audioElement.pause();
this.audioElement.removeAttribute('src');
this.audioElement.load();
if (this.audioElement.parentNode) {
this.audioElement.parentNode.removeChild(this.audioElement);
}
} catch (e) {
console.warn('[shared-audio-player] Error cleaning up audio element:', e);
}
this.audioElement = null;
}
this.isPlaying = false;
this.currentUid = null;
if (this.currentButton && this.onUpdateButton) {
this.onUpdateButton(this.currentButton, false);
}
this.currentButton = null;
}
_attachEventHandlers() {
if (!this.audioElement) return;
const ctx = `[SharedAudioPlayer][${this.playerType}]${this.currentUid ? `[${this.currentUid}]` : ''}`;
const logEvent = (event) => {
// Debug logging disabled
};
// Core handlers
const onPlay = (e) => {
logEvent(e);
this.isPlaying = true;
if (this.currentButton && this.onUpdateButton) this.onUpdateButton(this.currentButton, true);
};
const onPause = (e) => {
logEvent(e);
// console.trace(`${ctx} Audio pause stack trace:`);
this.isPlaying = false;
if (this.currentButton && this.onUpdateButton) this.onUpdateButton(this.currentButton, false);
};
const onEnded = (e) => {
logEvent(e);
this.isPlaying = false;
if (this.currentButton && this.onUpdateButton) this.onUpdateButton(this.currentButton, false);
};
const onError = (e) => {
logEvent(e);
this.isPlaying = false;
if (this.currentButton && this.onUpdateButton) this.onUpdateButton(this.currentButton, false);
console.error(`${ctx} Audio error:`, e);
};
// Attach handlers
this.audioElement.addEventListener('play', onPlay);
this.audioElement.addEventListener('pause', onPause);
this.audioElement.addEventListener('ended', onEnded);
this.audioElement.addEventListener('error', onError);
// Attach debug logging for all relevant events
const debugEvents = [
'abort','canplay','canplaythrough','durationchange','emptied','encrypted','loadeddata','loadedmetadata',
'loadstart','playing','progress','ratechange','seeked','seeking','stalled','suspend','timeupdate','volumechange','waiting'
];
debugEvents.forEach(evt => {
this.audioElement.addEventListener(evt, logEvent);
}); // Logging now disabled
this._eventHandlers = { onPlay, onPause, onEnded, onError, debugEvents, logEvent };
}
_removeEventHandlers() {
if (!this.audioElement || !this._eventHandlers) return;
const { onPlay, onPause, onEnded, onError } = this._eventHandlers;
if (onPlay) this.audioElement.removeEventListener('play', onPlay);
if (onPause) this.audioElement.removeEventListener('pause', onPause);
if (onEnded) this.audioElement.removeEventListener('ended', onEnded);
if (onError) this.audioElement.removeEventListener('error', onError);
this._eventHandlers = {};
}
}
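global-audio-manager.js itself is not included in this diff; SharedAudioPlayer only depends on addListener, startPlayback and stopPlayback. A minimal sketch of that contract under a single-active-player policy; the method names come from the calls above, the implementation is an assumption:

// Hypothetical minimal global audio manager matching the calls made by
// SharedAudioPlayer and streams-ui.js: whenever one player starts,
// every other registered player is asked to stop.
export const globalAudioManager = {
    listeners: new Map(), // playerType -> stop callback

    addListener(playerType, onStop) {
        this.listeners.set(playerType, onStop);
    },

    startPlayback(playerType, uid) {
        for (const [type, onStop] of this.listeners) {
            if (type !== playerType) onStop(uid);
        }
    },

    stopPlayback(playerType) {
        // Hook for players reporting that they have stopped; nothing to do here.
    }
};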

View File

@ -1,17 +1,30 @@
// sound.js — reusable Web Audio beep
export function playBeep(frequency = 432, duration = 0.2, type = 'sine') {
const ctx = new (window.AudioContext || window.webkitAudioContext)();
const osc = ctx.createOscillator();
const gain = ctx.createGain();
try {
// Validate parameters to prevent audio errors
if (!Number.isFinite(frequency) || frequency <= 0) {
frequency = 432; // fallback to default
}
if (!Number.isFinite(duration) || duration <= 0) {
duration = 0.2; // fallback to default
}
const ctx = new (window.AudioContext || window.webkitAudioContext)();
const osc = ctx.createOscillator();
const gain = ctx.createGain();
osc.type = type;
osc.frequency.value = frequency;
osc.type = type;
osc.frequency.value = frequency;
osc.connect(gain);
gain.connect(ctx.destination);
osc.connect(gain);
gain.connect(ctx.destination);
gain.gain.setValueAtTime(0.1, ctx.currentTime); // subtle volume
osc.start();
osc.stop(ctx.currentTime + duration);
gain.gain.setValueAtTime(0.1, ctx.currentTime); // subtle volume
osc.start();
osc.stop(ctx.currentTime + duration);
} catch (error) {
// Silently handle audio errors to prevent breaking upload flow
console.warn('[SOUND] Audio beep failed:', error.message);
}
}
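For reference, two calls that exercise the new guard clauses; the values are illustrative (upload.js uses higher and lower frequencies for success and error beeps):

// Normal call: short success beep.
playBeep(800, 0.2);
// Invalid arguments now fall back to the 432 Hz / 0.2 s defaults instead of throwing.
playBeep(NaN, -1, 'sine');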

View File

@ -1,5 +1,6 @@
// static/streams-ui.js — public streams loader and profile-link handling
import { showOnly } from './router.js';
import { globalAudioManager } from './global-audio-manager.js';
// Global variable to track if we should force refresh the stream list
let shouldForceRefresh = false;
@ -24,6 +25,12 @@ export function initStreamsUI() {
});
document.addEventListener('visibilitychange', maybeLoadStreamsOnShow);
maybeLoadStreamsOnShow();
// Register with global audio manager to handle stop requests from other players
globalAudioManager.addListener('streams', () => {
// Debug messages disabled
stopPlayback();
});
}
function maybeLoadStreamsOnShow() {
@ -72,10 +79,10 @@ document.addEventListener('DOMContentLoaded', () => {
function loadAndRenderStreams() {
const ul = document.getElementById('stream-list');
if (!ul) {
console.error('[STREAMS-UI] Stream list element not found');
// Debug messages disabled
return;
}
console.log('[STREAMS-UI] loadAndRenderStreams called, shouldForceRefresh:', shouldForceRefresh);
// Debug messages disabled
// Don't start a new connection if one is already active and we're not forcing a refresh
if (activeSSEConnection && !shouldForceRefresh) {
@ -133,7 +140,7 @@ function loadAndRenderStreams() {
window.location.hostname === '127.0.0.1';
if (isLocalDevelopment || window.DEBUG_STREAMS) {
const duration = Date.now() - connectionStartTime;
console.group('[streams-ui] Connection timeout reached');
// Debug messages disabled
console.log(`Duration: ${duration}ms`);
console.log('Current time:', new Date().toISOString());
console.log('Streams received:', streams.length);
@ -196,18 +203,18 @@ function loadAndRenderStreams() {
// Process the stream
function processStream({ done, value }) {
console.log('[STREAMS-UI] processStream called with done:', done);
// Debug messages disabled
if (done) {
console.log('[STREAMS-UI] Stream processing complete');
// Debug messages disabled
// Process any remaining data in the buffer
if (buffer.trim()) {
console.log('[STREAMS-UI] Processing remaining buffer data');
// Debug messages disabled
try {
const data = JSON.parse(buffer);
console.log('[STREAMS-UI] Parsed data from buffer:', data);
// Debug messages disabled
processSSEEvent(data);
} catch (e) {
console.error('[STREAMS-UI] Error parsing buffer data:', e);
// Debug messages disabled
}
}
return;
@ -230,7 +237,7 @@ function loadAndRenderStreams() {
const data = JSON.parse(dataMatch[1]);
processSSEEvent(data);
} catch (e) {
console.error('[streams-ui] Error parsing event data:', e, 'Event:', event);
// Debug messages disabled
}
}
}
@ -291,7 +298,7 @@ function loadAndRenderStreams() {
// Function to process SSE events
function processSSEEvent(data) {
console.log('[STREAMS-UI] Processing SSE event:', data);
// Debug messages disabled
if (data.end) {
if (streams.length === 0) {
ul.innerHTML = '<li>No active streams.</li>';
@ -307,6 +314,7 @@ function loadAndRenderStreams() {
// Render each stream in sorted order
streams.forEach((stream, index) => {
const uid = stream.uid || `stream-${index}`;
const username = stream.username || 'Unknown User';
const sizeMb = stream.size ? (stream.size / (1024 * 1024)).toFixed(1) : '?';
const mtime = stream.mtime ? new Date(stream.mtime * 1000).toISOString().split('T')[0].replace(/-/g, '/') : '';
@ -316,7 +324,7 @@ function loadAndRenderStreams() {
try {
li.innerHTML = `
<article class="stream-player" data-uid="${escapeHtml(uid)}">
<h3>${escapeHtml(uid)}</h3>
<h3>${escapeHtml(username)}</h3>
<div class="audio-controls">
<button class="play-pause-btn" data-uid="${escapeHtml(uid)}" aria-label="Play">▶️</button>
</div>
@ -348,7 +356,7 @@ function loadAndRenderStreams() {
// Function to handle SSE errors
function handleSSEError(error) {
console.error('[streams-ui] SSE error:', error);
// Debug messages disabled
// Only show error if we haven't already loaded any streams
if (streams.length === 0) {
@ -378,11 +386,11 @@ function loadAndRenderStreams() {
export function renderStreamList(streams) {
const ul = document.getElementById('stream-list');
if (!ul) {
console.warn('[STREAMS-UI] renderStreamList: #stream-list not found');
// Debug messages disabled
return;
}
console.log('[STREAMS-UI] Rendering stream list with', streams.length, 'streams');
console.debug('[STREAMS-UI] Streams data:', streams);
// Debug messages disabled
// Debug messages disabled
if (Array.isArray(streams)) {
if (streams.length) {
// Sort by mtime descending (most recent first)
@ -390,9 +398,10 @@ export function renderStreamList(streams) {
ul.innerHTML = streams
.map(stream => {
const uid = stream.uid || '';
const username = stream.username || 'Unknown User';
const sizeKb = stream.size ? (stream.size / 1024).toFixed(1) : '?';
const mtime = stream.mtime ? new Date(stream.mtime * 1000).toLocaleString() : '';
return `<li><a href="/?profile=${encodeURIComponent(uid)}" class="profile-link">▶ ${uid}</a> <span style='color:var(--text-muted);font-size:90%'>[${sizeKb} KB, ${mtime}]</span></li>`;
return `<li><a href="/?profile=${encodeURIComponent(uid)}" class="profile-link">▶ ${escapeHtml(username)}</a> <span style='color:var(--text-muted);font-size:90%'>[${sizeKb} KB, ${mtime}]</span></li>`;
})
.join('');
} else {
@ -400,10 +409,10 @@ export function renderStreamList(streams) {
}
} else {
ul.innerHTML = '<li>Error: Invalid stream data.</li>';
console.error('[streams-ui] renderStreamList: streams is not an array', streams);
// Debug messages disabled
}
highlightActiveProfileLink();
console.debug('[streams-ui] renderStreamList complete');
// Debug messages disabled
}
export function highlightActiveProfileLink() {
@ -454,12 +463,7 @@ function escapeHtml(unsafe) {
.replace(/'/g, "&#039;");
}
// Function to update play/pause button state
function updatePlayPauseButton(button, isPlaying) {
if (!button) return;
button.textContent = isPlaying ? '⏸️' : '▶️';
button.setAttribute('aria-label', isPlaying ? 'Pause' : 'Play');
}
// Audio context for Web Audio API
let audioContext = null;
@ -483,7 +487,7 @@ function getAudioContext() {
// Stop current playback completely
function stopPlayback() {
console.log('[streams-ui] Stopping playback');
// Debug messages disabled
// Stop Web Audio API if active
if (audioSource) {
@ -539,6 +543,9 @@ function stopPlayback() {
pauseTime = 0;
audioStartTime = 0;
// Notify global audio manager that streams player has stopped
globalAudioManager.stopPlayback('streams');
// Update UI
if (currentlyPlayingButton) {
updatePlayPauseButton(currentlyPlayingButton, false);
@ -549,117 +556,28 @@ function stopPlayback() {
currentlyPlayingAudio = null;
}
// Load and play audio using HTML5 Audio element for Opus
async function loadAndPlayAudio(uid, playPauseBtn) {
// If we already have an audio element for this UID and it's paused, just resume it
if (audioElement && currentUid === uid && audioElement.paused) {
try {
await audioElement.play();
isPlaying = true;
updatePlayPauseButton(playPauseBtn, true);
return;
} catch (error) {
// Fall through to reload if resume fails
}
}
// Stop any current playback
stopPlayback();
// Update UI
updatePlayPauseButton(playPauseBtn, true);
currentlyPlayingButton = playPauseBtn;
currentUid = uid;
try {
// Create a new audio element with the correct MIME type
const audioUrl = `/audio/${encodeURIComponent(uid)}/stream.opus`;
// Create a new audio element with a small delay to prevent race conditions
await new Promise(resolve => setTimeout(resolve, 50));
audioElement = new Audio(audioUrl);
audioElement.preload = 'auto';
audioElement.crossOrigin = 'anonymous'; // Important for CORS
// Set up event handlers with proper binding
const onPlay = () => {
isPlaying = true;
updatePlayPauseButton(playPauseBtn, true);
};
const onPause = () => {
isPlaying = false;
updatePlayPauseButton(playPauseBtn, false);
};
const onEnded = () => {
isPlaying = false;
cleanupAudio();
};
const onError = (e) => {
// Ignore errors from previous audio elements that were cleaned up
if (!audioElement || audioElement.readyState === 0) {
return;
}
isPlaying = false;
updatePlayPauseButton(playPauseBtn, false);
// Don't show error to user for aborted requests
if (audioElement.error && audioElement.error.code === MediaError.MEDIA_ERR_ABORTED) {
return;
}
// Show error to user for other errors
if (typeof showToast === 'function') {
showToast('Error playing audio. The format may not be supported.', 'error');
}
};
// Add event listeners
audioElement.addEventListener('play', onPlay, { once: true });
audioElement.addEventListener('pause', onPause);
audioElement.addEventListener('ended', onEnded, { once: true });
audioElement.addEventListener('error', onError);
// Store references for cleanup
audioElement._eventHandlers = { onPlay, onPause, onEnded, onError };
// Start playback with error handling
try {
const playPromise = audioElement.play();
if (playPromise !== undefined) {
await playPromise.catch(error => {
// Ignore abort errors when switching between streams
if (error.name !== 'AbortError') {
throw error;
}
});
}
isPlaying = true;
} catch (error) {
// Only log unexpected errors
if (error.name !== 'AbortError') {
console.error('[streams-ui] Error during playback:', error);
throw error;
}
}
} catch (error) {
console.error('[streams-ui] Error loading/playing audio:', error);
if (playPauseBtn) {
updatePlayPauseButton(playPauseBtn, false);
}
// Only show error if it's not an abort error
if (error.name !== 'AbortError' && typeof showToast === 'function') {
showToast('Error playing audio. Please try again.', 'error');
}
}
// --- Shared Audio Player Integration ---
import { SharedAudioPlayer } from './shared-audio-player.js';
function getStreamUrl(uid) {
return `/audio/${encodeURIComponent(uid)}/stream.opus`;
}
function updatePlayPauseButton(button, isPlaying) {
if (button) button.textContent = isPlaying ? '⏸️' : '▶️';
// Optionally, update other UI elements here
}
// Single shared definition of updatePlayPauseButton used by the streams player.
const streamsPlayer = new SharedAudioPlayer({
playerType: 'streams',
getStreamUrl,
onUpdateButton: updatePlayPauseButton
});
// Load and play audio using SharedAudioPlayer
function loadAndPlayAudio(uid, playPauseBtn) {
streamsPlayer.play(uid, playPauseBtn);
}
// Handle audio ended event
@ -673,7 +591,7 @@ function handleAudioEnded() {
// Clean up audio resources
function cleanupAudio() {
console.log('[streams-ui] Cleaning up audio resources');
// Debug messages disabled
// Clean up Web Audio API resources if they exist
if (audioSource) {
@ -741,32 +659,14 @@ if (streamList) {
e.preventDefault();
const uid = playPauseBtn.dataset.uid;
if (!uid) {
return;
if (!uid) return;
// Toggle play/pause using SharedAudioPlayer
if (streamsPlayer.currentUid === uid && streamsPlayer.audioElement && !streamsPlayer.audioElement.paused && !streamsPlayer.audioElement.ended) {
streamsPlayer.pause();
} else {
await loadAndPlayAudio(uid, playPauseBtn);
}
// If clicking the currently playing button, toggle pause/play
if (currentUid === uid) {
if (isPlaying) {
await audioElement.pause();
isPlaying = false;
updatePlayPauseButton(playPauseBtn, false);
} else {
try {
await audioElement.play();
isPlaying = true;
updatePlayPauseButton(playPauseBtn, true);
} catch (error) {
// If resume fails, try reloading the audio
await loadAndPlayAudio(uid, playPauseBtn);
}
}
return;
}
// If a different stream is playing, stop it and start the new one
stopPlayback();
await loadAndPlayAudio(uid, playPauseBtn);
});
}
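processSSEEvent reads uid, username, size and mtime from each event and treats an object with end: true as the terminator. A sketch of the payload shape this client expects; the field names come from the code above, the values are illustrative:

// Illustrative SSE payloads consumed by processSSEEvent:
// one object per active stream, then an end marker.
const exampleStreamEvent = {
    uid: 'devuser@example.net',   // email-based UID used for the stream URL
    username: 'devuser',          // display name rendered in the list
    size: 1048576,                // stream size in bytes
    mtime: 1722945600             // last-modified time, Unix seconds
};
const exampleEndEvent = { end: true };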

View File

@ -490,7 +490,7 @@ nav#guest-dashboard.dashboard-nav {
box-shadow: 0 4px 20px rgba(0, 0, 0, 0.4), 0 0 0 1px rgba(255, 255, 255, 0.1);
margin-top: 0.8em;
opacity: 0;
animation: fadeInOut 3.5s both;
animation: fadeInOut 15s both;
font-size: 1.1em;
pointer-events: auto;
border: 1px solid rgba(255, 255, 255, 0.1);
@ -580,7 +580,7 @@ nav#guest-dashboard.dashboard-nav {
}
/* Quota meter and uploaded files section */
#quota-meter {
#uploaded-files {
background: var(--surface); /* Match article background */
border: 1px solid var(--border);
border-radius: 8px;
@ -593,19 +593,19 @@ nav#guest-dashboard.dashboard-nav {
color: var(--text-light);
}
#quota-meter {
#uploaded-files {
transition: all 0.2s ease;
}
#quota-meter h4 {
#uploaded-files h3 {
font-weight: 400;
text-align: center;
margin: 1.5rem 0 0.75rem;
margin: 0 0 27px 0;
color: var(--text);
}
#quota-meter > h4 {
margin-top: 1.5rem;
#uploaded-files > h3 {
margin: 0 0 27px 0;
text-align: center;
font-weight: 400;
color: var(--text);
@ -732,7 +732,7 @@ nav#guest-dashboard.dashboard-nav {
border-bottom: none;
}
#quota-meter:hover {
#uploaded-files:hover {
transform: translateY(-2px);
box-shadow: 0 6px 16px rgba(0, 0, 0, 0.15);
}
@ -740,7 +740,7 @@ nav#guest-dashboard.dashboard-nav {
.quota-meter {
font-size: 0.9em;
color: var(--text-muted);
margin: 0 0 1rem 0;
margin: 1rem 0 0 0;
}
#file-list {

View File

@ -1,240 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Audio Player Test</title>
<style>
:root {
--success: #2e8b57;
--error: #ff4444;
--border: #444;
--text-color: #f0f0f0;
--surface: #2a2a2a;
}
body {
font-family: Arial, sans-serif;
max-width: 800px;
margin: 0 auto;
padding: 20px;
line-height: 1.6;
background: #1a1a1a;
color: var(--text-color);
}
.test-case {
margin-bottom: 20px;
padding: 15px;
border: 1px solid var(--border);
border-radius: 5px;
background: var(--surface);
}
.success { color: var(--success); }
.error { color: var(--error); }
button {
padding: 8px 16px;
margin: 5px;
cursor: pointer;
background: #4a6fa5;
color: white;
border: none;
border-radius: 4px;
}
button:hover {
background: #3a5a8c;
}
#log {
margin-top: 20px;
padding: 10px;
border: 1px solid #ccc;
border-radius: 5px;
max-height: 300px;
overflow-y: auto;
font-family: monospace;
background: #f5f5f5;
}
.audio-container {
margin: 20px 0;
}
audio {
width: 100%;
margin: 10px 0;
}
</style>
</head>
<body>
<h1>Audio Player Test</h1>
<div class="test-case">
<h2>Test 1: Direct Audio Element</h2>
<div class="audio-container">
<audio id="direct-audio" controls>
<source src="/audio/devuser/stream.opus" type="audio/ogg; codecs=opus">
Your browser does not support the audio element.
</audio>
</div>
<div>
<button onclick="document.getElementById('direct-audio').play()">Play</button>
<button onclick="document.getElementById('direct-audio').pause()">Pause</button>
</div>
</div>
<div class="test-case">
<h2>Test 2: Dynamic Audio Element</h2>
<div id="dynamic-audio-container">
<button onclick="setupDynamicAudio()">Initialize Dynamic Audio</button>
</div>
</div>
<div class="test-case">
<h2>Test 3: Using loadProfileStream</h2>
<div id="load-profile-container">
<button onclick="testLoadProfileStream()">Test loadProfileStream</button>
<div id="test3-status">Not started</div>
<div class="audio-container">
<audio id="profile-audio" controls></audio>
</div>
</div>
</div>
<div class="test-case">
<h2>Browser Audio Support</h2>
<div id="codec-support">Testing codec support...</div>
</div>
<div class="test-case">
<h2>Console Log</h2>
<div id="log"></div>
<button onclick="document.getElementById('log').innerHTML = ''">Clear Log</button>
</div>
<script>
// Logging function
function log(message, type = 'info') {
const logDiv = document.getElementById('log');
const entry = document.createElement('div');
entry.className = type;
entry.textContent = `[${new Date().toISOString()}] ${message}`;
logDiv.appendChild(entry);
logDiv.scrollTop = logDiv.scrollHeight;
console.log(`[${type.toUpperCase()}] ${message}`);
}
// Test 2: Dynamic Audio Element
function setupDynamicAudio() {
log('Setting up dynamic audio element...');
const container = document.getElementById('dynamic-audio-container');
container.innerHTML = '';
try {
const audio = document.createElement('audio');
audio.controls = true;
audio.preload = 'auto';
audio.crossOrigin = 'anonymous';
const source = document.createElement('source');
source.src = '/audio/devuser/stream.opus';
source.type = 'audio/ogg; codecs=opus';
audio.appendChild(source);
container.appendChild(audio);
container.appendChild(document.createElement('br'));
const playBtn = document.createElement('button');
playBtn.textContent = 'Play';
playBtn.onclick = () => {
audio.play().catch(e => log(`Play error: ${e}`, 'error'));
};
container.appendChild(playBtn);
const pauseBtn = document.createElement('button');
pauseBtn.textContent = 'Pause';
pauseBtn.onclick = () => audio.pause();
container.appendChild(pauseBtn);
log('Dynamic audio element created successfully');
} catch (e) {
log(`Error creating dynamic audio: ${e}`, 'error');
}
}
// Test 3: loadProfileStream
async function testLoadProfileStream() {
const status = document.getElementById('test3-status');
status.textContent = 'Loading...';
status.className = '';
try {
// Import the loadProfileStream function from app.js
const { loadProfileStream } = await import('./app.js');
if (typeof loadProfileStream !== 'function') {
throw new Error('loadProfileStream function not found');
}
// Call loadProfileStream with test user
const audio = await loadProfileStream('devuser');
if (audio) {
status.textContent = 'Audio loaded successfully!';
status.className = 'success';
log('Audio loaded successfully', 'success');
// Add the audio element to the page
const audioContainer = document.querySelector('#load-profile-container .audio-container');
audioContainer.innerHTML = '';
audio.controls = true;
audioContainer.appendChild(audio);
} else {
status.textContent = 'No audio available for test user';
status.className = '';
log('No audio available for test user', 'info');
}
} catch (e) {
status.textContent = `Error: ${e.message}`;
status.className = 'error';
log(`Error in loadProfileStream: ${e}`, 'error');
console.error(e);
}
}
// Check browser audio support
function checkAudioSupport() {
const supportDiv = document.getElementById('codec-support');
const audio = document.createElement('audio');
const codecs = {
'audio/ogg; codecs=opus': 'Opus (OGG)',
'audio/webm; codecs=opus': 'Opus (WebM)',
'audio/mp4; codecs=mp4a.40.2': 'AAC (MP4)',
'audio/mpeg': 'MP3'
};
let results = [];
for (const [type, name] of Object.entries(codecs)) {
const canPlay = audio.canPlayType(type);
results.push(`${name}: ${canPlay || 'Not supported'}`);
}
supportDiv.innerHTML = results.join('<br>');
}
// Initialize tests
document.addEventListener('DOMContentLoaded', () => {
log('Test page loaded');
checkAudioSupport();
// Log audio element events for debugging
const audioElements = document.getElementsByTagName('audio');
Array.from(audioElements).forEach((audio, index) => {
['play', 'pause', 'error', 'stalled', 'suspend', 'abort', 'emptied', 'ended'].forEach(event => {
audio.addEventListener(event, (e) => {
log(`Audio ${index + 1} ${event} event: ${e.type}`);
});
});
});
});
</script>
</body>
</html>

View File

@ -1,210 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Audio Player Test</title>
<style>
:root {
--success: #2e8b57;
--error: #ff4444;
--border: #444;
--text-color: #f0f0f0;
--surface: #2a2a2a;
}
body {
font-family: Arial, sans-serif;
max-width: 800px;
margin: 0 auto;
padding: 20px;
line-height: 1.6;
background: #1a1a1a;
color: var(--text-color);
}
.test-case {
margin-bottom: 20px;
padding: 15px;
border: 1px solid var(--border);
border-radius: 5px;
background: var(--surface);
}
.success { color: var(--success); }
.error { color: var(--error); }
button {
padding: 8px 16px;
margin: 5px;
cursor: pointer;
background: #4a6fa5;
color: white;
border: none;
border-radius: 4px;
}
button:hover {
background: #3a5a8c;
}
#log {
margin-top: 20px;
padding: 10px;
border: 1px solid #ccc;
border-radius: 5px;
max-height: 300px;
overflow-y: auto;
font-family: monospace;
background: #f5f5f5;
}
</style>
</head>
<body>
<h1>Audio Player Test</h1>
<div class="test-case">
<h2>Test 1: Basic Audio Element</h2>
<audio id="test1" controls>
<source src="/static/test-audio.opus" type="audio/ogg; codecs=opus">
Your browser does not support the audio element.
</audio>
<div>
<button onclick="document.getElementById('test1').play()">Play</button>
<button onclick="document.getElementById('test1').pause()">Pause</button>
</div>
</div>
<div class="test-case">
<h2>Test 2: Dynamic Audio Element</h2>
<div id="test2-container">
<button onclick="setupTest2()">Initialize Audio</button>
</div>
</div>
<div class="test-case">
<h2>Test 3: Using loadProfileStream</h2>
<div id="test3-container">
<button onclick="testLoadProfileStream()">Test loadProfileStream</button>
<div id="test3-status">Not started</div>
</div>
</div>
<div class="test-case">
<h2>Browser Audio Support</h2>
<div id="codec-support">Testing codec support...</div>
</div>
<div class="test-case">
<h2>Console Log</h2>
<div id="log"></div>
<button onclick="document.getElementById('log').innerHTML = ''">Clear Log</button>
</div>
<script>
// Logging function
function log(message, type = 'info') {
const logDiv = document.getElementById('log');
const entry = document.createElement('div');
entry.className = type;
entry.textContent = `[${new Date().toISOString()}] ${message}`;
logDiv.appendChild(entry);
logDiv.scrollTop = logDiv.scrollHeight;
console.log(`[${type.toUpperCase()}] ${message}`);
}
// Test 2: Dynamic Audio Element
function setupTest2() {
log('Setting up dynamic audio element...');
const container = document.getElementById('test2-container');
container.innerHTML = '';
try {
const audio = document.createElement('audio');
audio.controls = true;
audio.preload = 'auto';
const source = document.createElement('source');
source.src = '/static/test-audio.opus';
source.type = 'audio/ogg; codecs=opus';
audio.appendChild(source);
container.appendChild(audio);
container.appendChild(document.createElement('br'));
const playBtn = document.createElement('button');
playBtn.textContent = 'Play';
playBtn.onclick = () => audio.play().catch(e => log(`Play error: ${e}`, 'error'));
container.appendChild(playBtn);
const pauseBtn = document.createElement('button');
pauseBtn.textContent = 'Pause';
pauseBtn.onclick = () => audio.pause();
container.appendChild(pauseBtn);
log('Dynamic audio element created successfully');
} catch (e) {
log(`Error creating dynamic audio: ${e}`, 'error');
}
}
// Test 3: loadProfileStream
async function testLoadProfileStream() {
const status = document.getElementById('test3-status');
status.textContent = 'Loading...';
status.className = '';
try {
// Create a test user ID
const testUid = 'test-user-' + Math.random().toString(36).substr(2, 8);
log(`Testing with user: ${testUid}`);
// Call loadProfileStream
const audio = await window.loadProfileStream(testUid);
if (audio) {
status.textContent = 'Audio loaded successfully!';
status.className = 'success';
log('Audio loaded successfully', 'success');
} else {
status.textContent = 'No audio available for test user';
status.className = '';
log('No audio available for test user', 'info');
}
} catch (e) {
status.textContent = `Error: ${e.message}`;
status.className = 'error';
log(`Error in loadProfileStream: ${e}`, 'error');
}
}
// Check browser audio support
function checkAudioSupport() {
const supportDiv = document.getElementById('codec-support');
const audio = document.createElement('audio');
const codecs = {
'audio/ogg; codecs=opus': 'Opus (OGG)',
'audio/webm; codecs=opus': 'Opus (WebM)',
'audio/mp4; codecs=mp4a.40.2': 'AAC (MP4)',
'audio/mpeg': 'MP3'
};
let results = [];
for (const [type, name] of Object.entries(codecs)) {
const canPlay = audio.canPlayType(type);
results.push(`${name}: ${canPlay || 'Not supported'}`);
}
supportDiv.innerHTML = results.join('<br>');
}
// Initialize tests
document.addEventListener('DOMContentLoaded', () => {
log('Test page loaded');
checkAudioSupport();
// Expose loadProfileStream for testing
if (!window.loadProfileStream) {
log('Warning: loadProfileStream not found in global scope', 'warning');
}
});
</script>
</body>
</html>

Binary file not shown.

View File

@ -14,6 +14,6 @@ export function showToast(message) {
setTimeout(() => {
toast.remove();
// Do not remove the container; let it persist for stacking
}, 3500);
}, 15000);
}

static/uid-validator.js Normal file
View File

@ -0,0 +1,169 @@
/**
* UID Validation Utility
*
* Provides comprehensive UID format validation and sanitization
* to ensure all UIDs are properly formatted as email addresses.
*/
export class UidValidator {
constructor() {
// RFC 5322 compliant email regex (basic validation)
this.emailRegex = /^[a-zA-Z0-9.!#$%&'*+/=?^_`{|}~-]+@[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*$/;
// Common invalid patterns to check against
this.invalidPatterns = [
/^devuser$/i, // Legacy username pattern
/^user\d+$/i, // Generic user patterns
/^test$/i, // Test user
/^admin$/i, // Admin user
/^\d+$/, // Pure numeric
/^[a-zA-Z]+$/, // Pure alphabetic (no @ symbol)
];
}
/**
* Validate UID format - must be a valid email address
*/
isValidFormat(uid) {
if (!uid || typeof uid !== 'string') {
return {
valid: false,
error: 'UID must be a non-empty string',
code: 'INVALID_TYPE'
};
}
const trimmed = uid.trim();
if (trimmed.length === 0) {
return {
valid: false,
error: 'UID cannot be empty',
code: 'EMPTY_UID'
};
}
// Check against invalid patterns
for (const pattern of this.invalidPatterns) {
if (pattern.test(trimmed)) {
return {
valid: false,
error: `UID matches invalid pattern: ${pattern}`,
code: 'INVALID_PATTERN'
};
}
}
// Validate email format
if (!this.emailRegex.test(trimmed)) {
return {
valid: false,
error: 'UID must be a valid email address',
code: 'INVALID_EMAIL_FORMAT'
};
}
return {
valid: true,
sanitized: trimmed.toLowerCase()
};
}
/**
* Sanitize and validate UID - ensures consistent format
*/
sanitize(uid) {
const validation = this.isValidFormat(uid);
if (!validation.valid) {
console.error('[UID-VALIDATOR] Validation failed:', validation.error, { uid });
return null;
}
return validation.sanitized;
}
/**
* Validate and throw error if invalid
*/
validateOrThrow(uid, context = 'UID') {
const validation = this.isValidFormat(uid);
if (!validation.valid) {
throw new Error(`${context} validation failed: ${validation.error} (${validation.code})`);
}
return validation.sanitized;
}
/**
* Check if a UID needs migration (legacy format)
*/
needsMigration(uid) {
if (!uid || typeof uid !== 'string') {
return false;
}
const trimmed = uid.trim();
// Check if it's already a valid email
if (this.emailRegex.test(trimmed)) {
return false;
}
// Check if it matches known legacy patterns
for (const pattern of this.invalidPatterns) {
if (pattern.test(trimmed)) {
return true;
}
}
return true; // Any non-email format needs migration
}
/**
* Get validation statistics for debugging
*/
getValidationStats(uids) {
const stats = {
total: uids.length,
valid: 0,
invalid: 0,
needsMigration: 0,
errors: {}
};
uids.forEach(uid => {
const validation = this.isValidFormat(uid);
if (validation.valid) {
stats.valid++;
} else {
stats.invalid++;
const code = validation.code || 'UNKNOWN';
stats.errors[code] = (stats.errors[code] || 0) + 1;
}
if (this.needsMigration(uid)) {
stats.needsMigration++;
}
});
return stats;
}
}
// Create singleton instance
export const uidValidator = new UidValidator();
// Legacy exports for backward compatibility
export function validateUidFormat(uid) {
return uidValidator.isValidFormat(uid).valid;
}
export function sanitizeUid(uid) {
return uidValidator.sanitize(uid);
}
export function validateUidOrThrow(uid, context) {
return uidValidator.validateOrThrow(uid, context);
}
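A minimal usage sketch of the validator; the sample addresses are illustrative:

import { uidValidator, sanitizeUid } from './uid-validator.js';

// Full validation result, including an error code on failure.
uidValidator.isValidFormat('DevUser@Example.COM');
// -> { valid: true, sanitized: 'devuser@example.com' }
uidValidator.isValidFormat('devuser');
// -> { valid: false, error: '...', code: 'INVALID_PATTERN' }

// Convenience wrapper: returns the lower-cased email or null.
sanitizeUid('  User@Example.com ');   // 'user@example.com'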

View File

@ -1,266 +1,178 @@
// upload.js — Frontend file upload handler
import { showToast } from "./toast.js";
import { playBeep } from "./sound.js";
import { logToServer } from "./app.js";
// Initialize upload system when DOM is loaded
document.addEventListener('DOMContentLoaded', () => {
// This module handles the file upload functionality, including drag-and-drop,
// progress indication, and post-upload actions like refreshing the file list.
// DOM elements are fetched once the DOM is ready
const dropzone = document.getElementById("user-upload-area");
if (dropzone) {
dropzone.setAttribute("aria-label", "Upload area. Click or drop an audio file to upload.");
}
const fileInput = document.getElementById("fileInputUser");
const fileInfo = document.createElement("div");
fileInfo.id = "file-info";
fileInfo.style.textAlign = "center";
if (fileInput) {
fileInput.parentNode.insertBefore(fileInfo, fileInput.nextSibling);
}
const streamInfo = document.getElementById("stream-info");
const streamUrlEl = document.getElementById("streamUrl");
const spinner = document.getElementById("spinner") || { style: { display: 'none' } };
let abortController;
const fileList = document.getElementById("file-list");
// Upload function
const upload = async (file) => {
if (abortController) abortController.abort();
abortController = new AbortController();
fileInfo.innerText = `📁 ${file.name} - ${(file.size / 1024 / 1024).toFixed(2)} MB`;
if (file.size > 100 * 1024 * 1024) {
showToast("❌ File too large. Please upload a file smaller than 100MB.");
return;
}
spinner.style.display = "block";
showToast('📡 Uploading…');
// Early exit if critical UI elements are missing
if (!dropzone || !fileInput || !fileList) {
// Debug messages disabled
return;
}
fileInput.disabled = true;
dropzone.classList.add("uploading");
const formData = new FormData();
const sessionUid = localStorage.getItem("uid");
formData.append("uid", sessionUid);
formData.append("file", file);
// Attach all event listeners
initializeUploadListeners();
const res = await fetch("/upload", {
signal: abortController.signal,
method: "POST",
body: formData,
});
let data, parseError;
try {
data = await res.json();
} catch (e) {
parseError = e;
}
if (!data) {
showToast("❌ Upload failed: " + (parseError && parseError.message ? parseError.message : "Unknown error"));
spinner.style.display = "none";
fileInput.disabled = false;
dropzone.classList.remove("uploading");
return;
}
if (res.ok) {
if (data.quota && data.quota.used_mb !== undefined) {
const bar = document.getElementById("quota-bar");
const text = document.getElementById("quota-text");
const quotaSec = document.getElementById("quota-meter");
if (bar && text && quotaSec) {
quotaSec.hidden = false;
const used = parseFloat(data.quota.used_mb);
bar.value = used;
bar.max = 100;
text.textContent = `${used.toFixed(1)} MB used`;
}
}
spinner.style.display = "none";
fileInput.disabled = false;
dropzone.classList.remove("uploading");
showToast("✅ Upload successful.");
// Refresh the audio player and file list
const uid = localStorage.getItem("uid");
if (uid) {
try {
if (window.loadProfileStream) {
await window.loadProfileStream(uid);
}
// Refresh the file list
if (window.fetchAndDisplayFiles) {
await window.fetchAndDisplayFiles(uid);
}
// Refresh the stream list to update the last update time
if (window.refreshStreamList) {
await window.refreshStreamList();
}
} catch (e) {
console.error('Failed to refresh:', e);
}
}
playBeep(432, 0.25, "sine");
} else {
if (streamInfo) streamInfo.hidden = true;
if (spinner) spinner.style.display = "none";
if ((data.detail || data.error || "").includes("music")) {
showToast("🎵 Upload rejected: singing or music detected.");
} else {
showToast(`❌ Upload failed: ${data.detail || data.error}`);
}
if (fileInput) fileInput.value = null;
if (dropzone) dropzone.classList.remove("uploading");
if (fileInput) fileInput.disabled = false;
if (streamInfo) streamInfo.classList.remove("visible", "slide-in");
}
};
// Function to fetch and display uploaded files
async function fetchAndDisplayFiles(uidFromParam) {
console.log('[UPLOAD] fetchAndDisplayFiles called with uid:', uidFromParam);
// Get the file list element
const fileList = document.getElementById('file-list');
if (!fileList) {
const errorMsg = 'File list element not found in DOM';
console.error(errorMsg);
return showErrorInUI(errorMsg);
}
// Get UID from parameter, localStorage, or cookie
const uid = uidFromParam || localStorage.getItem('uid') || getCookie('uid');
const authToken = localStorage.getItem('authToken');
const headers = {
'Accept': 'application/json',
};
// Include auth token in headers if available, but don't fail if it's not
// The server should handle both token-based and UID-based auth
if (authToken) {
headers['Authorization'] = `Bearer ${authToken}`;
} else {
console.debug('[UPLOAD] No auth token available, using UID-only authentication');
}
console.log('[UPLOAD] Auth state - UID:', uid, 'Token exists:', !!authToken);
/**
* Main upload function
* @param {File} file - The file to upload
*/
async function upload(file) {
// Get user ID from localStorage or cookie
const uid = localStorage.getItem('uid') || getCookie('uid');
if (!uid) {
console.error('[UPLOAD] No UID found in any source');
fileList.innerHTML = '<li class="error-message">User session expired. Please refresh the page.</li>';
// Debug messages disabled
showToast("You must be logged in to upload files.", "error");
return;
}
// Log the authentication method being used
if (!authToken) {
console.debug('[UPLOAD] No auth token found, using UID-only authentication');
} else {
console.debug('[UPLOAD] Using token-based authentication');
}
// Debug messages disabled
// Show loading state
fileList.innerHTML = '<li class="loading-message">Loading files...</li>';
// Create and display the upload status indicator
const statusDiv = createStatusIndicator(file.name);
fileList.prepend(statusDiv);
const progressBar = statusDiv.querySelector('.progress-bar');
const statusText = statusDiv.querySelector('.status-text');
const formData = new FormData();
formData.append("file", file);
formData.append("uid", uid);
try {
console.log(`[DEBUG] Fetching files for user: ${uid}`);
const response = await fetch(`/me/${uid}`, {
const response = await fetch(`/upload`, {
method: "POST",
body: formData,
headers: {
'Authorization': authToken ? `Bearer ${authToken}` : '',
'Content-Type': 'application/json',
'Accept': 'application/json',
},
});
console.log('[DEBUG] Response status:', response.status, response.statusText);
if (!response.ok) {
const errorText = await response.text();
const errorMsg = `Failed to fetch files: ${response.status} ${response.statusText} - ${errorText}`;
console.error(`[ERROR] ${errorMsg}`);
throw new Error(errorMsg);
}
const data = await response.json();
console.log('[DEBUG] Received files data:', data);
if (!data.files) {
throw new Error('Invalid response format: missing files array');
}
if (data.files.length > 0) {
// Sort files by name
const sortedFiles = [...data.files].sort((a, b) => a.name.localeCompare(b.name));
fileList.innerHTML = sortedFiles.map(file => {
const sizeMB = (file.size / (1024 * 1024)).toFixed(2);
const displayName = file.original_name || file.name;
const isRenamed = file.original_name && file.original_name !== file.name;
return `
<li class="file-item" data-filename="${file.name}">
<div class="file-name" title="${isRenamed ? `Stored as: ${file.name}` : displayName}">
${displayName}
${isRenamed ? `<div class="stored-as"><button class="delete-file" data-filename="${file.name}" data-original-name="${file.original_name}" title="Delete file">🗑️</button></div>` :
`<button class="delete-file" data-filename="${file.name}" data-original-name="${file.original_name}" title="Delete file">🗑️</button>`}
</div>
<span class="file-size">${sizeMB} MB</span>
</li>
`;
}).join('');
} else {
fileList.innerHTML = '<li class="empty-message">No files uploaded yet</li>';
}
// Delete button handling is now managed by dashboard.js
// Update quota display if available
if (data.quota !== undefined) {
const bar = document.getElementById('quota-bar');
const text = document.getElementById('quota-text');
const quotaSec = document.getElementById('quota-meter');
if (bar && text && quotaSec) {
quotaSec.hidden = false;
bar.value = data.quota;
bar.max = 100;
text.textContent = `${data.quota.toFixed(1)} MB`;
}
const errorData = await response.json().catch(() => ({ detail: 'Upload failed with non-JSON response.' }));
throw new Error(errorData.detail || 'Unknown upload error');
}
const result = await response.json();
// Debug messages disabled
playBeep(800, 0.2); // Success beep - higher frequency
// Update UI to show success
statusText.textContent = 'Success!';
progressBar.style.width = '100%';
progressBar.style.backgroundColor = 'var(--success-color)';
// Remove the status indicator after a short delay
setTimeout(() => {
statusDiv.remove();
}, 2000);
// --- Post-Upload Actions ---
await postUploadActions(uid);
} catch (error) {
const errorMessage = `Error loading file list: ${error.message || 'Unknown error'}`;
console.error('[ERROR]', errorMessage, error);
showErrorInUI(errorMessage, fileList);
}
// Helper function to show error messages in the UI
function showErrorInUI(message, targetElement = null) {
const errorHtml = `
<div style="
padding: 10px;
margin: 5px 0;
background: #2a0f0f;
border-left: 3px solid #f55;
color: var(--error-hover);
font-family: monospace;
font-size: 0.9em;
white-space: pre-wrap;
word-break: break-word;
">
<div style="font-weight: bold; color: var(--error);">Error loading files</div>
<div style="margin-top: 5px;">${message}</div>
<div style="margin-top: 10px; font-size: 0.8em; color: var(--text-muted);">
Check browser console for details
</div>
</div>
`;
if (targetElement) {
targetElement.innerHTML = errorHtml;
} else {
// If no target element, try to find it
const fileList = document.getElementById('file-list');
if (fileList) fileList.innerHTML = errorHtml;
}
// Debug messages disabled
playBeep(200, 0.5); // Error beep - lower frequency, longer duration
statusText.textContent = `Error: ${error.message}`;
progressBar.style.backgroundColor = 'var(--error-color)';
statusDiv.classList.add('upload-error');
}
}
// Helper function to get cookie value by name
/**
* Actions to perform after a successful upload.
* @param {string} uid - The user's ID
*/
async function postUploadActions(uid) {
// 1. Refresh the user's personal stream if the function is available
if (window.loadProfileStream) {
await window.loadProfileStream(uid);
}
// 2. Refresh the file list by re-fetching and then displaying.
if (window.fetchAndDisplayFiles) {
// Use email-based UID for file operations if available, fallback to uid
const fileOperationUid = localStorage.getItem('uid') || uid; // uid is now email-based
// Debug messages disabled
await window.fetchAndDisplayFiles(fileOperationUid);
}
// 3. Update quota display after upload
if (window.updateQuotaDisplay) {
const quotaUid = localStorage.getItem('uid') || uid;
// Debug messages disabled
await window.updateQuotaDisplay(quotaUid);
}
// 4. Refresh the public stream list to update the last update time
if (window.refreshStreamList) {
await window.refreshStreamList();
}
}
/**
* Creates the DOM element for the upload status indicator.
* @param {string} fileName - The name of the file being uploaded.
* @returns {HTMLElement}
*/
function createStatusIndicator(fileName) {
const statusDiv = document.createElement('div');
statusDiv.className = 'upload-status-indicator';
statusDiv.innerHTML = `
<div class="file-info">
<span class="file-name">${fileName}</span>
<span class="status-text">Uploading...</span>
</div>
<div class="progress-container">
<div class="progress-bar"></div>
</div>
`;
return statusDiv;
}
/**
* Initializes all event listeners for the upload UI.
*/
function initializeUploadListeners() {
dropzone.addEventListener("click", () => {
fileInput.click();
});
dropzone.addEventListener("dragover", (e) => {
e.preventDefault();
dropzone.classList.add("dragover");
});
dropzone.addEventListener("dragleave", () => {
dropzone.classList.remove("dragover");
});
dropzone.addEventListener("drop", (e) => {
e.preventDefault();
dropzone.classList.remove("dragover");
const file = e.dataTransfer.files[0];
if (file) {
upload(file);
}
});
fileInput.addEventListener("change", (e) => {
const file = e.target.files[0];
if (file) {
upload(file);
}
});
}
/**
* Helper function to get a cookie value by name.
* @param {string} name - The name of the cookie.
* @returns {string|null}
*/
function getCookie(name) {
const value = `; ${document.cookie}`;
const parts = value.split(`; ${name}=`);
@ -268,35 +180,6 @@ document.addEventListener('DOMContentLoaded', () => {
return null;
}
// Export functions for use in other modules
// Make the upload function globally accessible if needed by other scripts
window.upload = upload;
window.fetchAndDisplayFiles = fetchAndDisplayFiles;
if (dropzone && fileInput) {
dropzone.addEventListener("click", () => {
console.log("[DEBUG] Dropzone clicked");
fileInput.click();
console.log("[DEBUG] fileInput.click() called");
});
dropzone.addEventListener("dragover", (e) => {
e.preventDefault();
dropzone.classList.add("dragover");
dropzone.style.transition = "background-color 0.3s ease";
});
dropzone.addEventListener("dragleave", () => {
dropzone.classList.remove("dragover");
});
dropzone.addEventListener("drop", (e) => {
dropzone.classList.add("pulse");
setTimeout(() => dropzone.classList.remove("pulse"), 400);
e.preventDefault();
dropzone.classList.remove("dragover");
const file = e.dataTransfer.files[0];
if (file) upload(file);
});
fileInput.addEventListener("change", (e) => {
const file = e.target.files[0];
if (file) upload(file);
});
}
});


@@ -1,11 +0,0 @@
import smtplib
from email.message import EmailMessage
msg = EmailMessage()
msg["From"] = "test@keisanki.net"
msg["To"] = "oib@bubuit.net"
msg["Subject"] = "Test"
msg.set_content("Hello world")
with smtplib.SMTP("localhost") as smtp:
smtp.send_message(msg)

upload.py

@@ -23,7 +23,8 @@ DATA_ROOT = Path("./data")
@limiter.limit("5/minute")
@router.post("/upload")
async def upload(request: Request, db = Depends(get_db), uid: str = Form(...), file: UploadFile = Form(...)):
def upload(request: Request, uid: str = Form(...), file: UploadFile = Form(...)):
# Import here to avoid circular imports
from log import log_violation
import time
@@ -32,183 +33,259 @@ async def upload(request: Request, db = Depends(get_db), uid: str = Form(...), f
log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Starting upload of {file.filename}")
try:
# First, verify the user exists and is confirmed
user = db.exec(select(User).where((User.username == uid) | (User.email == uid))).first()
if user is not None and not isinstance(user, User) and hasattr(user, "__getitem__"):
user = user[0]
log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] User check - found: {user is not None}, confirmed: {getattr(user, 'confirmed', False) if user else 'N/A'}")
if not user or not hasattr(user, "confirmed") or not user.confirmed:
raise HTTPException(status_code=403, detail="Account not confirmed")
# Check quota before doing any file operations
quota = db.get(UserQuota, uid) or UserQuota(uid=uid, storage_bytes=0)
if quota.storage_bytes >= 100 * 1024 * 1024:
raise HTTPException(status_code=400, detail="Quota exceeded")
# Create user directory if it doesn't exist
user_dir = DATA_ROOT / uid
user_dir.mkdir(parents=True, exist_ok=True)
# Generate a unique filename for the processed file first
import uuid
unique_name = f"{uuid.uuid4()}.opus"
raw_ext = file.filename.split(".")[-1].lower()
raw_path = user_dir / ("raw." + raw_ext)
processed_path = user_dir / unique_name
# Clean up any existing raw files first (except the one we're about to create)
for old_file in user_dir.glob('raw.*'):
# Use the database session context manager to handle the session
with get_db() as db:
try:
if old_file != raw_path: # Don't delete the file we're about to create
old_file.unlink(missing_ok=True)
log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Cleaned up old file: {old_file}")
except Exception as e:
log_violation("UPLOAD_ERROR", request.client.host, uid, f"[{request_id}] Failed to clean up {old_file}: {e}")
# Save the uploaded file temporarily
log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Saving temporary file to {raw_path}")
try:
with open(raw_path, "wb") as f:
content = await file.read()
if not content:
raise ValueError("Uploaded file is empty")
f.write(content)
log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Successfully wrote {len(content)} bytes to {raw_path}")
except Exception as e:
log_violation("UPLOAD_ERROR", request.client.host, uid, f"[{request_id}] Failed to save {raw_path}: {e}")
raise HTTPException(status_code=500, detail=f"Failed to save uploaded file: {e}")
# Ollama music/singing check is disabled for this release
log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Ollama music/singing check is disabled")
try:
convert_to_opus(str(raw_path), str(processed_path))
except Exception as e:
raw_path.unlink(missing_ok=True)
raise HTTPException(status_code=500, detail=str(e))
original_size = raw_path.stat().st_size
raw_path.unlink(missing_ok=True) # cleanup
# First, verify the file was created and has content
if not processed_path.exists() or processed_path.stat().st_size == 0:
raise HTTPException(status_code=500, detail="Failed to process audio file")
# Concatenate all .opus files in random order to stream.opus for public playback
# This is now done after the file is in its final location with log ID
from concat_opus import concat_opus_files
def update_stream_opus():
try:
concat_opus_files(user_dir, user_dir / "stream.opus")
except Exception as e:
# fallback: just use the latest processed file if concat fails
import shutil
stream_path = user_dir / "stream.opus"
shutil.copy2(processed_path, stream_path)
log_violation("STREAM_UPDATE", request.client.host, uid,
f"[fallback] Updated stream.opus with {processed_path}")
# We'll call this after the file is in its final location
# Get the final file size
size = processed_path.stat().st_size
# Start a transaction
try:
# Create a log entry with the original filename
log = UploadLog(
uid=uid,
ip=request.client.host,
filename=file.filename, # Store original filename
processed_filename=unique_name, # Store the processed filename
size_bytes=size
)
db.add(log)
db.flush() # Get the log ID without committing
# Rename the processed file to include the log ID for better tracking
processed_with_id = user_dir / f"{log.id}_{unique_name}"
if processed_path.exists():
# First check if there's already a file with the same UUID but different prefix
for existing_file in user_dir.glob(f"*_{unique_name}"):
if existing_file != processed_path:
log_violation("CLEANUP", request.client.host, uid,
f"[UPLOAD] Removing duplicate file: {existing_file}")
existing_file.unlink(missing_ok=True)
# First, verify the user exists and is confirmed
user = db.query(User).filter(
(User.username == uid) | (User.email == uid)
).first()
# Now do the rename
if processed_path != processed_with_id:
if processed_with_id.exists():
processed_with_id.unlink(missing_ok=True)
processed_path.rename(processed_with_id)
processed_path = processed_with_id
if user is not None and not isinstance(user, User) and hasattr(user, "__getitem__"):
user = user[0]
if not user:
log_violation("UPLOAD", request.client.host, uid, f"User {uid} not found")
raise HTTPException(status_code=404, detail="User not found")
# Only clean up raw.* files, not previously uploaded opus files
for old_temp_file in user_dir.glob('raw.*'):
log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] User check - found: {user is not None}, confirmed: {getattr(user, 'confirmed', False) if user else 'N/A'}")
# Check if user is confirmed
if not hasattr(user, 'confirmed') or not user.confirmed:
raise HTTPException(status_code=403, detail="Account not confirmed")
# Use user.email as the proper UID for quota and directory operations
user_email = user.email
quota = db.get(UserQuota, user_email) or UserQuota(uid=user_email, storage_bytes=0)
if quota.storage_bytes >= 100 * 1024 * 1024:
raise HTTPException(status_code=400, detail="Quota exceeded")
# Create user directory using email (proper UID) - not the uid parameter which could be username
user_dir = DATA_ROOT / user_email
user_dir.mkdir(parents=True, exist_ok=True)
# Generate a unique filename for the processed file first
import uuid
unique_name = f"{uuid.uuid4()}.opus"
raw_ext = file.filename.split(".")[-1].lower()
raw_path = user_dir / ("raw." + raw_ext)
processed_path = user_dir / unique_name
# Clean up any existing raw files first (except the one we're about to create)
for old_file in user_dir.glob('raw.*'):
try:
old_temp_file.unlink(missing_ok=True)
log_violation("CLEANUP", request.client.host, uid, f"[{request_id}] Cleaned up temp file: {old_temp_file}")
if old_file != raw_path: # Don't delete the file we're about to create
old_file.unlink(missing_ok=True)
log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Cleaned up old file: {old_file}")
except Exception as e:
log_violation("CLEANUP_ERROR", request.client.host, uid, f"[{request_id}] Failed to clean up {old_temp_file}: {e}")
# Get or create quota
quota = db.query(UserQuota).filter(UserQuota.uid == uid).first()
if not quota:
quota = UserQuota(uid=uid, storage_bytes=0)
db.add(quota)
log_violation("UPLOAD_ERROR", request.client.host, uid, f"[{request_id}] Failed to clean up {old_file}: {e}")
# Update quota with the new file size
quota.storage_bytes = sum(
f.stat().st_size
for f in user_dir.glob('*.opus')
if f.name != 'stream.opus' and f != processed_path
) + size
# Update public streams
update_public_streams(uid, quota.storage_bytes, db)
# Commit the transaction
db.commit()
# Now that the transaction is committed and files are in their final location,
# update the stream.opus file to include all files
update_stream_opus()
except Exception as e:
db.rollback()
# Clean up the processed file if something went wrong
if processed_path.exists():
processed_path.unlink(missing_ok=True)
raise HTTPException(status_code=500, detail=f"Database error: {str(e)}")
# Save the uploaded file temporarily
log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Saving temporary file to {raw_path}")
try:
with open(raw_path, "wb") as f:
content = file.file.read()
if not content:
raise ValueError("Uploaded file is empty")
f.write(content)
log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Successfully wrote {len(content)} bytes to {raw_path}")
# EARLY DB RECORD CREATION: after upload completes, before processing
early_log = UploadLog(
uid=user_email,
ip=request.client.host,
filename=file.filename, # original filename from user
processed_filename=None, # not yet processed
size_bytes=0 # placeholder to satisfy NOT NULL; updated after processing
)
db.add(early_log)
log_violation("UPLOAD_DEBUG", request.client.host, uid, f"[FORCE FLUSH] Before db.flush() after early_log add")
db.flush()
log_violation("UPLOAD_DEBUG", request.client.host, uid, f"[FORCE FLUSH] After db.flush() after early_log add")
db.commit()
log_violation("UPLOAD_DEBUG", request.client.host, uid, f"[FORCE COMMIT] After db.commit() after early_log add")
early_log_id = early_log.id
log_violation("UPLOAD_DEBUG", request.client.host, uid, f"[DEBUG] Early UploadLog created: id={early_log_id}, filename={file.filename}, UploadLog.filename={early_log.filename}")
except Exception as e:
log_violation("UPLOAD_ERROR", request.client.host, uid, f"[{request_id}] Failed to save {raw_path}: {e}")
raise HTTPException(status_code=500, detail=f"Failed to save uploaded file: {e}")
return {
"filename": file.filename,
"original_size": round(original_size / 1024, 1),
"quota": {
"used_mb": round(quota.storage_bytes / (1024 * 1024), 2)
}
}
# Ollama music/singing check is disabled for this release
log_violation("UPLOAD", request.client.host, uid, f"[{request_id}] Ollama music/singing check is disabled")
try:
convert_to_opus(str(raw_path), str(processed_path))
except Exception as e:
raw_path.unlink(missing_ok=True)
raise HTTPException(status_code=500, detail=str(e))
original_size = raw_path.stat().st_size
raw_path.unlink(missing_ok=True) # cleanup
# First, verify the file was created and has content
if not processed_path.exists() or processed_path.stat().st_size == 0:
raise HTTPException(status_code=500, detail="Failed to process audio file")
# Get the final file size
size = processed_path.stat().st_size
# Concatenate all .opus files in random order to stream.opus for public playback
# This is now done after the file is in its final location with log ID
from concat_opus import concat_opus_files
def update_stream_opus():
try:
concat_opus_files(user_dir, user_dir / "stream.opus")
except Exception as e:
# fallback: just use the latest processed file if concat fails
import shutil
stream_path = user_dir / "stream.opus"
shutil.copy2(processed_path, stream_path)
log_violation("STREAM_UPDATE", request.client.host, uid,
f"[fallback] Updated stream.opus with {processed_path}")
# Start a transaction
try:
# Update the early DB record with processed filename and size
log = db.get(UploadLog, early_log_id)
log.processed_filename = unique_name
log.size_bytes = size
db.add(log)
db.flush() # Ensure update is committed
# Assert that log.filename is still the original filename, never overwritten
if log.filename is None or (log.filename.endswith('.opus') and log.filename == log.processed_filename):
log_violation("UPLOAD_ERROR", request.client.host, uid,
f"[ASSERTION FAILED] UploadLog.filename was overwritten! id={log.id}, filename={log.filename}, processed_filename={log.processed_filename}")
raise RuntimeError(f"UploadLog.filename was overwritten! id={log.id}, filename={log.filename}, processed_filename={log.processed_filename}")
else:
log_violation("UPLOAD_DEBUG", request.client.host, uid,
f"[ASSERTION OK] After update: id={log.id}, filename={log.filename}, processed_filename={log.processed_filename}")
log_violation("UPLOAD_DEBUG", request.client.host, uid, f"[COMMIT] Committing UploadLog for id={log.id}")
db.commit()
log_violation("UPLOAD_DEBUG", request.client.host, uid, f"[COMMIT OK] UploadLog committed for id={log.id}")
# Rename the processed file to include the log ID for better tracking
processed_with_id = user_dir / f"{log.id}_{unique_name}"
if processed_path.exists():
# First check if there's already a file with the same UUID but different prefix
for existing_file in user_dir.glob(f"*_{unique_name}"):
if existing_file != processed_path:
log_violation("CLEANUP", request.client.host, uid,
f"[UPLOAD] Removing duplicate file: {existing_file}")
existing_file.unlink(missing_ok=True)
# Now do the rename
if processed_path != processed_with_id:
if processed_with_id.exists():
processed_with_id.unlink(missing_ok=True)
processed_path.rename(processed_with_id)
processed_path = processed_with_id
# Only clean up raw.* files, not previously uploaded opus files
for old_temp_file in user_dir.glob('raw.*'):
try:
old_temp_file.unlink(missing_ok=True)
log_violation("CLEANUP", request.client.host, uid, f"[{request_id}] Cleaned up temp file: {old_temp_file}")
except Exception as e:
log_violation("CLEANUP_ERROR", request.client.host, uid, f"[{request_id}] Failed to clean up {old_temp_file}: {e}")
# Get or create quota
quota = db.query(UserQuota).filter(UserQuota.uid == user_email).first()
if not quota:
quota = UserQuota(uid=user_email, storage_bytes=0)
db.add(quota)
# Update quota with the new file size
quota.storage_bytes = sum(
f.stat().st_size
for f in user_dir.glob('*.opus')
if f.name != 'stream.opus' and f != processed_path
) + size
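# A sketch of the same computation in isolation, assuming the user_dir, processed_path and
# size variables above:
#     used = sum(f.stat().st_size for f in user_dir.glob("*.opus")
#                if f.name != "stream.opus" and f != processed_path) + size
# Recomputing from disk rather than incrementing means the quota self-corrects if files were
# removed out of band. Worked example: two existing tracks of 3 MiB and 5 MiB plus a new
# 2 MiB upload give 8 MiB + 2 MiB = 10 MiB = 10_485_760 bytes.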
# Update public streams
update_public_streams(user_email, quota.storage_bytes, db)
# The context manager will handle commit/rollback
# Now that the transaction is committed and files are in their final location,
# update the stream.opus file to include all files
update_stream_opus()
return {
"filename": file.filename,
"original_size": round(original_size / 1024, 1),
"quota": {
"used_mb": round(quota.storage_bytes / (1024 * 1024), 2)
}
}
except HTTPException as e:
# Re-raise HTTP exceptions as they are already properly formatted
db.rollback()
raise e
except Exception as e:
# Log the error and return a 500 response
db.rollback()
import traceback
tb = traceback.format_exc()
# Try to log the error
try:
log_violation("UPLOAD_ERROR", request.client.host, uid, f"Error processing upload: {str(e)}\n{tb}")
except Exception:
pass # If logging fails, continue with the error response
# Clean up the processed file if it exists
if 'processed_path' in locals() and processed_path.exists():
processed_path.unlink(missing_ok=True)
raise HTTPException(status_code=500, detail=f"Error processing upload: {str(e)}")
except HTTPException as e:
# Re-raise HTTP exceptions as they are already properly formatted
db.rollback()
raise e
except Exception as e:
# Log the error and return a 500 response
db.rollback()
import traceback
tb = traceback.format_exc()
# Try to log the error
try:
log_violation("UPLOAD_ERROR", request.client.host, uid, f"Error processing upload: {str(e)}\n{tb}")
except Exception:
pass # If logging fails, continue with the error response
# Clean up the processed file if it exists
if 'processed_path' in locals() and processed_path.exists():
processed_path.unlink(missing_ok=True)
raise HTTPException(status_code=500, detail=f"Error processing upload: {str(e)}")
except HTTPException as e:
# Already a JSON response, just re-raise
# Re-raise HTTP exceptions as they are already properly formatted
raise e
except Exception as e:
# Catch any other exceptions that might occur outside the main processing block
import traceback
tb = traceback.format_exc()
# Log and return a JSON error
try:
log_violation("UPLOAD", request.client.host, uid, f"Unexpected error: {type(e).__name__}: {str(e)}\n{tb}")
except Exception:
pass
return {"detail": f"Server error: {type(e).__name__}: {str(e)}"}
log_violation("UPLOAD_ERROR", request.client.host, uid, f"Unhandled error in upload handler: {str(e)}\n{tb}")
except:
pass # If logging fails, continue with the error response
raise HTTPException(status_code=500, detail=f"Internal server error: {str(e)}")
def update_public_streams(uid: str, storage_bytes: int, db: Session):
"""Update the public streams list in the database with the latest user upload info"""
try:
# Get the user's info
user = db.query(User).filter(User.username == uid).first()
# Get the user's info - uid is now email-based
user = db.query(User).filter(User.email == uid).first()
if not user:
print(f"[WARNING] User {uid} not found when updating public streams")
return
@@ -221,7 +298,6 @@ def update_public_streams(uid: str, storage_bytes: int, db: Session):
# Update the public stream info
public_stream.username = user.username
public_stream.display_name = user.display_name or user.username
public_stream.storage_bytes = storage_bytes
public_stream.last_updated = datetime.utcnow()