# Coordinator API – Task Breakdown
## Status (2025-12-22)
- Stage 1 delivery: Core FastAPI service, persistence, job lifecycle, and miner flows implemented under `apps/coordinator-api/`. Receipt signing now includes optional coordinator attestations with history retrieval endpoints.
- Testing & tooling: Pytest suites cover job scheduling, miner flows, and receipt verification; the shared CI script `scripts/ci/run_python_tests.sh` executes these tests in GitHub Actions.
- Documentation: `docs/run.md` and `apps/coordinator-api/README.md` describe configuration for `RECEIPT_SIGNING_KEY_HEX` and `RECEIPT_ATTESTATION_KEY_HEX` plus the receipt history API.
- Service APIs: Implemented specific service endpoints for common GPU workloads (Whisper, Stable Diffusion, LLM inference, FFmpeg, Blender) with typed schemas and validation.
- Service Registry: Created a dynamic service registry framework supporting 30+ GPU services across 6 categories (AI/ML, Media Processing, Scientific Computing, Data Analytics, Gaming, Development Tools).
## Stage 1 (MVP)
- Project Setup
  - Initialize FastAPI app under `apps/coordinator-api/src/app/` with `main.py`, `config.py`, `deps.py`.
  - Add `.env.example` covering host/port, database URL, API key lists, rate limit configuration.
  - Create `pyproject.toml` (or `requirements.txt`) listing FastAPI, uvicorn, pydantic, SQL driver, httpx, redis (optional).
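As a starting point, an `.env.example` for the setup above might look like the following. The variable names here are illustrative assumptions, not confirmed by the codebase (only the receipt key variables appear in the docs).

```env
# Illustrative .env.example — variable names other than the receipt keys
# are assumptions, not taken from the repository.
HOST=0.0.0.0
PORT=8000
DATABASE_URL=sqlite:///./coordinator.db
API_KEYS=key1,key2
RATE_LIMIT_PER_MINUTE=60
RECEIPT_SIGNING_KEY_HEX=
RECEIPT_ATTESTATION_KEY_HEX=
```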
- Models & Persistence
  - Design Pydantic schemas for jobs, miners, constraints, state transitions (`models.py`).
  - Implement DB layer (`db.py`) using SQLite (or Postgres) with tables for jobs, miners, sessions, worker sessions.
  - Provide migrations or a schema creation script.
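A minimal sketch of the schema-creation piece of `db.py`, assuming SQLite via the stdlib driver. The table columns are illustrative assumptions, not the project's actual schema.

```python
# Sketch of a schema-creation helper for db.py.
# Column names and defaults are assumptions for illustration only.
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS jobs (
    id         TEXT PRIMARY KEY,
    state      TEXT NOT NULL DEFAULT 'queued',
    payload    TEXT NOT NULL,
    created_at TEXT NOT NULL DEFAULT (datetime('now'))
);
CREATE TABLE IF NOT EXISTS miners (
    id             TEXT PRIMARY KEY,
    last_heartbeat TEXT,
    capabilities   TEXT
);
"""

def init_db(path: str = ":memory:") -> sqlite3.Connection:
    # executescript runs the whole multi-statement schema in one call
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```

Swapping in Postgres later mainly means replacing the driver and the SQLite-specific `datetime('now')` default.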
- Business Logic
  - Implement `queue.py` and `matching.py` for job scheduling.
  - Create state machine utilities (`states.py`) for job transitions.
  - Add settlement stubs in `settlement.py` for future token accounting.
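The state machine utilities could be as simple as an explicit allowed-transitions table. This is a sketch; the state names are assumptions and not taken from `states.py`.

```python
# Minimal sketch of states.py: job transitions validated against an
# explicit allowed-transitions table. State names are assumptions.
ALLOWED = {
    "queued":    {"assigned", "cancelled"},
    "assigned":  {"running", "failed", "cancelled"},
    "running":   {"succeeded", "failed", "cancelled"},
    "succeeded": set(),
    "failed":    {"queued"},  # e.g. requeue on retry
    "cancelled": set(),
}

def transition(current: str, target: str) -> str:
    """Return the new state, or raise if the transition is illegal."""
    if target not in ALLOWED.get(current, set()):
        raise ValueError(f"illegal transition {current} -> {target}")
    return target
```

Keeping the table declarative makes illegal transitions impossible to miss in review and easy to unit-test exhaustively.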
- Routers
  - Build `/v1/jobs` endpoints (submit, get status, get result, cancel) with idempotency support.
  - Build `/v1/miners` endpoints (register, heartbeat, poll, result, fail, drain).
  - Build `/v1/admin` endpoints (stats, job listing, miner listing) with admin auth.
  - Build `/v1/services` endpoints for specific GPU workloads:
    - `/v1/services/whisper/transcribe` – Audio transcription
    - `/v1/services/stable-diffusion/generate` – Image generation
    - `/v1/services/llm/inference` – Text generation
    - `/v1/services/ffmpeg/transcode` – Video transcoding
    - `/v1/services/blender/render` – 3D rendering
  - Build `/v1/registry` endpoints for dynamic service management:
    - `/v1/registry/services` – List all available services
    - `/v1/registry/services/{id}` – Get service definition
    - `/v1/registry/services/{id}/schema` – Get JSON schema
    - `/v1/registry/services/{id}/requirements` – Get hardware requirements
  - Optionally add WebSocket endpoints under `ws/` for streaming updates.
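The idempotency support on `/v1/jobs` submit can be sketched independently of FastAPI: a submission carrying an already-seen idempotency key replays the original job instead of creating a duplicate. The in-memory store and method names below are assumptions for illustration; the real service would persist this mapping.

```python
# Sketch of idempotent job submission: repeated submits with the same
# idempotency key return the originally created job id. In-memory only.
import uuid
from typing import Optional

class JobStore:
    def __init__(self) -> None:
        self._jobs: dict[str, dict] = {}
        self._by_idem_key: dict[str, str] = {}

    def submit(self, payload: dict, idempotency_key: Optional[str] = None) -> str:
        if idempotency_key and idempotency_key in self._by_idem_key:
            # Replay: return the job created by the first submission.
            return self._by_idem_key[idempotency_key]
        job_id = str(uuid.uuid4())
        self._jobs[job_id] = {"state": "queued", "payload": payload}
        if idempotency_key:
            self._by_idem_key[idempotency_key] = job_id
        return job_id
```

In the router, the key would typically arrive as an `Idempotency-Key` request header.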
- Receipts & Attestations
  - ✅ Persist signed receipts (latest + history), expose `/v1/jobs/{job_id}/receipt(s)` endpoints, and attach optional coordinator attestations when `RECEIPT_ATTESTATION_KEY_HEX` is configured.
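The service signs receipts with a key supplied via `RECEIPT_SIGNING_KEY_HEX`; the actual signature scheme is not documented here, so this sketch uses stdlib HMAC-SHA256 over canonical JSON purely as a stand-in (a production scheme might instead use Ed25519 signatures).

```python
# Illustrative receipt signing/verification sketch. HMAC-SHA256 is a
# stand-in; the real signature scheme is not specified in this document.
import hashlib
import hmac
import json

def sign_receipt(receipt: dict, key_hex: str) -> str:
    # Canonical JSON (sorted keys, no whitespace) so the signature is
    # independent of dict ordering.
    canonical = json.dumps(receipt, sort_keys=True, separators=(",", ":")).encode()
    return hmac.new(bytes.fromhex(key_hex), canonical, hashlib.sha256).hexdigest()

def verify_receipt(receipt: dict, signature: str, key_hex: str) -> bool:
    return hmac.compare_digest(sign_receipt(receipt, key_hex), signature)
```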
- Auth & Rate Limiting
  - Implement dependencies in `deps.py` to validate API keys and optional HMAC signatures.
  - Add rate limiting (e.g., `slowapi`) per key.
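The core of the API-key check in `deps.py` can be sketched as a constant-time comparison against the configured key list. Hardcoding the keys below is an assumption for illustration; in practice they would be parsed from the `.env` configuration.

```python
# Sketch of API-key validation: constant-time comparison against each
# configured key. The hardcoded key set is illustrative only.
import hmac

API_KEYS = {"test-key-1", "test-key-2"}  # in practice loaded from config

def validate_api_key(provided: str) -> bool:
    # compare_digest avoids timing side channels on each comparison
    return any(hmac.compare_digest(provided, k) for k in API_KEYS)
```

In FastAPI this function would sit behind a dependency that reads the key from a request header and raises `HTTPException(401)` on failure.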
- Testing & Examples
  - Create `.http` files or pytest suites for client/miner flows.
  - Document curl examples and quickstart instructions in `apps/coordinator-api/README.md`.
## Stage 2+
- Integrate with blockchain receipts for settlement triggers.
- Add Redis-backed queues for scalability.
- Implement metrics and tracing (Prometheus/OpenTelemetry).
- Support multi-region coordinators with pool hub integration.