# Compare commits

**10 commits**: `c390ba07c1` … `1ef55d1b16`

| SHA1 |
|---|
| 1ef55d1b16 |
| 4c04652291 |
| a0d0d22a4a |
| ee430ebb49 |
| e7af9ac365 |
| 3bdada174c |
| d29a54e98f |
| 8fee73a2ec |
| 4c2ada682a |
| 6223e0b582 |
README.md — 43 additions
@@ -87,6 +87,49 @@ aitbc --help --language german
aitbc marketplace list --translate-to french
```

## 🔗 Blockchain Node (Brother Chain)

A minimal asset-backed blockchain that validates compute receipts and mints AIT tokens.

### ✅ Current Status

- **Chain ID**: `ait-devnet`
- **Consensus**: Proof-of-Authority (single proposer)
- **RPC Endpoint**: `http://localhost:8026/rpc`
- **Health Check**: `http://localhost:8026/health`
- **Metrics**: `http://localhost:8026/metrics` (Prometheus format)
- **Status**: 🟢 Operational and fully functional

### 🚀 Quick Launch

```bash
cd /opt/aitbc/apps/blockchain-node
source .venv/bin/activate
bash scripts/devnet_up.sh
```

The node starts:
- Proposer loop (block production)
- RPC API on port 8026
- Mock coordinator on port 8090 (for testing)
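After launch, the `/health` endpoint is the quickest sanity check. A minimal polling sketch — the exact payload shape (`{"status": "ok", ...}`) is an assumption; adjust the key names to what the node actually returns:

```python
import json
import urllib.request

def is_healthy(payload: dict) -> bool:
    """Interpret a /health response; assumes a {"status": "ok", ...} shape."""
    return payload.get("status") == "ok"

def check_node(url: str = "http://localhost:8026/health") -> bool:
    """Fetch and interpret the health endpoint; returns False on any error."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return is_healthy(json.load(resp))
    except OSError:
        return False
```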
### 🛠️ CLI Interaction

```bash
# Check node status
aitbc blockchain status

# Get chain head
aitbc blockchain head

# Check balance
aitbc blockchain balance --address <your-address>

# Fund an address (devnet faucet)
aitbc blockchain faucet --address <your-address> --amount 1000
```

For full documentation, see: [`apps/blockchain-node/README.md`](./apps/blockchain-node/README.md)

## 🤖 Agent-First Computing

AITBC creates an ecosystem where AI agents are the primary participants:
@@ -1,9 +1,18 @@
CHAIN_ID=ait-devnet
DB_PATH=./data/chain.db
RPC_BIND_HOST=127.0.0.1
RPC_BIND_PORT=8080
P2P_BIND_HOST=0.0.0.0
P2P_BIND_PORT=7070
PROPOSER_KEY=change_me
MINT_PER_UNIT=1000
COORDINATOR_RATIO=0.05
# Blockchain Node Configuration
chain_id=ait-devnet
supported_chains=ait-devnet

rpc_bind_host=0.0.0.0
rpc_bind_port=8006

p2p_bind_host=0.0.0.0
p2p_bind_port=7070

proposer_id=aitbc1-proposer

# Gossip backend: use broadcast with Redis for cross-node communication
gossip_backend=broadcast
gossip_broadcast_url=redis://localhost:6379

# Data
db_path=./data/chain.db
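The new `.env` format above is plain `key=value` lines with `#` comments and blank separators. A minimal parser sketch of that format (the node itself presumably loads it through its settings layer, not this function):

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse simple KEY=value lines, skipping blanks and # comments."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # blank line or comment
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip()
    return settings
```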
@@ -1,25 +1,169 @@
# Blockchain Node
# Blockchain Node (Brother Chain)

## Purpose & Scope

Minimal asset-backed blockchain node that validates compute receipts and mints AIT tokens as described in `docs/bootstrap/blockchain_node.md`.
Minimal asset-backed blockchain node that validates compute receipts and mints AIT tokens.

## Status

Scaffolded. Implementation pending per staged roadmap.
✅ **Operational** — Core blockchain functionality implemented and running.

## Devnet Tooling
### Capabilities
- PoA consensus with single proposer (devnet)
- Transaction processing (TRANSFER, RECEIPT_CLAIM)
- Receipt validation and minting
- Gossip-based peer-to-peer networking (in-memory backend)
- RESTful RPC API (`/rpc/*`)
- Prometheus metrics (`/metrics`)
- Health check endpoint (`/health`)
- SQLite persistence with Alembic migrations

- `scripts/make_genesis.py` — Generate a deterministic devnet genesis file (`data/devnet/genesis.json`).
- `scripts/keygen.py` — Produce throwaway devnet keypairs (printed or written to disk).
- `scripts/devnet_up.sh` — Launch the blockchain node and RPC API with a freshly generated genesis file.
## Quickstart (Devnet)

### Quickstart
The blockchain node is already set up with a virtualenv. To launch:

```bash
cd apps/blockchain-node
python scripts/make_genesis.py --force
cd /opt/aitbc/apps/blockchain-node
source .venv/bin/activate
bash scripts/devnet_up.sh
```

The script sets `PYTHONPATH=src` and starts the proposer loop plus the FastAPI app (via `uvicorn`). Press `Ctrl+C` to stop the devnet.
This will:
1. Generate genesis block at `data/devnet/genesis.json`
2. Start the blockchain node proposer loop (PID logged)
3. Start RPC API on `http://127.0.0.1:8026`
4. Start mock coordinator on `http://127.0.0.1:8090`

Press `Ctrl+C` to stop all processes.

### Manual Startup

If you prefer to start components separately:

```bash
# Terminal 1: Blockchain node
cd /opt/aitbc/apps/blockchain-node
source .venv/bin/activate
PYTHONPATH=src python -m aitbc_chain.main

# Terminal 2: RPC API
cd /opt/aitbc/apps/blockchain-node
source .venv/bin/activate
PYTHONPATH=src uvicorn aitbc_chain.app:app --host 127.0.0.1 --port 8026

# Terminal 3: Mock coordinator (optional, for testing)
cd /opt/aitbc/apps/blockchain-node
source .venv/bin/activate
PYTHONPATH=src uvicorn mock_coordinator:app --host 127.0.0.1 --port 8090
```
## API Endpoints

Once running, the RPC API is available at `http://127.0.0.1:8026/rpc`.

### Health & Metrics
- `GET /health` — Health check with node info
- `GET /metrics` — Prometheus-format metrics

### Blockchain Queries
- `GET /rpc/head` — Current chain head block
- `GET /rpc/blocks/{height}` — Get block by height
- `GET /rpc/blocks-range?start=0&end=10` — Get block range
- `GET /rpc/info` — Chain information
- `GET /rpc/supply` — Token supply info
- `GET /rpc/validators` — List validators
- `GET /rpc/state` — Full state dump

### Transactions
- `POST /rpc/sendTx` — Submit transaction (JSON body: `TransactionRequest`)
- `GET /rpc/transactions` — Latest transactions
- `GET /rpc/tx/{tx_hash}` — Get transaction by hash
- `POST /rpc/estimateFee` — Estimate fee for transaction type

### Receipts (Compute Proofs)
- `POST /rpc/submitReceipt` — Submit receipt claim
- `GET /rpc/receipts` — Latest receipts
- `GET /rpc/receipts/{receipt_id}` — Get receipt by ID

### Accounts
- `GET /rpc/getBalance/{address}` — Account balance
- `GET /rpc/address/{address}` — Address details + txs
- `GET /rpc/addresses` — List active addresses

### Admin
- `POST /rpc/admin/mintFaucet` — Mint devnet funds (requires admin key)

### Sync
- `GET /rpc/syncStatus` — Chain sync status
## CLI Integration

Use the AITBC CLI to interact with the node:

```bash
source /opt/aitbc/cli/venv/bin/activate
aitbc blockchain status
aitbc blockchain head
aitbc blockchain balance --address <your-address>
aitbc blockchain faucet --address <your-address> --amount 1000
```

## Configuration

Edit `.env` in this directory to change:

```
CHAIN_ID=ait-devnet
DB_PATH=./data/chain.db
RPC_BIND_HOST=0.0.0.0
RPC_BIND_PORT=8026
P2P_BIND_HOST=0.0.0.0
P2P_BIND_PORT=7070
PROPOSER_KEY=proposer_key_<timestamp>
MINT_PER_UNIT=1000
COORDINATOR_RATIO=0.05
GOSSIP_BACKEND=memory
```

Restart the node after changes.
## Project Layout

```
blockchain-node/
├── src/aitbc_chain/
│   ├── app.py          # FastAPI app + routes
│   ├── main.py         # Proposer loop + startup
│   ├── config.py       # Settings from .env
│   ├── database.py     # DB init + session mgmt
│   ├── mempool.py      # Transaction mempool
│   ├── gossip/         # P2P message bus
│   ├── consensus/      # PoA proposer logic
│   ├── rpc/            # RPC endpoints
│   ├── contracts/      # Smart contract logic
│   └── models.py       # SQLModel definitions
├── data/
│   └── devnet/
│       └── genesis.json  # Generated by make_genesis.py
├── scripts/
│   ├── make_genesis.py   # Genesis generator
│   ├── devnet_up.sh      # Devnet launcher
│   └── keygen.py         # Keypair generator
└── .env                  # Node configuration
```

## Notes

- The node uses proof-of-authority (PoA) consensus with a single proposer for the devnet.
- Transactions require a valid signature (ed25519) unless running in test mode.
- Receipts represent compute work attestations and mint new AIT tokens to the miner.
- Gossip backend defaults to in-memory; for multi-node networks, configure a Redis backend.
- RPC API does not require authentication on devnet (add in production).

## Troubleshooting

**Port already in use:** Change `RPC_BIND_PORT` in `.env` and restart.

**Database locked:** Ensure only one node instance is running; delete `data/chain.db` if corrupted.

**No blocks proposed:** Check proposer logs; ensure `PROPOSER_KEY` is set and no other proposers are conflicting.

**Mock coordinator not responding:** It's only needed for certain tests; the blockchain node can run standalone.
@@ -1,23 +0,0 @@
{
  "accounts": [
    {
      "address": "ait1faucet000000000000000000000000000000000",
      "balance": 1000000000,
      "nonce": 0
    }
  ],
  "authorities": [
    {
      "address": "ait1devproposer000000000000000000000000000000",
      "weight": 1
    }
  ],
  "chain_id": "ait-devnet",
  "params": {
    "base_fee": 10,
    "coordinator_ratio": 0.05,
    "fee_per_byte": 1,
    "mint_per_unit": 1000
  },
  "timestamp": 1772895053
}
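The deleted genesis `params` drive two simple calculations: a transaction fee of `base_fee + fee_per_byte * size`, and a receipt mint of `mint_per_unit` per compute unit with `coordinator_ratio` skimmed for the coordinator. A sketch of that arithmetic — the exact rounding rule for the coordinator share is an assumption:

```python
def estimate_fee(size_bytes: int, base_fee: int = 10, fee_per_byte: int = 1) -> int:
    """Fee = base_fee + fee_per_byte * size, per the genesis params."""
    return base_fee + fee_per_byte * size_bytes

def mint_split(units: int, mint_per_unit: int = 1000,
               coordinator_ratio: float = 0.05) -> tuple[int, int]:
    """Split a receipt mint into (miner_share, coordinator_share).

    Truncating the coordinator share to an int is an assumption here.
    """
    minted = units * mint_per_unit
    coordinator = int(minted * coordinator_ratio)
    return minted - coordinator, coordinator
```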
@@ -26,6 +26,8 @@ rich = "^13.7.1"
cryptography = "^46.0.5"
asyncpg = ">=0.29.0"
requests = "^2.32.5"
# Pin starlette to a version with Broadcast (removed in 0.38)
starlette = ">=0.37.2,<0.38.0"

[tool.poetry.extras]
uvloop = ["uvloop"]
@@ -16,6 +16,7 @@ from .mempool import init_mempool
from .metrics import metrics_registry
from .rpc.router import router as rpc_router
from .rpc.websocket import router as websocket_router
from .escrow_routes import router as escrow_router

_app_logger = get_logger("aitbc_chain.app")

@@ -128,9 +129,12 @@ def create_app() -> FastAPI:
        allow_headers=["*"],
    )

    # Include routers
    app.include_router(rpc_router, prefix="/rpc", tags=["rpc"])
    app.include_router(websocket_router, prefix="/rpc")
    app.include_router(escrow_router, prefix="/rpc")

    # Metrics and health endpoints
    metrics_router = APIRouter()

    @metrics_router.get("/metrics", response_class=PlainTextResponse, tags=["metrics"], summary="Prometheus metrics")
@@ -7,6 +7,9 @@ from sqlalchemy import event

from .config import settings

# Import all models to ensure they are registered with SQLModel.metadata
from .models import Block, Transaction, Account, Receipt, Escrow  # noqa: F401

_engine = create_engine(f"sqlite:///{settings.db_path}", echo=False)

@event.listens_for(_engine, "connect")
@@ -29,3 +32,6 @@ def init_db() -> None:
def session_scope() -> Session:
    with Session(_engine) as session:
        yield session

# Expose engine for escrow routes
engine = _engine
@@ -2,11 +2,14 @@ from __future__ import annotations

import asyncio
import json
import warnings
from collections import defaultdict
from contextlib import suppress
from dataclasses import dataclass
from typing import Any, Callable, Dict, List, Optional, Set

warnings.filterwarnings("ignore", message="coroutine.* was never awaited", category=RuntimeWarning)

try:
    from starlette.broadcast import Broadcast
except ImportError:  # pragma: no cover - Starlette removed Broadcast in recent versions
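The `try`/`except ImportError` guard above is the standard optional-dependency pattern: attempt the import and fall back gracefully when the backend is absent. A self-contained sketch of the same pattern (module names here are illustrative, not the gossip module's actual fallback):

```python
import importlib

def load_optional(module_name: str, attr: str):
    """Return module_name.attr if importable, else None (optional-dependency pattern)."""
    try:
        module = importlib.import_module(module_name)
        return getattr(module, attr, None)
    except ImportError:
        # Missing backend is tolerated; the caller picks a fallback implementation.
        return None
```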
@@ -155,3 +155,12 @@ class Account(SQLModel, table=True):
    balance: int = 0
    nonce: int = 0
    updated_at: datetime = Field(default_factory=datetime.utcnow)

class Escrow(SQLModel, table=True):
    __tablename__ = "escrow"
    job_id: str = Field(primary_key=True)
    buyer: str = Field(foreign_key="account.address")
    provider: str = Field(foreign_key="account.address")
    amount: int
    created_at: datetime = Field(default_factory=datetime.utcnow)
    released_at: Optional[datetime] = None
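The `Escrow` row models a funds-held-until-release lifecycle: lock the buyer's amount at creation, pay the provider on release. A minimal in-memory sketch of those intended semantics — the actual behavior lives in the escrow routes and may differ:

```python
from dataclasses import dataclass, field

@dataclass
class MiniEscrow:
    """In-memory stand-in for the Escrow table: job_id -> (buyer, provider, amount)."""
    balances: dict[str, int]
    held: dict[str, tuple[str, str, int]] = field(default_factory=dict)

    def create(self, job_id: str, buyer: str, provider: str, amount: int) -> None:
        if self.balances.get(buyer, 0) < amount:
            raise ValueError("insufficient funds")
        self.balances[buyer] -= amount          # lock buyer funds
        self.held[job_id] = (buyer, provider, amount)

    def release(self, job_id: str) -> None:
        buyer, provider, amount = self.held.pop(job_id)
        self.balances[provider] = self.balances.get(provider, 0) + amount
```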
@@ -468,3 +468,7 @@ def create_app() -> FastAPI:


app = create_app()

# Register jobs router
from .routers import jobs as jobs_router
app.include_router(jobs_router.router)
@@ -426,6 +426,26 @@ async def send_payment(
    }


@router.delete("/marketplace/gpu/{gpu_id}")
async def delete_gpu(
    gpu_id: str,
    session: Annotated[Session, Depends(get_session)],
    force: bool = Query(default=False, description="Force delete even if GPU is booked")
) -> Dict[str, Any]:
    """Delete (unregister) a GPU from the marketplace."""
    gpu = _get_gpu_or_404(session, gpu_id)

    if gpu.status == "booked" and not force:
        raise HTTPException(
            status_code=http_status.HTTP_409_CONFLICT,
            detail=f"GPU {gpu_id} is currently booked. Use force=true to delete anyway."
        )

    session.delete(gpu)
    session.commit()
    return {"status": "deleted", "gpu_id": gpu_id}


@router.get("/marketplace/gpu/{gpu_id}/reviews")
async def get_gpu_reviews(
    gpu_id: str,
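The booked-GPU guard in `delete_gpu` reduces to one small decision: a booked GPU may only be deleted when `force=true`; anything else is deletable (a 409 is returned otherwise). Extracting that rule as a pure function makes it unit-testable — a sketch, not the handler's actual structure:

```python
def may_delete(status: str, force: bool) -> bool:
    """A booked GPU may only be deleted with force=true; other statuses are deletable."""
    return force or status != "booked"
```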
@@ -7,6 +7,7 @@ Provides SQLite and PostgreSQL support with connection pooling.
from __future__ import annotations

import os
import logging
from contextlib import contextmanager
from contextlib import asynccontextmanager
from typing import Generator, AsyncGenerator
@@ -15,9 +16,12 @@ from sqlalchemy import create_engine
from sqlalchemy.pool import QueuePool
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine, async_sessionmaker
from sqlalchemy.orm import Session, sessionmaker
from sqlalchemy.exc import OperationalError

from sqlmodel import SQLModel

logger = logging.getLogger(__name__)

from ..config import settings

_engine = None
@@ -64,6 +68,14 @@ def init_db() -> Engine:
    """Initialize database tables and ensure data directory exists."""
    engine = get_engine()

    # DEVELOPMENT ONLY: Try to drop all tables first to ensure clean schema.
    # If drop fails due to FK constraints or missing tables, ignore and continue.
    if "sqlite" in str(engine.url):
        try:
            SQLModel.metadata.drop_all(engine)
        except Exception as e:
            logger.warning(f"Drop all failed (non-fatal): {e}")

    # Ensure data directory exists for SQLite (consistent with blockchain-node pattern)
    if "sqlite" in str(engine.url):
        db_path = engine.url.database
@@ -75,7 +87,17 @@ def init_db() -> Engine:
        data_dir = Path(db_path).parent
        data_dir.mkdir(parents=True, exist_ok=True)

    # DEVELOPMENT: Try create_all; if OperationalError about existing index, ignore
    try:
        SQLModel.metadata.create_all(engine)
    except OperationalError as e:
        if "already exists" in str(e):
            logger.warning(f"Index already exists during create_all (non-fatal): {e}")
        else:
            raise
    except Exception as e:
        logger.error(f"Unexpected error during create_all: {e}")
        raise
    return engine
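The tolerate-"already exists" pattern in `init_db` above can be exercised against plain `sqlite3` — a stripped-down sketch of the same idea without SQLModel:

```python
import sqlite3

def ensure_table(conn: sqlite3.Connection, ddl: str) -> bool:
    """Run DDL, treating 'already exists' as non-fatal. Returns True if created."""
    try:
        conn.execute(ddl)
        return True
    except sqlite3.OperationalError as e:
        if "already exists" in str(e):
            return False  # non-fatal, mirrors init_db's create_all handling
        raise
```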
@@ -10,29 +10,15 @@ import json
from typing import Optional, List, Dict, Any
from datetime import datetime, timedelta

# Import advanced analytics with robust path resolution
# Ensure coordinator-api src is on path for app.services imports
import os
import sys

_services_path = os.environ.get('AITBC_SERVICES_PATH')
if _services_path:
    if os.path.isdir(_services_path):
        if _services_path not in sys.path:
            sys.path.insert(0, _services_path)
    else:
        print(f"Warning: AITBC_SERVICES_PATH set but not a directory: {_services_path}", file=sys.stderr)
else:
    _project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..'))
    _computed_services = os.path.join(_project_root, 'apps', 'coordinator-api', 'src', 'app', 'services')
    if os.path.isdir(_computed_services) and _computed_services not in sys.path:
        sys.path.insert(0, _computed_services)
    else:
        _fallback = '/home/oib/windsurf/aitbc/apps/coordinator-api/src/app/services'
        if os.path.isdir(_fallback) and _fallback not in sys.path:
            sys.path.insert(0, _fallback)
_src_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'apps', 'coordinator-api', 'src'))
if _src_path not in sys.path:
    sys.path.insert(0, _src_path)

try:
    from advanced_analytics import (
    from app.services.advanced_analytics import (
        start_analytics_monitoring, stop_analytics_monitoring, get_dashboard_data,
        create_analytics_alert, get_analytics_summary, advanced_analytics,
        MetricType, Timeframe
@@ -43,8 +29,8 @@ except ImportError as e:

    def _missing(*args, **kwargs):
        raise ImportError(
            f"Required service module 'advanced_analytics' could not be imported: {_import_error}. "
            "Ensure coordinator-api dependencies are installed or set AITBC_SERVICES_PATH."
            f"Required service module 'app.services.advanced_analytics' could not be imported: {_import_error}. "
            "Ensure coordinator-api dependencies are installed and the source directory is accessible."
        )
    start_analytics_monitoring = stop_analytics_monitoring = get_dashboard_data = _missing
    create_analytics_alert = get_analytics_summary = _missing
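The removed `sys.path` setup above (repeated across several CLI modules) follows one precedence: explicit `AITBC_SERVICES_PATH`, then a path computed from the file location, then a hard-coded fallback. That precedence can be captured as a pure function — a simplified sketch, since the originals mutate `sys.path` in place and stop (with a warning) when the env var points at a non-directory rather than falling through:

```python
from typing import Callable, Optional

def resolve_services_path(
    env_path: Optional[str],
    computed: str,
    fallback: str,
    isdir: Callable[[str], bool],
) -> Optional[str]:
    """Pick the services dir: env var first, then computed path, then fallback.

    Simplification: here a bad env path falls through to the next candidate.
    """
    for candidate in (env_path, computed, fallback):
        if candidate and isdir(candidate):
            return candidate
    return None
```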
cli/aitbc_cli/commands/ai.py — 159 additions (new file)
@@ -0,0 +1,159 @@
import os
import subprocess
import sys
import uuid
import click
import uvicorn
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
import httpx


@click.group(name='ai')
def ai_group():
    """AI marketplace commands."""
    pass


@ai_group.command()
@click.option('--port', default=8008, show_default=True, help='Port to listen on')
@click.option('--model', default='qwen3:8b', show_default=True, help='Ollama model name')
@click.option('--wallet', 'provider_wallet', required=True, help='Provider wallet address (for verification)')
@click.option('--marketplace-url', default='http://127.0.0.1:8014', help='Marketplace API base URL')
def serve(port, model, provider_wallet, marketplace_url):
    """Start AI provider daemon (FastAPI server)."""
    click.echo(f"Starting AI provider on port {port}, model {model}, marketplace {marketplace_url}")

    app = FastAPI(title="AI Provider")

    class JobRequest(BaseModel):
        prompt: str
        buyer: str  # buyer wallet address
        amount: int
        txid: str | None = None  # optional transaction id

    class JobResponse(BaseModel):
        result: str
        model: str
        job_id: str | None = None

    @app.get("/health")
    async def health():
        return {"status": "ok", "model": model, "wallet": provider_wallet}

    @app.post("/job")
    async def handle_job(req: JobRequest):
        click.echo(f"Received job from {req.buyer}: {req.prompt[:50]}...")
        # Generate a job_id
        job_id = str(uuid.uuid4())
        # Register job with marketplace (optional, best-effort)
        try:
            async with httpx.AsyncClient() as client:
                create_resp = await client.post(
                    f"{marketplace_url}/v1/jobs",
                    json={
                        "payload": {"prompt": req.prompt, "model": model},
                        "constraints": {},
                        "payment_amount": req.amount,
                        "payment_currency": "AITBC"
                    },
                    headers={"X-Api-Key": ""},  # optional API key
                    timeout=5.0
                )
                if create_resp.status_code in (200, 201):
                    job_data = create_resp.json()
                    job_id = job_data.get("job_id", job_id)
                    click.echo(f"Registered job {job_id} with marketplace")
                else:
                    click.echo(f"Marketplace job registration failed: {create_resp.status_code}", err=True)
        except Exception as e:
            click.echo(f"Warning: marketplace registration skipped: {e}", err=True)
        # Process with Ollama
        try:
            async with httpx.AsyncClient() as client:
                resp = await client.post(
                    "http://127.0.0.1:11434/api/generate",
                    json={"model": model, "prompt": req.prompt, "stream": False},
                    timeout=60.0
                )
                resp.raise_for_status()
                data = resp.json()
                result = data.get("response", "")
        except httpx.HTTPError as e:
            raise HTTPException(status_code=500, detail=f"Ollama error: {e}")
        # Update marketplace with result (if registered)
        try:
            async with httpx.AsyncClient() as client:
                patch_resp = await client.patch(
                    f"{marketplace_url}/v1/jobs/{job_id}",
                    json={"result": result, "state": "completed"},
                    timeout=5.0
                )
                if patch_resp.status_code == 200:
                    click.echo(f"Updated job {job_id} with result")
        except Exception as e:
            click.echo(f"Warning: failed to update job in marketplace: {e}", err=True)
        return JobResponse(result=result, model=model, job_id=job_id)

    uvicorn.run(app, host="0.0.0.0", port=port)


@ai_group.command()
@click.option('--to', required=True, help='Provider host (IP)')
@click.option('--port', default=8008, help='Provider port')
@click.option('--prompt', required=True, help='Prompt to send')
@click.option('--buyer-wallet', 'buyer_wallet', required=True, help='Buyer wallet name (in local wallet store)')
@click.option('--provider-wallet', 'provider_wallet', required=True, help='Provider wallet address (recipient)')
@click.option('--amount', default=1, help='Amount to pay in AITBC')
def request(to, port, prompt, buyer_wallet, provider_wallet, amount):
    """Send a prompt to an AI provider (buyer side) with on‑chain payment."""
    # Helper to get provider balance
    def get_balance():
        res = subprocess.run([
            sys.executable, "-m", "aitbc_cli.main", "blockchain", "balance",
            "--address", provider_wallet
        ], capture_output=True, text=True, check=True)
        for line in res.stdout.splitlines():
            if "Balance:" in line:
                parts = line.split(":")
                return float(parts[1].strip())
        raise ValueError("Balance not found")

    # Step 1: get initial balance
    before = get_balance()
    click.echo(f"Provider balance before: {before}")

    # Step 2: send payment via blockchain CLI (use current Python env)
    if amount > 0:
        click.echo(f"Sending {amount} AITBC from wallet '{buyer_wallet}' to {provider_wallet}...")
        try:
            subprocess.run([
                sys.executable, "-m", "aitbc_cli.main", "blockchain", "send",
                "--from", buyer_wallet,
                "--to", provider_wallet,
                "--amount", str(amount)
            ], check=True, capture_output=True, text=True)
            click.echo("Payment sent.")
        except subprocess.CalledProcessError as e:
            raise click.ClickException(f"Blockchain send failed: {e.stderr}")

    # Step 3: get new balance
    after = get_balance()
    click.echo(f"Provider balance after: {after}")
    delta = after - before
    click.echo(f"Balance delta: {delta}")

    # Step 4: call provider
    url = f"http://{to}:{port}/job"
    payload = {
        "prompt": prompt,
        "buyer": provider_wallet,
        "amount": amount
    }
    try:
        resp = httpx.post(url, json=payload, timeout=30.0)
        resp.raise_for_status()
        data = resp.json()
        click.echo("Result: " + data.get("result", ""))
    except httpx.HTTPError as e:
        raise click.ClickException(f"Request to provider failed: {e}")


if __name__ == '__main__':
    ai_group()
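`get_balance` in the `request` command scrapes the CLI's human-readable output, scanning for a `Balance:` line and splitting on `:`. That parse is worth testing in isolation, since it silently depends on the CLI's output format. A standalone version of the same logic:

```python
def parse_balance(cli_output: str) -> float:
    """Extract the number after 'Balance:' from CLI output (same logic as get_balance)."""
    for line in cli_output.splitlines():
        if "Balance:" in line:
            return float(line.split(":")[1].strip())
    raise ValueError("Balance not found")
```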
@@ -10,29 +10,15 @@ import json
from typing import Optional, List, Dict, Any
from datetime import datetime

# Import AI surveillance system with robust path resolution
# Ensure coordinator-api src is on path for app.services imports
import os
import sys

_services_path = os.environ.get('AITBC_SERVICES_PATH')
if _services_path:
    if os.path.isdir(_services_path):
        if _services_path not in sys.path:
            sys.path.insert(0, _services_path)
    else:
        print(f"Warning: AITBC_SERVICES_PATH set but not a directory: {_services_path}", file=sys.stderr)
else:
    _project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..'))
    _computed_services = os.path.join(_project_root, 'apps', 'coordinator-api', 'src', 'app', 'services')
    if os.path.isdir(_computed_services) and _computed_services not in sys.path:
        sys.path.insert(0, _computed_services)
    else:
        _fallback = '/home/oib/windsurf/aitbc/apps/coordinator-api/src/app/services'
        if os.path.isdir(_fallback) and _fallback not in sys.path:
            sys.path.insert(0, _fallback)
_src_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'apps', 'coordinator-api', 'src'))
if _src_path not in sys.path:
    sys.path.insert(0, _src_path)

try:
    from ai_surveillance import (
    from app.services.ai_surveillance import (
        start_ai_surveillance, stop_ai_surveillance, get_surveillance_summary,
        get_user_risk_profile, list_active_alerts, analyze_behavior_patterns,
        ai_surveillance, SurveillanceType, RiskLevel, AlertPriority
@@ -43,8 +29,8 @@ except ImportError as e:

    def _missing(*args, **kwargs):
        raise ImportError(
            f"Required service module 'ai_surveillance' could not be imported: {_import_error}. "
            "Ensure coordinator-api dependencies are installed or set AITBC_SERVICES_PATH."
            f"Required service module 'app.services.ai_surveillance' could not be imported: {_import_error}. "
            "Ensure coordinator-api dependencies are installed and the source directory is accessible."
        )
    start_ai_surveillance = stop_ai_surveillance = get_surveillance_summary = _missing
    get_user_risk_profile = list_active_alerts = analyze_behavior_patterns = _missing
@@ -10,29 +10,15 @@ import json
from typing import Optional, List, Dict, Any
from datetime import datetime, timedelta

# Import AI trading engine with robust path resolution
# Ensure coordinator-api src is on path for app.services imports
import os
import sys

_services_path = os.environ.get('AITBC_SERVICES_PATH')
if _services_path:
    if os.path.isdir(_services_path):
        if _services_path not in sys.path:
            sys.path.insert(0, _services_path)
    else:
        print(f"Warning: AITBC_SERVICES_PATH set but not a directory: {_services_path}", file=sys.stderr)
else:
    _project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..'))
    _computed_services = os.path.join(_project_root, 'apps', 'coordinator-api', 'src', 'app', 'services')
    if os.path.isdir(_computed_services) and _computed_services not in sys.path:
        sys.path.insert(0, _computed_services)
    else:
        _fallback = '/home/oib/windsurf/aitbc/apps/coordinator-api/src/app/services'
        if os.path.isdir(_fallback) and _fallback not in sys.path:
            sys.path.insert(0, _fallback)
_src_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'apps', 'coordinator-api', 'src'))
if _src_path not in sys.path:
    sys.path.insert(0, _src_path)

try:
    from ai_trading_engine import (
    from app.services.ai_trading_engine import (
        initialize_ai_engine, train_strategies, generate_trading_signals,
        get_engine_status, ai_trading_engine, TradingStrategy
    )
@@ -42,8 +28,8 @@ except ImportError as e:

    def _missing(*args, **kwargs):
        raise ImportError(
            f"Required service module 'ai_trading_engine' could not be imported: {_import_error}. "
            "Ensure coordinator-api dependencies are installed or set AITBC_SERVICES_PATH."
            f"Required service module 'app.services.ai_trading_engine' could not be imported: {_import_error}. "
            "Ensure coordinator-api dependencies are installed and the source directory is accessible."
        )
    initialize_ai_engine = train_strategies = generate_trading_signals = get_engine_status = _missing
    ai_trading_engine = None
@@ -10,41 +10,32 @@ import json
 from typing import Optional, List, Dict, Any
 from datetime import datetime
 
-# Import enterprise integration services using importlib to avoid naming conflicts
-import importlib.util
+# Ensure coordinator-api src is on path for app.services imports
 import os
 import sys
+_src_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'apps', 'coordinator-api', 'src'))
+if _src_path not in sys.path:
+    sys.path.insert(0, _src_path)
 
-_services_path = os.environ.get('AITBC_SERVICES_PATH')
-if _services_path:
-    base_dir = _services_path
-else:
-    _project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..'))
-    base_dir = os.path.join(_project_root, 'apps', 'coordinator-api', 'src', 'app', 'services')
-    if not os.path.isdir(base_dir):
-        base_dir = '/home/oib/windsurf/aitbc/apps/coordinator-api/src/app/services'
+try:
+    from app.services.enterprise_integration import (
+        create_tenant, get_tenant_info, generate_api_key,
+        register_integration, get_system_status, list_tenants,
+        list_integrations
+    )
+    # Get EnterpriseAPIGateway if available
+    import app.services.enterprise_integration as ei_module
+    EnterpriseAPIGateway = getattr(ei_module, 'EnterpriseAPIGateway', None)
+    _import_error = None
+except ImportError as e:
+    _import_error = e
 
-module_path = os.path.join(base_dir, 'enterprise_integration.py')
-if os.path.isfile(module_path):
-    spec = importlib.util.spec_from_file_location("enterprise_integration_service", module_path)
-    ei = importlib.util.module_from_spec(spec)
-    spec.loader.exec_module(ei)
-    create_tenant = ei.create_tenant
-    get_tenant_info = ei.get_tenant_info
-    generate_api_key = ei.generate_api_key
-    register_integration = ei.register_integration
-    get_system_status = ei.get_system_status
-    list_tenants = ei.list_tenants
-    list_integrations = ei.list_integrations
-    EnterpriseAPIGateway = getattr(ei, 'EnterpriseAPIGateway', None)
-else:
-    # Provide stubs if module not found
     def _missing(*args, **kwargs):
         raise ImportError(
-            f"Could not load enterprise_integration.py from {module_path}. "
-            "Ensure coordinator-api services are available or set AITBC_SERVICES_PATH."
+            f"Required service module 'app.services.enterprise_integration' could not be imported: {_import_error}. "
+            "Ensure coordinator-api dependencies are installed and the source directory is accessible."
         )
-    create_tenant = get_tenant_info = generate_api_key = _missing
-    register_integration = get_system_status = list_tenants = list_integrations = _missing
+    create_tenant = get_tenant_info = generate_api_key = register_integration = get_system_status = list_tenants = list_integrations = _missing
     EnterpriseAPIGateway = None
 
 @click.group()
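The import-fallback above boils down to a small pattern: try the real module, and on failure bind the public names to a stub that raises a deferred `ImportError`. A minimal runnable sketch, where `aitbc_hypothetical_service` is a placeholder module name (not part of the repo):

```python
# Guarded import with deferred failure: callers can always reference the
# names; only actually *calling* them raises when the module is absent.
try:
    from aitbc_hypothetical_service import create_tenant  # placeholder module
    _import_error = None
except ImportError as e:
    _import_error = e

    def _missing(*args, **kwargs):
        # Surface the original import failure at first use, not at import time.
        raise ImportError(f"service unavailable: {_import_error}")

    create_tenant = _missing
```

Keeping import time cheap and moving the error to the first real call is what lets the CLI register all of its command groups even when the coordinator-api services are missing.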
@@ -300,6 +300,34 @@ def pay(ctx, booking_id: str, amount: float, from_wallet: str, to_wallet: str, t
     except Exception as e:
         error(f"Payment failed: {e}")
 
+@gpu.command()
+@click.argument("gpu_id")
+@click.option("--force", is_flag=True, help="Force delete even if GPU is booked")
+@click.pass_context
+def unregister(ctx, gpu_id: str, force: bool):
+    """Unregister (delete) a GPU from marketplace"""
+    config = ctx.obj['config']
+
+    try:
+        with httpx.Client() as client:
+            response = client.delete(
+                f"{config.coordinator_url}/v1/marketplace/gpu/{gpu_id}",
+                params={"force": force},
+                headers={"X-Api-Key": config.api_key or ""}
+            )
+
+            if response.status_code == 200:
+                result = response.json()
+                success(f"GPU {gpu_id} unregistered")
+                output(result, ctx.obj['output_format'])
+            else:
+                error(f"Failed to unregister GPU: {response.status_code}")
+                if response.text:
+                    error(response.text)
+    except Exception as e:
+        error(f"Network error: {e}")
+
+
 @gpu.command()
 @click.argument("gpu_id")
 @click.pass_context
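For reference, the request the new `unregister` command issues can be sketched with only the standard library; `http://coordinator.example` is a placeholder URL, and nothing is actually sent in this example:

```python
from urllib.parse import urlencode
from urllib.request import Request

def build_unregister_request(coordinator_url, gpu_id, force, api_key):
    # Mirrors the CLI call: DELETE /v1/marketplace/gpu/{gpu_id}?force=...
    query = urlencode({"force": str(force).lower()})
    return Request(
        f"{coordinator_url}/v1/marketplace/gpu/{gpu_id}?{query}",
        method="DELETE",
        headers={"X-Api-Key": api_key or ""},
    )

req = build_unregister_request("http://coordinator.example", "gpu-123", True, "secret")
```

The `force` flag travels as a query parameter rather than a body, which matches the `params={"force": force}` argument in the httpx call above.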
@@ -10,29 +10,15 @@ import json
 from typing import Optional, List, Dict, Any
 from datetime import datetime, timedelta
 
-# Import regulatory reporting system with robust path resolution
+# Ensure coordinator-api src is on path for app.services imports
 import os
 import sys
 
-_services_path = os.environ.get('AITBC_SERVICES_PATH')
-if _services_path:
-    if os.path.isdir(_services_path):
-        if _services_path not in sys.path:
-            sys.path.insert(0, _services_path)
-    else:
-        print(f"Warning: AITBC_SERVICES_PATH set but not a directory: {_services_path}", file=sys.stderr)
-else:
-    _project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..'))
-    _computed_services = os.path.join(_project_root, 'apps', 'coordinator-api', 'src', 'app', 'services')
-    if os.path.isdir(_computed_services) and _computed_services not in sys.path:
-        sys.path.insert(0, _computed_services)
-    else:
-        _fallback = '/home/oib/windsurf/aitbc/apps/coordinator-api/src/app/services'
-        if os.path.isdir(_fallback) and _fallback not in sys.path:
-            sys.path.insert(0, _fallback)
+_src_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'apps', 'coordinator-api', 'src'))
+if _src_path not in sys.path:
+    sys.path.insert(0, _src_path)
 
 try:
-    from regulatory_reporting import (
+    from app.services.regulatory_reporting import (
         generate_sar, generate_compliance_summary, list_reports,
         regulatory_reporter, ReportType, ReportStatus, RegulatoryBody
     )
@@ -42,8 +28,8 @@ except ImportError as e:
 
     def _missing(*args, **kwargs):
         raise ImportError(
-            f"Required service module 'regulatory_reporting' could not be imported: {_import_error}. "
-            "Ensure coordinator-api dependencies are installed or set AITBC_SERVICES_PATH."
+            f"Required service module 'app.services.regulatory_reporting' could not be imported: {_import_error}. "
+            "Ensure coordinator-api dependencies are installed and the source directory is accessible."
         )
     generate_sar = generate_compliance_summary = list_reports = regulatory_reporter = _missing
@@ -10,32 +10,15 @@ import json
 from typing import Optional, List, Dict, Any
 from datetime import datetime, timedelta
 
-# Import surveillance system with robust path resolution
+# Ensure coordinator-api src is on path for app.services imports
 import os
 import sys
 
-# Determine services path: use AITBC_SERVICES_PATH if set, else compute relative to repo layout
-_services_path = os.environ.get('AITBC_SERVICES_PATH')
-if _services_path:
-    if os.path.isdir(_services_path):
-        if _services_path not in sys.path:
-            sys.path.insert(0, _services_path)
-    else:
-        print(f"Warning: AITBC_SERVICES_PATH set but not a directory: {_services_path}", file=sys.stderr)
-else:
-    # Compute project root relative to this file: cli/aitbc_cli/commands -> 3 levels up to project root
-    _project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..'))
-    _computed_services = os.path.join(_project_root, 'apps', 'coordinator-api', 'src', 'app', 'services')
-    if os.path.isdir(_computed_services) and _computed_services not in sys.path:
-        sys.path.insert(0, _computed_services)
-    else:
-        # Fallback to known hardcoded path if it exists (for legacy deployments)
-        _fallback = '/home/oib/windsurf/aitbc/apps/coordinator-api/src/app/services'
-        if os.path.isdir(_fallback) and _fallback not in sys.path:
-            sys.path.insert(0, _fallback)
+_src_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'apps', 'coordinator-api', 'src'))
+if _src_path not in sys.path:
+    sys.path.insert(0, _src_path)
 
 try:
-    from trading_surveillance import (
+    from app.services.trading_surveillance import (
         start_surveillance, stop_surveillance, get_alerts,
         get_surveillance_summary, AlertLevel
     )
@@ -45,8 +28,8 @@ except ImportError as e:
 
    def _missing(*args, **kwargs):
        raise ImportError(
-            f"Required service module 'trading_surveillance' could not be imported: {_import_error}. "
-            "Ensure coordinator-api dependencies are installed or set AITBC_SERVICES_PATH."
+            f"Required service module 'app.services.trading_surveillance' could not be imported: {_import_error}. "
+            "Ensure coordinator-api dependencies are installed and the source directory is accessible."
        )
    start_surveillance = stop_surveillance = get_alerts = get_surveillance_summary = _missing
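The replacement path logic in these hunks is the same three-line idiom in every command module: resolve `apps/coordinator-api/src` relative to the current file and prepend it to `sys.path` once. A self-contained sketch (the directory names are illustrative, and the target need not exist for the insertion to work):

```python
import os
import sys

def ensure_on_path(*parts):
    # Resolve a directory relative to this file (or the CWD when __file__
    # is unavailable) and prepend it to sys.path exactly once.
    base = os.path.dirname(os.path.abspath(__file__)) if "__file__" in globals() else os.getcwd()
    target = os.path.abspath(os.path.join(base, *parts))
    if target not in sys.path:
        sys.path.insert(0, target)
    return target

services_src = ensure_on_path("..", "..", "..", "apps", "coordinator-api", "src")
```

Unlike the removed `AITBC_SERVICES_PATH` machinery, adding the `src` root (rather than the `services` directory itself) makes `app.services.*` importable as a proper package instead of importing service modules as top-level names.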
@@ -237,7 +220,7 @@ def resolve(ctx, alert_id: str, resolution: str):
     click.echo(f"🔍 Resolving alert: {alert_id}")
 
     # Import surveillance to access resolve function
-    from trading_surveillance import surveillance
+    from app.services.trading_surveillance import surveillance
 
     success = surveillance.resolve_alert(alert_id, resolution)
 
@@ -263,7 +246,7 @@ def test(ctx, symbols: str, duration: int):
     click.echo(f"⏱️ Duration: {duration} seconds")
 
     # Import test function
-    from trading_surveillance import test_trading_surveillance
+    from app.services.trading_surveillance import test_trading_surveillance
 
     # Run test
     asyncio.run(test_trading_surveillance())
 
@@ -289,7 +272,7 @@ def test(ctx, symbols: str, duration: int):
 def status(ctx):
     """Show current surveillance status"""
     try:
-        from trading_surveillance import surveillance
+        from app.services.trading_surveillance import surveillance
 
         click.echo(f"📊 Trading Surveillance Status")
@@ -58,7 +58,16 @@ from .commands.regulatory import regulatory
 from .commands.ai_trading import ai_trading
 from .commands.advanced_analytics import advanced_analytics_group
 from .commands.ai_surveillance import ai_surveillance_group
+
+# AI provider commands
+from .commands.ai import ai_group
+
+# Enterprise integration (optional)
+try:
+    from .commands.enterprise_integration import enterprise_integration_group
+except ImportError:
+    enterprise_integration_group = None
 
 from .commands.explorer import explorer
 from .plugins import plugin, load_plugins
 
@@ -242,6 +251,7 @@ cli.add_command(transfer_control)
 cli.add_command(agent)
 cli.add_command(multimodal)
 cli.add_command(optimize)
+cli.add_command(ai_group)
 # cli.add_command(openclaw)  # Temporarily disabled
 cli.add_command(swarm)
 cli.add_command(chain)
 
@@ -258,6 +268,7 @@ cli.add_command(regulatory)
 cli.add_command(ai_trading)
 cli.add_command(advanced_analytics_group)
 cli.add_command(ai_surveillance_group)
+if enterprise_integration_group is not None:
+    cli.add_command(enterprise_integration_group)
 cli.add_command(explorer)
 cli.add_command(plugin)
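The optional registration in the hunk above follows a try-import/None-sentinel pattern. A stdlib-only sketch, where `aitbc_hypothetical_extras` is a deliberately nonexistent placeholder module and `add_command` stands in for the click group method:

```python
registered = []

def add_command(name):
    # Stand-in for cli.add_command(...)
    registered.append(name)

try:
    from aitbc_hypothetical_extras import extras_group  # optional feature
except ImportError:
    extras_group = None  # sentinel: feature unavailable

add_command("core")
if extras_group is not None:
    add_command("extras")
```

With the optional module absent, only the core command is registered and the CLI still starts cleanly.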
@@ -20,7 +20,7 @@ genesis:
   - address: aitbc1genesis
     balance: '2100000000'
     type: genesis
-  - address: aitbc1aitbc1_simple
+  - address: aitbc1aitbc1_simple_simple
     balance: '500'
     type: gift
 metadata:
@@ -118,7 +118,13 @@ dependencies = [
     "tabulate==0.9.0",
     "colorama==0.4.6",
     "python-dotenv==1.0.0",
-    "asyncpg==0.31.0"
+    "asyncpg==0.31.0",
+    # Dependencies for service module imports (coordinator-api services)
+    "numpy>=1.26.0",
+    "pandas>=2.0.0",
+    "aiohttp>=3.9.0",
+    "fastapi>=0.111.0",
+    "uvicorn[standard]>=0.30.0"
 ]
 classifiers = [
     "Development Status :: 4 - Beta",
@@ -162,11 +168,12 @@ requires = ["setuptools>=61.0", "wheel"]
 build-backend = "setuptools.build_meta"
 
 [tool.setuptools.packages.find]
-where = ["cli"]
-include = ["aitbc_cli*"]
+where = ["cli", "apps/coordinator-api"]
+include = ["aitbc_cli*", "aitbc*"]
 
 [tool.setuptools.package-dir]
 "aitbc_cli" = "cli/aitbc_cli"
+"aitbc" = "apps/coordinator-api/aitbc"
 
 [dependency-groups]
 dev = [