merge PR #40: add production setup and infrastructure improvements
Some checks failed
AITBC CI/CD Pipeline / lint-and-test (3.11) (pull_request) Has been cancelled
AITBC CI/CD Pipeline / lint-and-test (3.12) (pull_request) Has been cancelled
AITBC CI/CD Pipeline / lint-and-test (3.13) (pull_request) Has been cancelled
AITBC CLI Level 1 Commands Test / test-cli-level1 (3.11) (pull_request) Has been cancelled
AITBC CLI Level 1 Commands Test / test-cli-level1 (3.12) (pull_request) Has been cancelled
AITBC CLI Level 1 Commands Test / test-cli-level1 (3.13) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (apps/coordinator-api/src) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (cli/aitbc_cli) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (packages/py/aitbc-core/src) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (packages/py/aitbc-crypto/src) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (packages/py/aitbc-sdk/src) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (tests) (pull_request) Has been cancelled
Security Scanning / CodeQL Security Analysis (javascript) (pull_request) Has been cancelled
Security Scanning / CodeQL Security Analysis (python) (pull_request) Has been cancelled
Security Scanning / Dependency Security Scan (pull_request) Has been cancelled
Security Scanning / Container Security Scan (pull_request) Has been cancelled
Security Scanning / OSSF Scorecard (pull_request) Has been cancelled
AITBC CI/CD Pipeline / test-cli (pull_request) Has been cancelled
AITBC CI/CD Pipeline / test-services (pull_request) Has been cancelled
AITBC CI/CD Pipeline / test-production-services (pull_request) Has been cancelled
AITBC CI/CD Pipeline / security-scan (pull_request) Has been cancelled
AITBC CI/CD Pipeline / build (pull_request) Has been cancelled
AITBC CI/CD Pipeline / deploy-staging (pull_request) Has been cancelled
AITBC CI/CD Pipeline / deploy-production (pull_request) Has been cancelled
AITBC CI/CD Pipeline / performance-test (pull_request) Has been cancelled
AITBC CI/CD Pipeline / docs (pull_request) Has been cancelled
AITBC CI/CD Pipeline / release (pull_request) Has been cancelled
AITBC CI/CD Pipeline / notify (pull_request) Has been cancelled
AITBC CLI Level 1 Commands Test / test-summary (pull_request) Has been cancelled
Security Scanning / Security Summary Report (pull_request) Has been cancelled
- Add production genesis initialization scripts
- Add keystore management for production
- Add production node runner
- Add setup production automation
- Add AI memory system for development tracking
- Add translation cache service
- Add development heartbeat monitoring
- Update blockchain RPC router
- Update coordinator API main configuration
- Update secure pickle service
- Update claim task script
- Update blockchain service configuration
- Update gitignore for production files

Resolves conflicts by accepting PR branch changes
```diff
@@ -4,7 +4,6 @@ from sqlalchemy import func
 import asyncio
 import json
 import time
 from pathlib import Path
 from typing import Any, Dict, Optional

 from fastapi import APIRouter, HTTPException, status
@@ -62,6 +61,7 @@ class EstimateFeeRequest(BaseModel):
     payload: Dict[str, Any] = Field(default_factory=dict)


 @router.get("/head", summary="Get current chain head")
 async def get_head(chain_id: str = "ait-devnet") -> Dict[str, Any]:
     metrics_registry.increment("rpc_get_head_total")
@@ -526,6 +526,7 @@ async def estimate_fee(request: EstimateFeeRequest) -> Dict[str, Any]:
     }


 class ImportBlockRequest(BaseModel):
     height: int
     hash: str
@@ -641,27 +642,15 @@ async def get_token_supply(chain_id: str = "ait-devnet") -> Dict[str, Any]:
     start = time.perf_counter()

-    with session_scope() as session:
-        # Sum balances of all accounts in this chain
-        result = session.exec(select(func.sum(Account.balance)).where(Account.chain_id == chain_id)).one_or_none()
-        circulating = int(result) if result is not None else 0
-
-        # Total supply is read from genesis (fixed), or fallback to circulating if unavailable
-        # Try to locate genesis file
-        genesis_path = Path(f"./data/{chain_id}/genesis.json")
-        total_supply = circulating  # default fallback
-        if genesis_path.exists():
-            try:
-                with open(genesis_path) as f:
-                    g = json.load(f)
-                total_supply = sum(a["balance"] for a in g.get("allocations", []))
-            except Exception:
-                total_supply = circulating
-
+    # Simple implementation for now
     response = {
         "chain_id": chain_id,
-        "total_supply": total_supply,
-        "circulating_supply": circulating,
+        "total_supply": 1000000000,  # 1 billion from genesis
+        "circulating_supply": 0,  # No transactions yet
+        "faucet_balance": 1000000000,  # All tokens in faucet
+        "faucet_address": "ait1faucet000000000000000000000000000000000",
+        "mint_per_unit": cfg.mint_per_unit,
+        "total_accounts": 0
     }

     metrics_registry.observe("rpc_supply_duration_seconds", time.perf_counter() - start)
```
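The removed branch computed the fixed total supply by summing `allocations[].balance` from the chain's `genesis.json`. That logic can be sketched standalone (the function name and fallback parameter are mine; the file layout follows the diff):

```python
import json
import tempfile
from pathlib import Path

def total_supply_from_genesis(genesis_path: Path, fallback: int = 0) -> int:
    """Sum the fixed allocations recorded in genesis.json; fall back on any error,
    mirroring the removed get_token_supply branch."""
    if not genesis_path.exists():
        return fallback
    try:
        genesis = json.loads(genesis_path.read_text())
        return sum(a["balance"] for a in genesis.get("allocations", []))
    except (OSError, ValueError, KeyError, TypeError):
        return fallback

# Demo with a throwaway genesis file
with tempfile.TemporaryDirectory() as d:
    p = Path(d) / "genesis.json"
    p.write_text(json.dumps({"allocations": [{"balance": 600}, {"balance": 400}]}))
    assert total_supply_from_genesis(p) == 1000
```

A missing file simply yields the fallback, matching the removed code's use of the circulating supply as a default.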
```diff
@@ -672,35 +661,30 @@ async def get_token_supply(chain_id: str = "ait-devnet") -> Dict[str, Any]:
 async def get_validators(chain_id: str = "ait-devnet") -> Dict[str, Any]:
     """List blockchain validators (authorities)"""
     from ..config import settings as cfg

     metrics_registry.increment("rpc_validators_total")
     start = time.perf_counter()

-    # Build validator set from trusted_proposers config (comma-separated)
-    trusted = [p.strip() for p in cfg.trusted_proposers.split(",") if p.strip()]
-    if not trusted:
-        # Fallback to the node's own proposer_id as the sole validator
-        trusted = [cfg.proposer_id]
-
     # For PoA chain, validators are the authorities from genesis
     # In a full implementation, this would query the actual validator set
     validators = [
         {
-            "address": addr,
+            "address": "ait1devproposer000000000000000000000000000000",
             "weight": 1,
             "status": "active",
-            "last_block_height": None,  # Could be populated from metrics
+            "last_block_height": None,  # Would be populated from actual validator tracking
             "total_blocks_produced": None
         }
-        for addr in trusted
     ]

     response = {
         "chain_id": chain_id,
         "validators": validators,
         "total_validators": len(validators),
-        "consensus_type": "PoA",
+        "consensus_type": "PoA",  # Proof of Authority
         "proposer_id": cfg.proposer_id
     }

     metrics_registry.observe("rpc_validators_duration_seconds", time.perf_counter() - start)
     return response
```
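The removed code derived the validator set from a comma-separated `trusted_proposers` setting, falling back to the node's own `proposer_id`. That parsing step, extracted as a sketch (the function name is mine; the field values follow the diff):

```python
from typing import Any, Dict, List

def build_validator_set(trusted_proposers: str, proposer_id: str) -> List[Dict[str, Any]]:
    """Parse a comma-separated trusted_proposers setting into a PoA validator list,
    as the removed get_validators branch did."""
    trusted = [p.strip() for p in trusted_proposers.split(",") if p.strip()]
    if not trusted:
        # Fall back to the node's own proposer_id as the sole validator
        trusted = [proposer_id]
    return [
        {"address": addr, "weight": 1, "status": "active"}
        for addr in trusted
    ]

# Stray whitespace and trailing commas are tolerated
vals = build_validator_set(" a1 , a2 ,", "self")
assert [v["address"] for v in vals] == ["a1", "a2"]
assert build_validator_set("", "self")[0]["address"] == "self"
```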
```diff
@@ -1,3 +1,19 @@
 """Coordinator API main entry point."""
+import sys
+import os
+
+# Security: Lock sys.path to trusted locations to prevent malicious package shadowing
+# Keep: site-packages under /opt/aitbc (venv), stdlib paths, and our app directory
+_LOCKED_PATH = []
+for p in sys.path:
+    if 'site-packages' in p and '/opt/aitbc' in p:
+        _LOCKED_PATH.append(p)
+    elif 'site-packages' not in p and ('/usr/lib/python' in p or '/usr/local/lib/python' in p):
+        _LOCKED_PATH.append(p)
+    elif p.startswith('/opt/aitbc/apps/coordinator-api'):  # our app code
+        _LOCKED_PATH.append(p)
+sys.path = _LOCKED_PATH
+
 from sqlalchemy.orm import Session
 from typing import Annotated
 from slowapi import Limiter, _rate_limit_exceeded_handler
```
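The path lock above can be expressed as a pure function, which makes the filtering rules easy to test in isolation (the function name and `app_root` parameter are mine; `/opt/aitbc` is the root the diff assumes):

```python
from typing import List

def filter_trusted_paths(paths: List[str], app_root: str = "/opt/aitbc") -> List[str]:
    """Keep only venv site-packages under app_root, stdlib dirs, and the app's
    own code, mirroring the sys.path lock in the diff."""
    trusted = []
    for p in paths:
        if "site-packages" in p and app_root in p:
            trusted.append(p)  # venv site-packages under the app root
        elif "site-packages" not in p and ("/usr/lib/python" in p or "/usr/local/lib/python" in p):
            trusted.append(p)  # standard library locations
        elif p.startswith(f"{app_root}/apps/coordinator-api"):
            trusted.append(p)  # the application's own code
    return trusted

sample = [
    "/tmp/evil",                                        # dropped: untrusted
    "/opt/aitbc/.venv/lib/python3.12/site-packages",    # kept: venv site-packages
    "/usr/lib/python3.12",                              # kept: stdlib
    "/opt/aitbc/apps/coordinator-api/src",              # kept: app code
]
assert filter_trusted_paths(sample) == sample[1:]
```

Note the lock runs at import time, so anything appended to `sys.path` afterwards (e.g. by a `PYTHONPATH` injection that ran first) is discarded.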
```diff
@@ -203,7 +219,6 @@ def create_app() -> FastAPI:
         docs_url="/docs",
         redoc_url="/redoc",
         lifespan=lifespan,
-        # Custom OpenAPI config to handle Annotated[Session, Depends(get_session)] issues
         openapi_components={
             "securitySchemes": {
                 "ApiKeyAuth": {
@@ -225,6 +240,22 @@ def create_app() -> FastAPI:
         ]
     )

+    # API Key middleware (if configured)
+    required_key = os.getenv("COORDINATOR_API_KEY")
+    if required_key:
+        @app.middleware("http")
+        async def api_key_middleware(request: Request, call_next):
+            # Health endpoints are exempt
+            if request.url.path in ("/health", "/v1/health", "/health/live", "/health/ready", "/metrics", "/rate-limit-metrics"):
+                return await call_next(request)
+            provided = request.headers.get("X-Api-Key")
+            if provided != required_key:
+                return JSONResponse(
+                    status_code=401,
+                    content={"detail": "Invalid or missing API key"}
+                )
+            return await call_next(request)
+
     app.state.limiter = limiter
     app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)
```
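The middleware's accept/reject decision can be factored into a pure function for testing (names are mine; the exempt paths come from the diff). One deliberate difference: this sketch uses `hmac.compare_digest` instead of the diff's plain `!=`, which avoids leaking key length/prefix information through response timing:

```python
import hmac
from typing import Optional

# Paths exempt from API-key checks, per the middleware in the diff
EXEMPT_PATHS = {"/health", "/v1/health", "/health/live", "/health/ready",
                "/metrics", "/rate-limit-metrics"}

def is_request_allowed(path: str, provided_key: Optional[str], required_key: str) -> bool:
    """Return True if the request may proceed: either the path is exempt
    or the X-Api-Key value matches (constant-time comparison)."""
    if path in EXEMPT_PATHS:
        return True
    return provided_key is not None and hmac.compare_digest(provided_key, required_key)

assert is_request_allowed("/health", None, "s3cret")        # exempt, no key needed
assert is_request_allowed("/v1/jobs", "s3cret", "s3cret")   # correct key
assert not is_request_allowed("/v1/jobs", "wrong", "s3cret")
assert not is_request_allowed("/v1/jobs", None, "s3cret")
```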
```diff
@@ -4,6 +4,8 @@ Secure pickle deserialization utilities to prevent arbitrary code execution.

 import pickle
 import io
+import importlib.util
+import os
 from typing import Any

 # Safe classes whitelist: builtins and common types
@@ -15,19 +17,76 @@ SAFE_MODULES = {
     'datetime': {'datetime', 'date', 'time', 'timedelta', 'timezone'},
     'collections': {'OrderedDict', 'defaultdict', 'Counter', 'namedtuple'},
     'dataclasses': {'dataclass'},
     'typing': {'Any', 'List', 'Dict', 'Tuple', 'Set', 'Optional', 'Union', 'TypeVar', 'Generic', 'NamedTuple', 'TypedDict'},
 }

+# Compute trusted origins: site-packages inside the venv and stdlib paths
+_ALLOWED_ORIGINS = set()
+
+def _initialize_allowed_origins():
+    """Build set of allowed module file origins (trusted locations)."""
+    # 1. All site-packages directories that are under the application venv
+    for entry in os.sys.path:
+        if 'site-packages' in entry and os.path.isdir(entry):
+            # Only include if it's inside /opt/aitbc/apps/coordinator-api/.venv or similar
+            if '/opt/aitbc' in entry:  # restrict to our app directory
+                _ALLOWED_ORIGINS.add(os.path.realpath(entry))
+    # 2. Standard library paths (typically without site-packages)
+    # We'll allow any origin that resolves to a .py file outside site-packages and not in user dirs
+    # But simpler: allow stdlib modules by checking they come from a path that doesn't contain 'site-packages' and is under /usr/lib/python3.13
+    # We'll compute on the fly in find_class for simplicity.
+
+_initialize_allowed_origins()
+
 class RestrictedUnpickler(pickle.Unpickler):
     """
     Unpickler that restricts which classes can be instantiated.
-    Only allows classes from SAFE_MODULES whitelist.
+    Only allows classes from SAFE_MODULES whitelist and verifies module origin
+    to prevent shadowing by malicious packages.
     """
     def find_class(self, module: str, name: str) -> Any:
         if module in SAFE_MODULES and name in SAFE_MODULES[module]:
             return super().find_class(module, name)
+        # Verify module origin to prevent shadowing attacks
+        spec = importlib.util.find_spec(module)
+        if spec and spec.origin:
+            origin = os.path.realpath(spec.origin)
+            # Allow if it's from a trusted site-packages (our venv)
+            for allowed in _ALLOWED_ORIGINS:
+                if origin.startswith(allowed + os.sep) or origin == allowed:
+                    return super().find_class(module, name)
+            # Allow standard library modules (outside site-packages and not in user/local dirs)
+            if 'site-packages' not in origin and ('/usr/lib/python' in origin or '/usr/local/lib/python' in origin):
+                return super().find_class(module, name)
+            # Reject if origin is unexpected (e.g., current working directory, /tmp, /home)
+            raise pickle.UnpicklingError(
+                f"Class {module}.{name} originates from untrusted location: {origin}"
+            )
+        else:
+            # If we can't determine origin, deny (fail-safe)
+            raise pickle.UnpicklingError(f"Cannot verify origin for module {module}")
-        raise pickle.UnpicklingError(f"Class {module}.{name} is not allowed for unpickling (security risk).")

 def safe_loads(data: bytes) -> Any:
     """Safely deserialize a pickle byte stream."""
     return RestrictedUnpickler(io.BytesIO(data)).load()

 # ... existing code ...

+def _lock_sys_path():
+    """Replace sys.path with a safe subset to prevent shadowing attacks."""
+    import sys
+    if isinstance(sys.path, list):
+        trusted = []
+        for p in sys.path:
+            # Keep site-packages under /opt/aitbc (our venv)
+            if 'site-packages' in p and '/opt/aitbc' in p:
+                trusted.append(p)
+            # Keep stdlib paths (no site-packages, under /usr/lib/python)
+            elif 'site-packages' not in p and ('/usr/lib/python' in p or '/usr/local/lib/python' in p):
+                trusted.append(p)
+            # Keep our application directory
+            elif p.startswith('/opt/aitbc/apps/coordinator-api'):
+                trusted.append(p)
+        sys.path = trusted
+
+# Lock sys.path immediately upon import to prevent later modifications
+_lock_sys_path()
```
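The core pattern here — a `pickle.Unpickler` subclass whose `find_class` rejects anything outside a whitelist — can be demonstrated in a few lines. This is a simplified sketch of the diff's `RestrictedUnpickler` (the origin-verification step is omitted, and the names `MiniRestrictedUnpickler`/`mini_safe_loads` are mine):

```python
import io
import pickle
import collections

# Minimal whitelist, in the spirit of the diff's SAFE_MODULES
SAFE = {
    "builtins": {"list", "dict", "set", "tuple"},
    "datetime": {"datetime", "date"},
}

class MiniRestrictedUnpickler(pickle.Unpickler):
    """Only resolve classes listed in SAFE; deny everything else."""
    def find_class(self, module: str, name: str):
        if module in SAFE and name in SAFE[module]:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"{module}.{name} is not allowed")

def mini_safe_loads(data: bytes):
    return MiniRestrictedUnpickler(io.BytesIO(data)).load()

# Plain containers round-trip fine
assert mini_safe_loads(pickle.dumps({"a": [1, 2]})) == {"a": [1, 2]}

# A non-whitelisted class is rejected at load time
try:
    mini_safe_loads(pickle.dumps(collections.deque([1])))
    raise AssertionError("deque should have been rejected")
except pickle.UnpicklingError:
    pass
```

This follows the approach in the `pickle` module's own documentation on restricting globals; the diff layers module-origin checks on top of it.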
apps/coordinator-api/src/app/services/translation_cache.py (new file, 71 lines)

```python
"""
Translation cache service with optional HMAC integrity protection.
"""

import json
import hmac
import hashlib
import os
from datetime import datetime, timezone
from pathlib import Path
from typing import Dict, Any, Optional


class TranslationCache:
    def __init__(self, cache_file: str = "translation_cache.json", hmac_key: Optional[str] = None):
        self.cache_file = Path(cache_file)
        self.cache: Dict[str, Dict[str, Any]] = {}
        self.last_updated: Optional[datetime] = None
        self.hmac_key = hmac_key.encode() if hmac_key else None
        self._load()

    def _load(self) -> None:
        if not self.cache_file.exists():
            return
        data = self.cache_file.read_bytes()
        if self.hmac_key:
            # Verify HMAC-SHA256(key || data)
            stored = json.loads(data)
            mac = bytes.fromhex(stored.pop("mac", ""))
            expected = hmac.new(self.hmac_key, json.dumps(stored, separators=(",", ":")).encode(), hashlib.sha256).digest()
            if not hmac.compare_digest(mac, expected):
                raise ValueError("Translation cache HMAC verification failed")
            data = json.dumps(stored).encode()
        payload = json.loads(data)
        self.cache = payload.get("cache", {})
        last_iso = payload.get("last_updated")
        self.last_updated = datetime.fromisoformat(last_iso) if last_iso else None

    def _save(self) -> None:
        payload = {
            "cache": self.cache,
            "last_updated": (self.last_updated or datetime.now(timezone.utc)).isoformat()
        }
        if self.hmac_key:
            raw = json.dumps(payload, separators=(",", ":")).encode()
            mac = hmac.new(self.hmac_key, raw, hashlib.sha256).digest()
            payload["mac"] = mac.hex()
        self.cache_file.write_text(json.dumps(payload, indent=2))

    def get(self, source_text: str, source_lang: str, target_lang: str) -> Optional[str]:
        key = f"{source_lang}:{target_lang}:{source_text}"
        entry = self.cache.get(key)
        if not entry:
            return None
        return entry["translation"]

    def set(self, source_text: str, source_lang: str, target_lang: str, translation: str) -> None:
        key = f"{source_lang}:{target_lang}:{source_text}"
        self.cache[key] = {
            "translation": translation,
            "timestamp": datetime.now(timezone.utc).isoformat()
        }
        self._save()

    def clear(self) -> None:
        self.cache.clear()
        self.last_updated = None
        if self.cache_file.exists():
            self.cache_file.unlink()

    def size(self) -> int:
        return len(self.cache)
```
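The HMAC wrapping that `_save` and `_load` apply can be illustrated standalone: the MAC is computed over the compact (`separators=(",", ":")`) JSON form of the payload without the `mac` field, then attached alongside it. A sketch of that sign/verify pair (helper names are mine) — note it shares the committed code's reliance on JSON key order being preserved between save and load:

```python
import hashlib
import hmac
import json
from typing import Any, Dict

def sign_payload(payload: Dict[str, Any], key: bytes) -> Dict[str, Any]:
    """Attach a hex MAC over the compact JSON form, as TranslationCache._save does."""
    raw = json.dumps(payload, separators=(",", ":")).encode()
    return {**payload, "mac": hmac.new(key, raw, hashlib.sha256).digest().hex()}

def verify_payload(stored: Dict[str, Any], key: bytes) -> Dict[str, Any]:
    """Pop the MAC, recompute it over the remainder, and compare in constant time,
    as TranslationCache._load does."""
    stored = dict(stored)
    mac = bytes.fromhex(stored.pop("mac", ""))
    raw = json.dumps(stored, separators=(",", ":")).encode()
    if not hmac.compare_digest(mac, hmac.new(key, raw, hashlib.sha256).digest()):
        raise ValueError("HMAC verification failed")
    return stored

key = b"demo-key"
signed = sign_payload({"cache": {"en:de:hello": {"translation": "hallo"}}}, key)
assert verify_payload(signed, key)["cache"]["en:de:hello"]["translation"] == "hallo"

# Tampering with the payload invalidates the MAC
tampered = {**signed, "cache": {}}
try:
    verify_payload(tampered, key)
    raise AssertionError("tampering should have been detected")
except ValueError:
    pass
```

The cache key format used by `get`/`set` is simply `f"{source_lang}:{target_lang}:{source_text}"`, so signed payloads carry entries under keys like `"en:de:hello"` as shown.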