chore: replace print() with proper logging; add jitter to automation #25

Closed
aitbc wants to merge 2 commits from aitbc/23-remove-print-statements into main
31 changed files with 759 additions and 249 deletions
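The "add jitter to automation" half of the title is not visible in the excerpted hunks below. For review context, a common shape for that change is a randomized sleep around a base period; everything here (`BASE_INTERVAL`, `run_once`) is illustrative, not taken from this PR:

```python
import asyncio
import random

BASE_INTERVAL = 30.0  # hypothetical base period between automation runs
JITTER = 0.2          # spread of +/-20% around the base period

def next_delay(base: float = BASE_INTERVAL, jitter: float = JITTER) -> float:
    # Uniform jitter keeps periodic tasks from synchronizing ("thundering herd")
    return base * random.uniform(1.0 - jitter, 1.0 + jitter)

async def automation_loop(run_once, iterations: int) -> None:
    # Sleep a jittered interval after each run instead of a fixed one
    for _ in range(iterations):
        await run_once()
        await asyncio.sleep(next_delay())
```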

View File

@@ -87,6 +87,49 @@ aitbc --help --language german
aitbc marketplace list --translate-to french
```
## 🔗 Blockchain Node (Brother Chain)
A minimal asset-backed blockchain that validates compute receipts and mints AIT tokens.
### ✅ Current Status
- **Chain ID**: `ait-devnet`
- **Consensus**: Proof-of-Authority (single proposer)
- **RPC Endpoint**: `http://localhost:8026/rpc`
- **Health Check**: `http://localhost:8026/health`
- **Metrics**: `http://localhost:8026/metrics` (Prometheus format)
- **Status**: 🟢 Operational and fully functional
### 🚀 Quick Launch
```bash
cd /opt/aitbc/apps/blockchain-node
source .venv/bin/activate
bash scripts/devnet_up.sh
```
The launch script starts:
- Proposer loop (block production)
- RPC API on port 8026
- Mock coordinator on port 8090 (for testing)
### 🛠️ CLI Interaction
```bash
# Check node status
aitbc blockchain status
# Get chain head
aitbc blockchain head
# Check balance
aitbc blockchain balance --address <your-address>
# Fund an address (devnet faucet)
aitbc blockchain faucet --address <your-address> --amount 1000
```
For full documentation, see: [`apps/blockchain-node/README.md`](./apps/blockchain-node/README.md)
## 🤖 Agent-First Computing
AITBC creates an ecosystem where AI agents are the primary participants:

View File

@@ -1,25 +1,169 @@
# Blockchain Node
# Blockchain Node (Brother Chain)
## Purpose & Scope
Minimal asset-backed blockchain node that validates compute receipts and mints AIT tokens as described in `docs/bootstrap/blockchain_node.md`.
Minimal asset-backed blockchain node that validates compute receipts and mints AIT tokens.
## Status
Scaffolded. Implementation pending per staged roadmap.
**Operational** — Core blockchain functionality implemented and running.
## Devnet Tooling
### Capabilities
- PoA consensus with single proposer (devnet)
- Transaction processing (TRANSFER, RECEIPT_CLAIM)
- Receipt validation and minting
- Gossip-based peer-to-peer networking (in-memory backend)
- RESTful RPC API (`/rpc/*`)
- Prometheus metrics (`/metrics`)
- Health check endpoint (`/health`)
- SQLite persistence with Alembic migrations
- `scripts/make_genesis.py` — Generate a deterministic devnet genesis file (`data/devnet/genesis.json`).
- `scripts/keygen.py` — Produce throwaway devnet keypairs (printed or written to disk).
- `scripts/devnet_up.sh` — Launch the blockchain node and RPC API with a freshly generated genesis file.
## Quickstart (Devnet)
### Quickstart
The blockchain node is already set up with a virtualenv. To launch:
```bash
cd apps/blockchain-node
python scripts/make_genesis.py --force
cd /opt/aitbc/apps/blockchain-node
source .venv/bin/activate
bash scripts/devnet_up.sh
```
The script sets `PYTHONPATH=src` and starts the proposer loop plus the FastAPI app (via `uvicorn`). Press `Ctrl+C` to stop the devnet.
This will:
1. Generate genesis block at `data/devnet/genesis.json`
2. Start the blockchain node proposer loop (PID logged)
3. Start RPC API on `http://127.0.0.1:8026`
4. Start mock coordinator on `http://127.0.0.1:8090`
Press `Ctrl+C` to stop all processes.
### Manual Startup
If you prefer to start components separately:
```bash
# Terminal 1: Blockchain node
cd /opt/aitbc/apps/blockchain-node
source .venv/bin/activate
PYTHONPATH=src python -m aitbc_chain.main
# Terminal 2: RPC API
cd /opt/aitbc/apps/blockchain-node
source .venv/bin/activate
PYTHONPATH=src uvicorn aitbc_chain.app:app --host 127.0.0.1 --port 8026
# Terminal 3: Mock coordinator (optional, for testing)
cd /opt/aitbc/apps/blockchain-node
source .venv/bin/activate
PYTHONPATH=src uvicorn mock_coordinator:app --host 127.0.0.1 --port 8090
```
## API Endpoints
Once running, the API listens on `http://127.0.0.1:8026`; blockchain RPC routes live under `/rpc`.
### Health & Metrics
- `GET /health` — Health check with node info
- `GET /metrics` — Prometheus-format metrics
### Blockchain Queries
- `GET /rpc/head` — Current chain head block
- `GET /rpc/blocks/{height}` — Get block by height
- `GET /rpc/blocks-range?start=0&end=10` — Get block range
- `GET /rpc/info` — Chain information
- `GET /rpc/supply` — Token supply info
- `GET /rpc/validators` — List validators
- `GET /rpc/state` — Full state dump
### Transactions
- `POST /rpc/sendTx` — Submit transaction (JSON body: `TransactionRequest`)
- `GET /rpc/transactions` — Latest transactions
- `GET /rpc/tx/{tx_hash}` — Get transaction by hash
- `POST /rpc/estimateFee` — Estimate fee for transaction type
### Receipts (Compute Proofs)
- `POST /rpc/submitReceipt` — Submit receipt claim
- `GET /rpc/receipts` — Latest receipts
- `GET /rpc/receipts/{receipt_id}` — Get receipt by ID
### Accounts
- `GET /rpc/getBalance/{address}` — Account balance
- `GET /rpc/address/{address}` — Address details + txs
- `GET /rpc/addresses` — List active addresses
### Admin
- `POST /rpc/admin/mintFaucet` — Mint devnet funds (requires admin key)
### Sync
- `GET /rpc/syncStatus` — Chain sync status
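A thin stdlib client over the endpoints above can be sketched as follows. The README does not specify response shapes, so the helpers simply return parsed JSON:

```python
import json
import urllib.request

RPC_BASE = "http://127.0.0.1:8026"  # devnet default

def rpc_url(path: str, base: str = RPC_BASE) -> str:
    # Build a /rpc/* URL, e.g. rpc_url("head") -> http://127.0.0.1:8026/rpc/head
    return f"{base}/rpc/{path.lstrip('/')}"

def rpc_get(path: str) -> dict:
    with urllib.request.urlopen(rpc_url(path)) as resp:
        return json.load(resp)

def rpc_post(path: str, payload: dict) -> dict:
    req = urllib.request.Request(
        rpc_url(path),
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

With the node running: `head = rpc_get("head")`, `fee = rpc_post("estimateFee", {...})`.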
## CLI Integration
Use the AITBC CLI to interact with the node:
```bash
source /opt/aitbc/cli/venv/bin/activate
aitbc blockchain status
aitbc blockchain head
aitbc blockchain balance --address <your-address>
aitbc blockchain faucet --address <your-address> --amount 1000
```
## Configuration
Edit `.env` in this directory to change:
```
CHAIN_ID=ait-devnet
DB_PATH=./data/chain.db
RPC_BIND_HOST=0.0.0.0
RPC_BIND_PORT=8026
P2P_BIND_HOST=0.0.0.0
P2P_BIND_PORT=7070
PROPOSER_KEY=proposer_key_<timestamp>
MINT_PER_UNIT=1000
COORDINATOR_RATIO=0.05
GOSSIP_BACKEND=memory
```
Restart the node after changes.
## Project Layout
```
blockchain-node/
├── src/aitbc_chain/
│ ├── app.py # FastAPI app + routes
│ ├── main.py # Proposer loop + startup
│ ├── config.py # Settings from .env
│ ├── database.py # DB init + session mgmt
│ ├── mempool.py # Transaction mempool
│ ├── gossip/ # P2P message bus
│ ├── consensus/ # PoA proposer logic
│ ├── rpc/ # RPC endpoints
│ ├── contracts/ # Smart contract logic
│ └── models.py # SQLModel definitions
├── data/
│ └── devnet/
│ └── genesis.json # Generated by make_genesis.py
├── scripts/
│ ├── make_genesis.py # Genesis generator
│ ├── devnet_up.sh # Devnet launcher
│ └── keygen.py # Keypair generator
└── .env # Node configuration
```
## Notes
- The node uses proof-of-authority (PoA) consensus with a single proposer for the devnet.
- Transactions require a valid signature (ed25519) unless running in test mode.
- Receipts represent compute work attestations and mint new AIT tokens to the miner.
- Gossip backend defaults to in-memory; for multi-node networks, configure a Redis backend.
- RPC API does not require authentication on devnet (add in production).
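The minting and fee parameters from the genesis `params` combine in a straightforward way. A sketch, assuming integer truncation for the coordinator's share (the node's actual rounding rule is not documented here):

```python
MINT_PER_UNIT = 1000       # AIT minted per validated compute unit (genesis params)
COORDINATOR_RATIO = 0.05   # coordinator's cut of each mint
BASE_FEE = 10
FEE_PER_BYTE = 1

def mint_split(units: int) -> tuple[int, int]:
    # Returns (miner_share, coordinator_share) for a validated receipt
    total = units * MINT_PER_UNIT
    coordinator = int(total * COORDINATOR_RATIO)
    return total - coordinator, coordinator

def estimate_fee(tx_size_bytes: int) -> int:
    # Flat base fee plus a per-byte charge
    return BASE_FEE + FEE_PER_BYTE * tx_size_bytes
```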
## Troubleshooting
**Port already in use:** Change `RPC_BIND_PORT` in `.env` and restart.
**Database locked:** Ensure only one node instance is running; delete `data/chain.db` if corrupted.
**No blocks proposed:** Check proposer logs; ensure `PROPOSER_KEY` is set and no other proposers are conflicting.
**Mock coordinator not responding:** It's only needed for certain tests; the blockchain node can run standalone.

View File

@@ -1,23 +0,0 @@
{
"accounts": [
{
"address": "ait1faucet000000000000000000000000000000000",
"balance": 1000000000,
"nonce": 0
}
],
"authorities": [
{
"address": "ait1devproposer000000000000000000000000000000",
"weight": 1
}
],
"chain_id": "ait-devnet",
"params": {
"base_fee": 10,
"coordinator_ratio": 0.05,
"fee_per_byte": 1,
"mint_per_unit": 1000
},
"timestamp": 1772895053
}
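This checked-in devnet genesis file is deleted in favor of `scripts/make_genesis.py`. A deterministic generator producing the same shape (values copied from the file above; the fixed-timestamp parameter stands in for whatever the real script does) might look like:

```python
import json
from pathlib import Path

def make_genesis(path: str, timestamp: int = 0) -> dict:
    # Field names and values mirror the deleted data/devnet/genesis.json
    genesis = {
        "accounts": [
            {"address": "ait1faucet000000000000000000000000000000000",
             "balance": 1000000000, "nonce": 0}
        ],
        "authorities": [
            {"address": "ait1devproposer000000000000000000000000000000", "weight": 1}
        ],
        "chain_id": "ait-devnet",
        "params": {"base_fee": 10, "coordinator_ratio": 0.05,
                   "fee_per_byte": 1, "mint_per_unit": 1000},
        "timestamp": timestamp,
    }
    # sort_keys makes the on-disk bytes deterministic for a given timestamp
    Path(path).write_text(json.dumps(genesis, indent=2, sort_keys=True))
    return genesis
```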

View File

@@ -591,28 +591,28 @@ def get_analytics_summary() -> Dict[str, Any]:
# Test function
async def test_advanced_analytics():
"""Test advanced analytics platform"""
print("📊 Testing Advanced Analytics Platform...")
logger.info("📊 Testing Advanced Analytics Platform...")
# Start monitoring
await start_analytics_monitoring(["BTC/USDT", "ETH/USDT"])
print("✅ Analytics monitoring started")
logger.info("✅ Analytics monitoring started")
# Let it run for a few seconds to generate data
await asyncio.sleep(5)
# Get dashboard data
dashboard = get_dashboard_data("BTC/USDT")
print(f"📈 Dashboard data: {len(dashboard)} fields")
logger.info(f"📈 Dashboard data: {len(dashboard)} fields")
# Get summary
summary = get_analytics_summary()
print(f"📊 Analytics summary: {summary}")
logger.info(f"📊 Analytics summary: {summary}")
# Stop monitoring
await stop_analytics_monitoring()
print("📊 Analytics monitoring stopped")
logger.info("📊 Analytics monitoring stopped")
print("🎉 Advanced Analytics test complete!")
logger.info("🎉 Advanced Analytics test complete!")
if __name__ == "__main__":
logging.basicConfig(level=logging.INFO)
asyncio.run(test_advanced_analytics())

View File

@@ -695,32 +695,32 @@ def analyze_behavior_patterns(user_id: str = None) -> Dict[str, Any]:
# Test function
async def test_ai_surveillance():
"""Test AI surveillance system"""
print("🤖 Testing AI Surveillance System...")
logger.info("🤖 Testing AI Surveillance System...")
# Start surveillance
await start_ai_surveillance(["BTC/USDT", "ETH/USDT"])
print("✅ AI surveillance started")
logger.info("✅ AI surveillance started")
# Let it run for data collection
await asyncio.sleep(5)
# Get summary
summary = get_surveillance_summary()
print(f"📊 Surveillance summary: {summary}")
logger.info(f"📊 Surveillance summary: {summary}")
# Get alerts
alerts = list_active_alerts()
print(f"🚨 Active alerts: {len(alerts)}")
logger.info(f"🚨 Active alerts: {len(alerts)}")
# Analyze patterns
patterns = analyze_behavior_patterns()
print(f"🔍 Behavior patterns: {patterns}")
logger.info(f"🔍 Behavior patterns: {patterns}")
# Stop surveillance
await stop_ai_surveillance()
print("🔍 AI surveillance stopped")
logger.info("🔍 AI surveillance stopped")
print("🎉 AI Surveillance test complete!")
logger.info("🎉 AI Surveillance test complete!")
if __name__ == "__main__":
logging.basicConfig(level=logging.INFO)
asyncio.run(test_ai_surveillance())

View File

@@ -609,27 +609,27 @@ def get_engine_status() -> Dict[str, Any]:
# Test function
async def test_ai_trading_engine():
"""Test AI trading engine"""
print("🤖 Testing AI Trading Engine...")
logger.info("🤖 Testing AI Trading Engine...")
# Initialize engine
await initialize_ai_engine()
# Train strategies
success = await train_strategies("BTC/USDT", 30)
print(f"✅ Training successful: {success}")
logger.info(f"✅ Training successful: {success}")
# Generate signals
signals = await generate_trading_signals("BTC/USDT")
print(f"📈 Generated {len(signals)} signals")
logger.info(f"📈 Generated {len(signals)} signals")
for signal in signals:
print(f" {signal['strategy']}: {signal['signal_type']} (confidence: {signal['confidence']:.2f})")
logger.info(f" {signal['strategy']}: {signal['signal_type']} (confidence: {signal['confidence']:.2f})")
# Get status
status = get_engine_status()
print(f"📊 Engine Status: {status}")
logger.info(f"📊 Engine Status: {status}")
print("🎉 AI Trading Engine test complete!")
logger.info("🎉 AI Trading Engine test complete!")
if __name__ == "__main__":
logging.basicConfig(level=logging.INFO)
asyncio.run(test_ai_trading_engine())

View File

@@ -1,3 +1,5 @@
#!/usr/bin/env python3
import logging
logger = logging.getLogger(__name__)
"""
Bitcoin Wallet Integration for AITBC Exchange
@@ -146,4 +148,4 @@ def get_wallet_info() -> Dict[str, any]:
if __name__ == "__main__":
# Test the wallet integration
info = get_wallet_info()
print(json.dumps(info, indent=2))
logging.basicConfig(level=logging.INFO)
logger.info(json.dumps(info, indent=2))

View File

@@ -1,5 +1,6 @@
from __future__ import annotations
import logging
import httpx
from collections import defaultdict, deque
from datetime import datetime
@@ -21,6 +22,8 @@ from ..schemas import (
JobState,
)
logger = logging.getLogger(__name__)
_STATUS_LABELS = {
JobState.queued: "Queued",
JobState.running: "Running",
@@ -81,7 +84,7 @@ class ExplorerService:
return BlockListResponse(items=items, next_offset=next_offset)
except Exception as e:
# Fallback to fake data if RPC is unavailable
print(f"Warning: Failed to fetch blocks from RPC: {e}, falling back to fake data")
logger.warning(f"Failed to fetch blocks from RPC: {e}, falling back to fake data")
statement = select(Job).order_by(Job.requested_at.desc())
jobs = self.session.execute(statement.offset(offset).limit(limit)).all()

View File

@@ -397,7 +397,7 @@ async def perform_aml_screening(user_id: str, user_data: Dict[str, Any]) -> Dict
# Test function
async def test_kyc_aml_integration():
"""Test KYC/AML integration"""
print("🧪 Testing KYC/AML Integration...")
logger.info("🧪 Testing KYC/AML Integration...")
# Test KYC submission
customer_data = {
@@ -408,17 +408,17 @@ async def test_kyc_aml_integration():
}
kyc_result = await submit_kyc_verification("user123", "chainalysis", customer_data)
print(f"✅ KYC Submitted: {kyc_result}")
logger.info(f"✅ KYC Submitted: {kyc_result}")
# Test KYC status check
kyc_status = await check_kyc_status(kyc_result["request_id"], "chainalysis")
print(f"📋 KYC Status: {kyc_status}")
logger.info(f"📋 KYC Status: {kyc_status}")
# Test AML screening
aml_result = await perform_aml_screening("user123", customer_data)
print(f"🔍 AML Screening: {aml_result}")
logger.info(f"🔍 AML Screening: {aml_result}")
print("🎉 KYC/AML integration test complete!")
logger.info("🎉 KYC/AML integration test complete!")
if __name__ == "__main__":
logging.basicConfig(level=logging.INFO)
asyncio.run(test_kyc_aml_integration())

View File

@@ -1,3 +1,5 @@
import logging
logger = logging.getLogger(__name__)
"""
Python 3.13.5 Optimized Services for AITBC Coordinator API
@@ -313,19 +315,19 @@ class ServiceFactory:
async def demonstrate_optimized_services():
"""Demonstrate optimized services usage"""
print("🚀 Python 3.13.5 Optimized Services Demo")
print("=" * 50)
logger.info("🚀 Python 3.13.5 Optimized Services Demo")
logger.info("=" * 50)
# This would be used in actual application code
print("\n✅ Services ready for Python 3.13.5 deployment:")
print(" - OptimizedJobService with batch processing")
print(" - OptimizedMinerService with enhanced validation")
print(" - SecurityEnhancedService with improved hashing")
print(" - PerformanceMonitor with real-time metrics")
print(" - Generic base classes with type safety")
print(" - @override decorators for method safety")
print(" - Enhanced error messages for debugging")
print(" - 5-10% performance improvements")
logger.info("\n✅ Services ready for Python 3.13.5 deployment:")
logger.info(" - OptimizedJobService with batch processing")
logger.info(" - OptimizedMinerService with enhanced validation")
logger.info(" - SecurityEnhancedService with improved hashing")
logger.info(" - PerformanceMonitor with real-time metrics")
logger.info(" - Generic base classes with type safety")
logger.info(" - @override decorators for method safety")
logger.info(" - Enhanced error messages for debugging")
logger.info(" - 5-10% performance improvements")
if __name__ == "__main__":
logging.basicConfig(level=logging.INFO)
asyncio.run(demonstrate_optimized_services())

View File

@@ -736,7 +736,7 @@ def list_reports(report_type: Optional[str] = None, status: Optional[str] = None
# Test function
async def test_regulatory_reporting():
"""Test regulatory reporting system"""
print("🧪 Testing Regulatory Reporting System...")
logger.info("🧪 Testing Regulatory Reporting System...")
# Test SAR generation
activities = [
@@ -755,20 +755,20 @@ async def test_regulatory_reporting():
]
sar_result = await generate_sar(activities)
print(f"✅ SAR Report Generated: {sar_result['report_id']}")
logger.info(f"✅ SAR Report Generated: {sar_result['report_id']}")
# Test compliance summary
compliance_result = await generate_compliance_summary(
"2026-01-01T00:00:00",
"2026-01-31T23:59:59"
)
print(f"✅ Compliance Summary Generated: {compliance_result['report_id']}")
logger.info(f"✅ Compliance Summary Generated: {compliance_result['report_id']}")
# List reports
reports = list_reports()
print(f"📋 Total Reports: {len(reports)}")
logger.info(f"📋 Total Reports: {len(reports)}")
print("🎉 Regulatory reporting test complete!")
logger.info("🎉 Regulatory reporting test complete!")
if __name__ == "__main__":
logging.basicConfig(level=logging.INFO)
asyncio.run(test_regulatory_reporting())

View File

@@ -516,28 +516,28 @@ def get_surveillance_summary() -> Dict[str, Any]:
# Test function
async def test_trading_surveillance():
"""Test trading surveillance system"""
print("🧪 Testing Trading Surveillance System...")
logger.info("🧪 Testing Trading Surveillance System...")
# Start monitoring
await start_surveillance(["BTC/USDT", "ETH/USDT"])
print("✅ Surveillance started")
logger.info("✅ Surveillance started")
# Let it run for a few seconds to generate alerts
await asyncio.sleep(5)
# Get alerts
alerts = get_alerts()
print(f"🚨 Generated {alerts['total']} alerts")
logger.info(f"🚨 Generated {alerts['total']} alerts")
# Get summary
summary = get_surveillance_summary()
print(f"📊 Alert Summary: {summary}")
logger.info(f"📊 Alert Summary: {summary}")
# Stop monitoring
await stop_surveillance()
print("🔍 Surveillance stopped")
logger.info("🔍 Surveillance stopped")
print("🎉 Trading surveillance test complete!")
logger.info("🎉 Trading surveillance test complete!")
if __name__ == "__main__":
logging.basicConfig(level=logging.INFO)
asyncio.run(test_trading_surveillance())

View File

@@ -10,29 +10,15 @@ import json
from typing import Optional, List, Dict, Any
from datetime import datetime, timedelta
# Import advanced analytics with robust path resolution
# Ensure coordinator-api src is on path for app.services imports
import os
import sys
_services_path = os.environ.get('AITBC_SERVICES_PATH')
if _services_path:
if os.path.isdir(_services_path):
if _services_path not in sys.path:
sys.path.insert(0, _services_path)
else:
print(f"Warning: AITBC_SERVICES_PATH set but not a directory: {_services_path}", file=sys.stderr)
else:
_project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..'))
_computed_services = os.path.join(_project_root, 'apps', 'coordinator-api', 'src', 'app', 'services')
if os.path.isdir(_computed_services) and _computed_services not in sys.path:
sys.path.insert(0, _computed_services)
else:
_fallback = '/home/oib/windsurf/aitbc/apps/coordinator-api/src/app/services'
if os.path.isdir(_fallback) and _fallback not in sys.path:
sys.path.insert(0, _fallback)
_src_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'apps', 'coordinator-api', 'src'))
if _src_path not in sys.path:
sys.path.insert(0, _src_path)
try:
from advanced_analytics import (
from app.services.advanced_analytics import (
start_analytics_monitoring, stop_analytics_monitoring, get_dashboard_data,
create_analytics_alert, get_analytics_summary, advanced_analytics,
MetricType, Timeframe
@@ -43,8 +29,8 @@ except ImportError as e:
def _missing(*args, **kwargs):
raise ImportError(
f"Required service module 'advanced_analytics' could not be imported: {_import_error}. "
"Ensure coordinator-api dependencies are installed or set AITBC_SERVICES_PATH."
f"Required service module 'app.services.advanced_analytics' could not be imported: {_import_error}. "
"Ensure coordinator-api dependencies are installed and the source directory is accessible."
)
start_analytics_monitoring = stop_analytics_monitoring = get_dashboard_data = _missing
create_analytics_alert = get_analytics_summary = _missing

View File

@@ -473,7 +473,7 @@ def monitor(ctx, realtime, interval):
live.update(generate_monitor_table())
time.sleep(interval)
except KeyboardInterrupt:
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
else:
# Single snapshot
overview = asyncio.run(comm.get_network_overview())

View File

@@ -10,29 +10,15 @@ import json
from typing import Optional, List, Dict, Any
from datetime import datetime
# Import AI surveillance system with robust path resolution
# Ensure coordinator-api src is on path for app.services imports
import os
import sys
_services_path = os.environ.get('AITBC_SERVICES_PATH')
if _services_path:
if os.path.isdir(_services_path):
if _services_path not in sys.path:
sys.path.insert(0, _services_path)
else:
print(f"Warning: AITBC_SERVICES_PATH set but not a directory: {_services_path}", file=sys.stderr)
else:
_project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..'))
_computed_services = os.path.join(_project_root, 'apps', 'coordinator-api', 'src', 'app', 'services')
if os.path.isdir(_computed_services) and _computed_services not in sys.path:
sys.path.insert(0, _computed_services)
else:
_fallback = '/home/oib/windsurf/aitbc/apps/coordinator-api/src/app/services'
if os.path.isdir(_fallback) and _fallback not in sys.path:
sys.path.insert(0, _fallback)
_src_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'apps', 'coordinator-api', 'src'))
if _src_path not in sys.path:
sys.path.insert(0, _src_path)
try:
from ai_surveillance import (
from app.services.ai_surveillance import (
start_ai_surveillance, stop_ai_surveillance, get_surveillance_summary,
get_user_risk_profile, list_active_alerts, analyze_behavior_patterns,
ai_surveillance, SurveillanceType, RiskLevel, AlertPriority
@@ -43,8 +29,8 @@ except ImportError as e:
def _missing(*args, **kwargs):
raise ImportError(
f"Required service module 'ai_surveillance' could not be imported: {_import_error}. "
"Ensure coordinator-api dependencies are installed or set AITBC_SERVICES_PATH."
f"Required service module 'app.services.ai_surveillance' could not be imported: {_import_error}. "
"Ensure coordinator-api dependencies are installed and the source directory is accessible."
)
start_ai_surveillance = stop_ai_surveillance = get_surveillance_summary = _missing
get_user_risk_profile = list_active_alerts = analyze_behavior_patterns = _missing

View File

@@ -10,29 +10,15 @@ import json
from typing import Optional, List, Dict, Any
from datetime import datetime, timedelta
# Import AI trading engine with robust path resolution
# Ensure coordinator-api src is on path for app.services imports
import os
import sys
_services_path = os.environ.get('AITBC_SERVICES_PATH')
if _services_path:
if os.path.isdir(_services_path):
if _services_path not in sys.path:
sys.path.insert(0, _services_path)
else:
print(f"Warning: AITBC_SERVICES_PATH set but not a directory: {_services_path}", file=sys.stderr)
else:
_project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..'))
_computed_services = os.path.join(_project_root, 'apps', 'coordinator-api', 'src', 'app', 'services')
if os.path.isdir(_computed_services) and _computed_services not in sys.path:
sys.path.insert(0, _computed_services)
else:
_fallback = '/home/oib/windsurf/aitbc/apps/coordinator-api/src/app/services'
if os.path.isdir(_fallback) and _fallback not in sys.path:
sys.path.insert(0, _fallback)
_src_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'apps', 'coordinator-api', 'src'))
if _src_path not in sys.path:
sys.path.insert(0, _src_path)
try:
from ai_trading_engine import (
from app.services.ai_trading_engine import (
initialize_ai_engine, train_strategies, generate_trading_signals,
get_engine_status, ai_trading_engine, TradingStrategy
)
@@ -42,8 +28,8 @@ except ImportError as e:
def _missing(*args, **kwargs):
raise ImportError(
f"Required service module 'ai_trading_engine' could not be imported: {_import_error}. "
"Ensure coordinator-api dependencies are installed or set AITBC_SERVICES_PATH."
f"Required service module 'app.services.ai_trading_engine' could not be imported: {_import_error}. "
"Ensure coordinator-api dependencies are installed and the source directory is accessible."
)
initialize_ai_engine = train_strategies = generate_trading_signals = get_engine_status = _missing
ai_trading_engine = None

View File

@@ -151,7 +151,7 @@ def monitor(ctx, realtime, interval, chain_id):
live.update(generate_monitor_table())
time.sleep(interval)
except KeyboardInterrupt:
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
else:
# Single snapshot
asyncio.run(analytics.collect_all_metrics())

View File

@@ -513,7 +513,7 @@ def monitor(ctx, chain_id, realtime, export, interval):
live.update(generate_monitor_layout())
time.sleep(interval)
except KeyboardInterrupt:
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
else:
# Single snapshot
import asyncio

View File

@@ -55,7 +55,7 @@ def rates(ctx, from_chain: Optional[str], to_chain: Optional[str],
if rate_table:
headers = ["From Chain", "To Chain", "Rate"]
print(tabulate(rate_table, headers=headers, tablefmt="grid"))
click.echo(tabulate(rate_table, headers=headers, tablefmt="grid"))
else:
output("No cross-chain rates available")
else:

View File

@@ -316,7 +316,7 @@ def monitor(ctx, deployment_id, interval):
live.update(generate_monitor_table())
time.sleep(interval)
except KeyboardInterrupt:
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
except Exception as e:
error(f"Error during monitoring: {str(e)}")

View File

@@ -10,41 +10,32 @@ import json
from typing import Optional, List, Dict, Any
from datetime import datetime
# Import enterprise integration services using importlib to avoid naming conflicts
import importlib.util
# Ensure coordinator-api src is on path for app.services imports
import os
import sys
_src_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'apps', 'coordinator-api', 'src'))
if _src_path not in sys.path:
sys.path.insert(0, _src_path)
_services_path = os.environ.get('AITBC_SERVICES_PATH')
if _services_path:
base_dir = _services_path
else:
_project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..'))
base_dir = os.path.join(_project_root, 'apps', 'coordinator-api', 'src', 'app', 'services')
if not os.path.isdir(base_dir):
base_dir = '/home/oib/windsurf/aitbc/apps/coordinator-api/src/app/services'
try:
from app.services.enterprise_integration import (
create_tenant, get_tenant_info, generate_api_key,
register_integration, get_system_status, list_tenants,
list_integrations
)
# Get EnterpriseAPIGateway if available
import app.services.enterprise_integration as ei_module
EnterpriseAPIGateway = getattr(ei_module, 'EnterpriseAPIGateway', None)
_import_error = None
except ImportError as e:
_import_error = e
module_path = os.path.join(base_dir, 'enterprise_integration.py')
if os.path.isfile(module_path):
spec = importlib.util.spec_from_file_location("enterprise_integration_service", module_path)
ei = importlib.util.module_from_spec(spec)
spec.loader.exec_module(ei)
create_tenant = ei.create_tenant
get_tenant_info = ei.get_tenant_info
generate_api_key = ei.generate_api_key
register_integration = ei.register_integration
get_system_status = ei.get_system_status
list_tenants = ei.list_tenants
list_integrations = ei.list_integrations
EnterpriseAPIGateway = getattr(ei, 'EnterpriseAPIGateway', None)
else:
# Provide stubs if module not found
def _missing(*args, **kwargs):
raise ImportError(
f"Could not load enterprise_integration.py from {module_path}. "
"Ensure coordinator-api services are available or set AITBC_SERVICES_PATH."
f"Required service module 'app.services.enterprise_integration' could not be imported: {_import_error}. "
"Ensure coordinator-api dependencies are installed and the source directory is accessible."
)
create_tenant = get_tenant_info = generate_api_key = _missing
register_integration = get_system_status = list_tenants = list_integrations = _missing
create_tenant = get_tenant_info = generate_api_key = register_integration = get_system_status = list_tenants = list_integrations = _missing
EnterpriseAPIGateway = None
@click.group()

View File

@@ -792,7 +792,7 @@ def status(ctx, exchange: Optional[str]):
if health.error_message:
error(f" Error: {health.error_message}")
print()
click.echo()
except ImportError:
error("❌ Real exchange integration not available. Install ccxt library.")

View File

@@ -463,7 +463,7 @@ def monitor(ctx, realtime, interval):
live.update(generate_monitor_table())
time.sleep(interval)
except KeyboardInterrupt:
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
else:
# Single snapshot
overview = asyncio.run(marketplace.get_marketplace_overview())

View File

@@ -33,7 +33,7 @@ def dashboard(ctx, refresh: int, duration: int):
console.clear()
console.rule("[bold blue]AITBC Dashboard[/bold blue]")
console.print(f"[dim]Refreshing every {refresh}s | Elapsed: {int(elapsed)}s[/dim]\n")
console.print(f"[dim]Refreshing every {refresh}s | Elapsed: {int(elapsed)}s[/dim]\n")
# Fetch system dashboard
try:
@@ -47,39 +47,39 @@ def dashboard(ctx, refresh: int, duration: int):
)
if resp.status_code == 200:
dashboard = resp.json()
console.print("[bold green]Dashboard Status:[/bold green] Online")
console.print("[bold green]Dashboard Status:[/bold green] Online")
# Overall status
overall_status = dashboard.get("overall_status", "unknown")
console.print(f" Overall Status: {overall_status}")
console.print(f" Overall Status: {overall_status}")
# Services summary
services = dashboard.get("services", {})
console.print(f" Services: {len(services)}")
console.print(f" Services: {len(services)}")
for service_name, service_data in services.items():
status = service_data.get("status", "unknown")
console.print(f" {service_name}: {status}")
console.print(f" {service_name}: {status}")
# Metrics summary
metrics = dashboard.get("metrics", {})
if metrics:
health_pct = metrics.get("health_percentage", 0)
console.print(f" Health: {health_pct:.1f}%")
console.print(f" Health: {health_pct:.1f}%")
else:
console.print(f"[bold yellow]Dashboard:[/bold yellow] HTTP {resp.status_code}")
console.print(f"[bold yellow]Dashboard:[/bold yellow] HTTP {resp.status_code}")
except Exception as e:
console.print(f"[bold red]Dashboard:[/bold red] Error - {e}")
console.print(f"[bold red]Dashboard:[/bold red] Error - {e}")
except Exception as e:
console.print(f"[red]Error fetching data: {e}[/red]")
console.print(f"[red]Error fetching data: {e}[/red]")
console.print(f"\n[dim]Press Ctrl+C to exit[/dim]")
console.print("\n[dim]Press Ctrl+C to exit[/dim]")
time.sleep(refresh)
except KeyboardInterrupt:
console.print("\n[bold]Dashboard stopped[/bold]")
console.print("\n[bold]Dashboard stopped[/bold]")
@monitor.command()

View File

@@ -97,7 +97,7 @@ def chains(ctx, show_private, node_id):
chains = await client.get_hosted_chains()
return [(nid, chain) for chain in chains]
except Exception as e:
print(f"Error getting chains from node {nid}: {e}")
click.echo(f"Error getting chains from node {nid}: {e}")
return []
tasks.append(get_chains_for_node(node_id, node_config))
@@ -328,7 +328,7 @@ def monitor(ctx, node_id, realtime, interval):
live.update(generate_monitor_layout())
time.sleep(interval)
except KeyboardInterrupt:
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
else:
# Single snapshot
node_info = asyncio.run(get_node_stats())

View File

@@ -10,31 +10,20 @@ import json
from typing import Optional, List, Dict, Any
from datetime import datetime, timedelta
# Import regulatory reporting system with robust path resolution
# Ensure coordinator-api src is on path for app.services imports
import os
import sys
_services_path = os.environ.get('AITBC_SERVICES_PATH')
if _services_path:
if os.path.isdir(_services_path):
if _services_path not in sys.path:
sys.path.insert(0, _services_path)
else:
print(f"Warning: AITBC_SERVICES_PATH set but not a directory: {_services_path}", file=sys.stderr)
else:
_project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..'))
_computed_services = os.path.join(_project_root, 'apps', 'coordinator-api', 'src', 'app', 'services')
if os.path.isdir(_computed_services) and _computed_services not in sys.path:
sys.path.insert(0, _computed_services)
else:
_fallback = '/home/oib/windsurf/aitbc/apps/coordinator-api/src/app/services'
if os.path.isdir(_fallback) and _fallback not in sys.path:
sys.path.insert(0, _fallback)
_src_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'apps', 'coordinator-api', 'src'))
if _src_path not in sys.path:
sys.path.insert(0, _src_path)
try:
from regulatory_reporting import (
generate_sar, generate_compliance_summary, list_reports,
regulatory_reporter, ReportType, ReportStatus, RegulatoryBody
from app.services.regulatory_reporting import (
generate_sar as generate_sar_svc,
generate_compliance_summary as generate_compliance_summary_svc,
list_reports,
regulatory_reporter,
ReportType, ReportStatus, RegulatoryBody
)
_import_error = None
except ImportError as e:
@@ -42,8 +31,8 @@ except ImportError as e:
def _missing(*args, **kwargs):
raise ImportError(
f"Required service module 'regulatory_reporting' could not be imported: {_import_error}. "
"Ensure coordinator-api dependencies are installed or set AITBC_SERVICES_PATH."
f"Required service module 'app.services.regulatory_reporting' could not be imported: {_import_error}. "
"Ensure coordinator-api dependencies are installed and the source directory is accessible."
)
generate_sar_svc = generate_compliance_summary_svc = list_reports = regulatory_reporter = _missing
@@ -91,7 +80,7 @@ def generate_sar(ctx, user_id: str, activity_type: str, amount: float, descripti
}
# Generate SAR
result = asyncio.run(generate_sar([activity]))
result = asyncio.run(generate_sar_svc([activity]))
click.echo(f"\n✅ SAR Report Generated Successfully!")
click.echo(f"📋 Report ID: {result['report_id']}")
@@ -124,7 +113,7 @@ def compliance_summary(ctx, period_start: str, period_end: str):
click.echo(f"📈 Duration: {(end_date - start_date).days} days")
# Generate compliance summary
result = asyncio.run(generate_compliance_summary(
result = asyncio.run(generate_compliance_summary_svc(
start_date.isoformat(),
end_date.isoformat()
))

View File

@@ -10,33 +10,16 @@ import json
from typing import Optional, List, Dict, Any
from datetime import datetime, timedelta
# Import surveillance system with robust path resolution
# Ensure coordinator-api src is on path for app.services imports
import os
import sys
# Determine services path: use AITBC_SERVICES_PATH if set, else compute relative to repo layout
_services_path = os.environ.get('AITBC_SERVICES_PATH')
if _services_path:
if os.path.isdir(_services_path):
if _services_path not in sys.path:
sys.path.insert(0, _services_path)
else:
print(f"Warning: AITBC_SERVICES_PATH set but not a directory: {_services_path}", file=sys.stderr)
else:
# Compute project root relative to this file: cli/aitbc_cli/commands -> 3 levels up to project root
_project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..'))
_computed_services = os.path.join(_project_root, 'apps', 'coordinator-api', 'src', 'app', 'services')
if os.path.isdir(_computed_services) and _computed_services not in sys.path:
sys.path.insert(0, _computed_services)
else:
# Fallback to known hardcoded path if it exists (for legacy deployments)
_fallback = '/home/oib/windsurf/aitbc/apps/coordinator-api/src/app/services'
if os.path.isdir(_fallback) and _fallback not in sys.path:
sys.path.insert(0, _fallback)
_src_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..', '..', 'apps', 'coordinator-api', 'src'))
if _src_path not in sys.path:
sys.path.insert(0, _src_path)
try:
from trading_surveillance import (
start_surveillance, stop_surveillance, get_alerts,
from app.services.trading_surveillance import (
start_surveillance, stop_surveillance, get_alerts,
get_surveillance_summary, AlertLevel
)
_import_error = None
@@ -45,8 +28,8 @@ except ImportError as e:
def _missing(*args, **kwargs):
raise ImportError(
f"Required service module 'trading_surveillance' could not be imported: {_import_error}. "
"Ensure coordinator-api dependencies are installed or set AITBC_SERVICES_PATH."
f"Required service module 'app.services.trading_surveillance' could not be imported: {_import_error}. "
"Ensure coordinator-api dependencies are installed and the source directory is accessible."
)
start_surveillance = stop_surveillance = get_alerts = get_surveillance_summary = _missing
@@ -237,7 +220,7 @@ def resolve(ctx, alert_id: str, resolution: str):
click.echo(f"🔍 Resolving alert: {alert_id}")
# Import surveillance to access resolve function
from trading_surveillance import surveillance
from app.services.trading_surveillance import surveillance
success = surveillance.resolve_alert(alert_id, resolution)
@@ -263,7 +246,7 @@ def test(ctx, symbols: str, duration: int):
click.echo(f"⏱️ Duration: {duration} seconds")
# Import test function
from trading_surveillance import test_trading_surveillance
from app.services.trading_surveillance import test_trading_surveillance
# Run test
asyncio.run(test_trading_surveillance())
@@ -289,7 +272,7 @@ def test(ctx, symbols: str, duration: int):
def status(ctx):
"""Show current surveillance status"""
try:
from trading_surveillance import surveillance
from app.services.trading_surveillance import surveillance
click.echo(f"📊 Trading Surveillance Status")

138
scripts/claim-task.py Normal file
View File

@@ -0,0 +1,138 @@
#!/usr/bin/env python3
import os, sys, json, subprocess, random, time, logging
from datetime import datetime
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
logger = logging.getLogger(__name__)
TOKEN = os.getenv("GITEA_TOKEN")
if not TOKEN:
sys.exit("GITEA_TOKEN environment variable is required; tokens must not be hardcoded")
API_BASE = os.getenv("GITEA_API_BASE", "http://gitea.bubuit.net:3000/api/v1")
REPO = "oib/aitbc"
REPO_DIR = "/opt/aitbc"  # local checkout used by the git helpers below
AGENT = os.getenv("AGENT_NAME", "aitbc")
def log(msg):
logger.info(msg)
def die(msg):
log(f"FATAL: {msg}")
sys.exit(1)
def api_get(path):
cmd = ["curl", "-s", "-H", f"Authorization: token {TOKEN}", f"{API_BASE}/{path}"]
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode != 0:
return None
try:
return json.loads(result.stdout)
except json.JSONDecodeError:
return None
def api_post(path, payload):
cmd = ["curl", "-s", "-X", "POST", "-H", f"Authorization: token {TOKEN}", "-H", "Content-Type: application/json",
f"{API_BASE}/{path}", "-d", json.dumps(payload)]
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode != 0:
return None
try:
return json.loads(result.stdout)
except json.JSONDecodeError:
return None
def compute_utility(issue):
score = 0
labels = [l['name'] for l in issue['labels']]
if 'security' in labels:
score += 100
elif 'bug' in labels:
score += 50
elif 'feature' in labels:
score += 30
elif 'refactor' in labels:
score += 10
if 'good-first-task-for-agent' in labels:
score += 20
if any(tag in labels for tag in ['needs-design', 'blocked', 'needs-reproduction']):
score -= 1000
score += issue['comments'] * 5
return score
def get_open_unassigned_issues():
items = api_get(f"repos/{REPO}/issues?state=open") or []
candidates = []
for i in items:
# Skip PRs: pull_request field is non-null for PRs
if i.get('pull_request') is not None:
continue
if i['assignee'] is not None:
continue
labels = [l['name'] for l in i['labels']]
if any(tag in labels for tag in ['needs-design', 'blocked', 'needs-reproduction']):
continue
candidates.append(i)
return candidates
def create_claim_branch(issue_number):
branch = f"claim/{issue_number}"
subprocess.run(["git", "fetch", "origin"], capture_output=True, cwd=REPO_DIR)
subprocess.run(["git", "checkout", "-B", branch, "origin/main"], capture_output=True, cwd=REPO_DIR)
subprocess.run(["git", "commit", "--allow-empty", "-m", f"Claim issue #{issue_number} for {AGENT}"], capture_output=True, cwd=REPO_DIR)
r = subprocess.run(["git", "push", "-u", "origin", branch], capture_output=True, cwd=REPO_DIR)
return r.returncode == 0
def assign_issue(issue_number):
payload = {"assignee": AGENT}
result = api_post(f"repos/{REPO}/issues/{issue_number}/assignees", payload)
return result is not None
def add_comment(issue_number, body):
payload = {"body": body}
return api_post(f"repos/{REPO}/issues/{issue_number}/comments", payload) is not None
def main():
# Jitter 0-60s
time.sleep(random.randint(0, 60))
log("Claim task cycle starting...")
state_file = os.path.join(REPO_DIR, ".claim-state.json")
try:
with open(state_file) as f:
state = json.load(f)
except (FileNotFoundError, json.JSONDecodeError):
state = {}
if state.get('current_claim'):
log(f"Already working on issue #{state['current_claim']} (branch {state.get('work_branch')})")
return
issues = get_open_unassigned_issues()
if not issues:
log("No unassigned issues available.")
return
issues.sort(key=lambda i: compute_utility(i), reverse=True)
for issue in issues:
num = issue['number']
title = issue['title']
labels = [lbl['name'] for lbl in issue.get('labels', [])]
log(f"Attempting to claim issue #{num}: {title} (labels={labels})")
if create_claim_branch(num):
assign_issue(num)
slug = ''.join(c if c.isalnum() else '-' for c in title.lower())[:40].strip('-')
work_branch = f"{AGENT}/{num}-{slug}"
subprocess.run(["git", "checkout", "-b", work_branch, "origin/main"], cwd=REPO_DIR, capture_output=True)
state = {
'current_claim': num,
'claim_branch': f'claim/{num}',
'work_branch': work_branch,
'claimed_at': datetime.utcnow().isoformat() + 'Z',
'issue_title': title,
'labels': labels
}
with open(state_file, 'w') as f:
json.dump(state, f, indent=2)
add_comment(num, f"Agent `{AGENT}` claiming this task. (automated)")
log(f"✅ Claimed issue #{num}. Work branch: {work_branch}")
return
else:
log(f"Claim failed for #{num} (branch exists). Trying next...")
log("Could not claim any issue; all taken or unavailable.")
if __name__ == "__main__":
REPO_DIR = '/opt/aitbc'
main()
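The utility scoring above can be exercised in isolation. A minimal sketch (the scoring rules are copied from `scripts/claim-task.py`; the sample `issue` dicts below are made up, mirroring the shape of Gitea's issue API response):

```python
def compute_utility(issue):
    # Same rules as claim-task.py: label-based base score, bonus for
    # agent-friendly tasks, a hard penalty for blocked work, and an
    # engagement weight of 5 points per comment.
    score = 0
    labels = [l["name"] for l in issue["labels"]]
    if "security" in labels:
        score += 100
    elif "bug" in labels:
        score += 50
    elif "feature" in labels:
        score += 30
    elif "refactor" in labels:
        score += 10
    if "good-first-task-for-agent" in labels:
        score += 20
    if any(t in labels for t in ["needs-design", "blocked", "needs-reproduction"]):
        score -= 1000
    score += issue["comments"] * 5
    return score

bug = {"labels": [{"name": "bug"}, {"name": "good-first-task-for-agent"}], "comments": 3}
blocked = {"labels": [{"name": "security"}, {"name": "blocked"}], "comments": 0}
print(compute_utility(bug))      # 50 + 20 + 3*5 = 85
print(compute_utility(blocked))  # 100 - 1000 = -900
```

The -1000 penalty guarantees that blocked issues sort below every claimable one, even a high-value security task.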

139
scripts/monitor-prs.py Normal file
View File

@@ -0,0 +1,139 @@
#!/usr/bin/env python3
import os, sys, json, subprocess, tempfile, shutil, re, time, random, logging
from datetime import datetime
TOKEN = os.getenv("GITEA_TOKEN")
if not TOKEN:
sys.exit("GITEA_TOKEN environment variable is required; tokens must not be hardcoded")
API_BASE = os.getenv("GITEA_API_BASE", "http://gitea.bubuit.net:3000/api/v1")
REPO = "oib/aitbc"
AGENT = os.getenv("AGENT_NAME", "aitbc")
OTHER = "aitbc1" if AGENT == "aitbc" else "aitbc"
RING_PREFIXES = [
(0, ["packages/py/aitbc-core", "packages/py/aitbc-sdk"]),
(1, ["apps/"]),
(2, ["cli/", "analytics/", "tools/"]),
]
RING_THRESHOLD = {0: 0.9, 1: 0.8, 2: 0.7, 3: 0.5}
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
logger = logging.getLogger(__name__)
def log(msg):
logger.info(msg)
def api_get(path):
cmd = ["curl", "-s", "-H", f"Authorization: token {TOKEN}", f"{API_BASE}/{path}"]
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode != 0:
return None
try:
return json.loads(result.stdout)
except json.JSONDecodeError:
return None
def api_post(path, payload):
cmd = ["curl", "-s", "-X", "POST", "-H", f"Authorization: token {TOKEN}", "-H", "Content-Type: application/json",
f"{API_BASE}/{path}", "-d", json.dumps(payload)]
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode != 0:
return None
try:
return json.loads(result.stdout)
except json.JSONDecodeError:
return None
def get_open_prs():
return api_get(f"repos/{REPO}/pulls?state=open") or []
def get_my_reviews(pr_number):
return api_get(f"repos/{REPO}/pulls/{pr_number}/reviews") or []
def is_test_file(path):
return '/tests/' in path or path.startswith('tests/') or path.endswith('_test.py')
def detect_ring(base_sha, head_sha):
try:
output = subprocess.run(
["git", "diff", "--name-only", base_sha, head_sha],
capture_output=True, text=True, check=True, cwd='/opt/aitbc'
).stdout
files = [f.strip() for f in output.splitlines() if f.strip()]
except subprocess.CalledProcessError:
files = []
if files and all(is_test_file(f) for f in files):
return 3
for ring, prefixes in sorted(RING_PREFIXES, key=lambda x: x[0]):
for p in files:
if any(p.startswith(prefix) for prefix in prefixes):
return ring
return 3
def syntax_check(worktree):
py_files = []
for root, dirs, files in os.walk(worktree):
for f in files:
if f.endswith('.py'):
py_files.append(os.path.join(root, f))
for f in py_files:
r = subprocess.run([sys.executable, '-m', 'py_compile', f], capture_output=True)
if r.returncode != 0:
return False, f"Syntax error in {os.path.relpath(f, worktree)}"
return True, ""
def post_review(pr_number, event, body):
payload = {"event": event, "body": body}
return api_post(f"repos/{REPO}/pulls/{pr_number}/reviews", payload) is not None
def request_review(pr_number, reviewer):
payload = {"reviewers": [reviewer]}
return api_post(f"repos/{REPO}/pulls/{pr_number}/requested-reviewers", payload) is not None
def main():
# Jitter 0-60s
time.sleep(random.randint(0, 60))
log("Fetching open PRs...")
prs = get_open_prs()
if not prs:
log("No open PRs")
return
# Process sibling PRs
for pr in prs:
if pr['user']['login'] != OTHER:
continue
number = pr['number']
title = pr['title'][:50]
log(f"Reviewing sibling PR #{number}: {title}")
# Checkout and validate
tmp = tempfile.mkdtemp(prefix="aitbc_monitor_")
try:
subprocess.run(["git", "clone", "--no-checkout", "/opt/aitbc", tmp], capture_output=True, check=True)  # "origin" is a remote name, not a clone source; clone the local checkout
wt = os.path.join(tmp, "wt")
os.makedirs(wt)
subprocess.run(["git", "--git-dir", os.path.join(tmp, ".git"), "--work-tree", wt, "fetch", "origin", pr['head']['ref']], capture_output=True, check=True)
subprocess.run(["git", "--git-dir", os.path.join(tmp, ".git"), "--work-tree", wt, "checkout", "FETCH_HEAD"], capture_output=True, check=True)
ok, err = syntax_check(wt)
if not ok:
post_review(number, "REQUEST_CHANGES", f"❌ Syntax error: {err}")
log(f"PR #{number} failed syntax: {err}")
continue
ring = detect_ring(pr['base']['sha'], pr['head']['sha'])
threshold = RING_THRESHOLD.get(ring, 0.5)
if ring == 0:
post_review(number, "REQUEST_CHANGES", f"Ring 0 (core): needs manual review. Threshold: >{threshold}")
log(f"PR #{number} is Ring0; manual review required")
else:
post_review(number, "APPROVE", f"✅ Auto-approved (ring {ring}, threshold >{threshold})")
log(f"PR #{number} approved")
finally:
shutil.rmtree(tmp, ignore_errors=True)
# Our own PRs: request review from OTHER if not yet
our_prs = [pr for pr in prs if pr['user']['login'] == AGENT]
for pr in our_prs:
number = pr['number']
reviews = get_my_reviews(number)
if not any(r['user']['login'] == OTHER for r in reviews):
log(f"Requesting review from {OTHER} on our PR #{number}")
request_review(number, OTHER)
if __name__ == "__main__":
main()
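The ring classification in `detect_ring` depends only on the list of changed file paths, so it can be sketched without git. The prefixes and test-file heuristic below are copied from the script; `classify` is a standalone rename of the path-matching core, and the file lists are hypothetical:

```python
RING_PREFIXES = [
    (0, ["packages/py/aitbc-core", "packages/py/aitbc-sdk"]),
    (1, ["apps/"]),
    (2, ["cli/", "analytics/", "tools/"]),
]

def is_test_file(path):
    return "/tests/" in path or path.startswith("tests/") or path.endswith("_test.py")

def classify(files):
    # Test-only diffs get the most permissive ring (3); otherwise the
    # lowest-numbered (strictest) ring whose prefix matches any file wins.
    if files and all(is_test_file(f) for f in files):
        return 3
    for ring, prefixes in sorted(RING_PREFIXES, key=lambda x: x[0]):
        for f in files:
            if any(f.startswith(p) for p in prefixes):
                return ring
    return 3

print(classify(["packages/py/aitbc-core/chain.py", "cli/main.py"]))  # 0
print(classify(["cli/aitbc_cli/commands/wallet.py"]))                # 2
print(classify(["tests/test_wallet.py"]))                            # 3
```

Because rings are checked in ascending order, a PR touching both core and CLI code is held to the core (Ring 0) bar.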

120
scripts/qa-cycle.py Executable file
View File

@@ -0,0 +1,120 @@
#!/usr/bin/env python3
import os, sys, json, subprocess, tempfile, shutil, glob, re
import random, time, logging
from datetime import datetime
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
logger = logging.getLogger(__name__)
AGENT = os.getenv("AGENT_NAME", "aitbc")
LOG_DIR = "/var/log/aitbc"
QA_LOG = os.path.join(LOG_DIR, "qa_cycle.log")
def run_cmd(cmd, capture=True, check=False):
result = subprocess.run(cmd, shell=True, capture_output=capture, text=True)
if check and result.returncode != 0:
logger.error(f"Command failed: {cmd}")
logger.error(f"stderr: {result.stderr}")
raise RuntimeError(f"command failed: {cmd}")
return result.stdout if capture else result.returncode
def feature_tests():
logger.info("=== FEATURE TESTS ===")
for mod in ["aiohttp", "fastapi", "click", "pydantic"]:
cmd = f"python3 -c 'import {mod}; print({mod}.__version__)'"
out = run_cmd(cmd)
if not out:  # a failed import leaves stdout empty
logger.error(f"Import failed: {mod}")
else:
logger.info(f"{mod} import OK: {out.strip()}")
out = run_cmd("/opt/aitbc/cli_venv/bin/aitbc --help")
if out:
logger.info("CLI help works")
else:
logger.error("CLI help failed")
out = run_cmd("curl -s http://localhost:8010/health || true")
if out and 'ok' in out.lower():
logger.info("Brother chain health endpoint OK")
else:
logger.warning("Brother chain health check inconclusive")
def bug_sentinel():
logger.info("=== BUG SENTINEL ===")
logs = glob.glob("/var/log/aitbc/*.log")
patterns = ["ERROR", "CRITICAL", "FATAL", "Traceback", "Exception"]
found = []
for logfile in logs:
try:
with open(logfile) as f:
lines = f.readlines()[-200:]
except OSError:
continue
for line in lines:
if any(p in line for p in patterns):
found.append(f"{logfile}: {line.strip()}")
if found:
logger.warning(f"Found {len(found)} error patterns (sample):")
for item in found[:5]:
logger.warning(f" {item}")
else:
logger.info("No error patterns in recent logs")
def code_review():
logger.info("=== CODE REVIEW ===")
py_files = run_cmd("find /opt/aitbc -name '*.py' -type f | head -30").splitlines()
issues = []
for f in py_files:
try:
with open(f) as fp:
content = fp.read()
except OSError:
continue
if re.search(r'except\s*:', content):
issues.append(f"{f}: bare except")
if re.search(r'def\s+\w+\s*\([^)]*=\s*[\[\{\}]', content):
issues.append(f"{f}: mutable default argument")
if 'print(' in content and 'if __name__' not in content:
issues.append(f"{f}: print statement in library code")
if re.search(r'(password|secret|key)\s*=\s*[\'"][^\'"]+[\'"]', content, re.IGNORECASE):
issues.append(f"{f}: possible hardcoded secret")
if issues:
logger.warning(f"Found {len(issues)} code quality issues (sample):")
for i in issues[:5]:
logger.warning(f" {i}")
else:
logger.info("No obvious code quality issues detected")
def scenario_runner():
logger.info("=== SCENARIO RUNNER ===")
tmp = tempfile.mkdtemp(prefix="aitbc_qa_")
try:
run_cmd(f"git init {tmp}", check=True)
run_cmd(f"git -C {tmp} config user.email 'qa@aitbc'", check=True)
run_cmd(f"git -C {tmp} config user.name 'QA Agent'", check=True)
run_cmd(f"echo 'test' > {tmp}/test.txt", check=True)
run_cmd(f"git -C {tmp} add .", check=True)
run_cmd(f"git -C {tmp} commit -m 'initial'", check=True)
run_cmd(f"git -C {tmp} checkout -b claim/123", check=True)
run_cmd(f"git -C {tmp} commit --allow-empty -m 'Claim issue 123'", check=True)
logger.info("Git workflow test passed")
except Exception as e:
logger.error(f"Git workflow test failed: {e}")
finally:
shutil.rmtree(tmp, ignore_errors=True)
def main():
logger.info(f"QA CYCLE STARTED for {AGENT}")
# Jitter: 0-900 seconds (15 minutes), taken before timing starts
jitter = random.randint(0, 900)
logger.info(f"Sleeping {jitter}s before start...")
time.sleep(jitter)
start = datetime.now()
feature_tests()
bug_sentinel()
code_review()
scenario_runner()
elapsed = datetime.now() - start
logger.info(f"QA CYCLE COMPLETED in {elapsed.total_seconds():.1f}s")
if __name__ == "__main__":
main()
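All three scripts open the same way: sleep a uniform random delay before doing any work, so agents sharing a cron schedule do not hit the Gitea API at the same instant ("full jitter"). A minimal sketch of the pattern, with the sleep injectable so it can be tested without waiting (`jittered_start` is a name invented here, not part of the scripts):

```python
import random
import time

def jittered_start(max_delay_s, rng=random, sleep=time.sleep):
    # Uniform ("full") jitter: each agent waits 0..max_delay_s seconds,
    # decorrelating cron jobs that all fire on the same minute boundary.
    delay = rng.randint(0, max_delay_s)
    sleep(delay)
    return delay

# In tests, inject a no-op sleep to avoid actually waiting:
waited = jittered_start(900, sleep=lambda s: None)
print(0 <= waited <= 900)  # True
```

The window should stay well under the cron interval (e.g. 0-900s for a cycle that runs at most hourly), or runs can overlap.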

21
setup-cron.sh Executable file
View File

@@ -0,0 +1,21 @@
#!/usr/bin/env bash
set -e
LOG_DIR="/var/log/aitbc"
mkdir -p "$LOG_DIR"
chown $(whoami):$(whoami) "$LOG_DIR" 2>/dev/null || true
CRON_TMP=$(mktemp)
cat > "$CRON_TMP" <<EOF
AGENT_NAME=$(whoami)
PATH=/usr/local/bin:/usr/bin:/bin
# bash is needed so the $RANDOM jitter below actually expands
SHELL=/bin/bash
# Expanded at install time from the caller's environment; never commit tokens
GITEA_TOKEN=${GITEA_TOKEN:?GITEA_TOKEN must be set when running setup-cron.sh}
*/10 * * * * sleep \$((RANDOM % 31)); /opt/aitbc/scripts/auto_review.py >> "$LOG_DIR/auto_review.log" 2>&1
0,30 * * * * sleep \$((RANDOM % 16)); /opt/aitbc/scripts/claim-task.py >> "$LOG_DIR/claim_task.log" 2>&1
EOF
crontab "$CRON_TMP"
rm "$CRON_TMP"
echo "Cron installed for $(whoami)"
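One caveat for the jitter entries: `$RANDOM` is a bash/ksh feature, and cron runs commands with `/bin/sh` unless `SHELL` is set in the crontab. Under a POSIX sh such as dash, the unset `RANDOM` evaluates to 0 in arithmetic expansion, so the jitter silently becomes `sleep 0`. A quick check of the bash behavior (assumes `bash` is installed):

```python
import subprocess

# Under bash, $RANDOM yields 0..32767, so `$((RANDOM % 31))` is a real 0..30s delay.
out = subprocess.run(
    ["bash", "-c", "echo $((RANDOM % 31))"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(0 <= int(out) <= 30)  # True
```

Setting `SHELL=/bin/bash` in the crontab environment (or replacing `$RANDOM` with `shuf -i 0-30 -n 1`) keeps the delay effective regardless of the system's default sh.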