chore: replace print() with proper logging; add jitter to automation
Some checks failed
AITBC CI/CD Pipeline / lint-and-test (3.11) (pull_request) Has been cancelled
AITBC CI/CD Pipeline / lint-and-test (3.12) (pull_request) Has been cancelled
AITBC CI/CD Pipeline / lint-and-test (3.13) (pull_request) Has been cancelled
AITBC CLI Level 1 Commands Test / test-cli-level1 (3.11) (pull_request) Has been cancelled
AITBC CLI Level 1 Commands Test / test-cli-level1 (3.12) (pull_request) Has been cancelled
AITBC CLI Level 1 Commands Test / test-cli-level1 (3.13) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (apps/coordinator-api/src) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (cli/aitbc_cli) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (packages/py/aitbc-core/src) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (packages/py/aitbc-crypto/src) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (packages/py/aitbc-sdk/src) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (tests) (pull_request) Has been cancelled
Security Scanning / CodeQL Security Analysis (javascript) (pull_request) Has been cancelled
Security Scanning / CodeQL Security Analysis (python) (pull_request) Has been cancelled
Security Scanning / Dependency Security Scan (pull_request) Has been cancelled
Security Scanning / Container Security Scan (pull_request) Has been cancelled
Security Scanning / OSSF Scorecard (pull_request) Has been cancelled
AITBC CI/CD Pipeline / test-cli (pull_request) Has been cancelled
AITBC CI/CD Pipeline / test-services (pull_request) Has been cancelled
AITBC CI/CD Pipeline / test-production-services (pull_request) Has been cancelled
AITBC CI/CD Pipeline / security-scan (pull_request) Has been cancelled
AITBC CI/CD Pipeline / build (pull_request) Has been cancelled
AITBC CI/CD Pipeline / deploy-staging (pull_request) Has been cancelled
AITBC CI/CD Pipeline / deploy-production (pull_request) Has been cancelled
AITBC CI/CD Pipeline / performance-test (pull_request) Has been cancelled
AITBC CI/CD Pipeline / docs (pull_request) Has been cancelled
AITBC CI/CD Pipeline / release (pull_request) Has been cancelled
AITBC CI/CD Pipeline / notify (pull_request) Has been cancelled
AITBC CLI Level 1 Commands Test / test-summary (pull_request) Has been cancelled
Security Scanning / Security Summary Report (pull_request) Has been cancelled
Some checks failed
AITBC CI/CD Pipeline / lint-and-test (3.11) (pull_request) Has been cancelled
AITBC CI/CD Pipeline / lint-and-test (3.12) (pull_request) Has been cancelled
AITBC CI/CD Pipeline / lint-and-test (3.13) (pull_request) Has been cancelled
AITBC CLI Level 1 Commands Test / test-cli-level1 (3.11) (pull_request) Has been cancelled
AITBC CLI Level 1 Commands Test / test-cli-level1 (3.12) (pull_request) Has been cancelled
AITBC CLI Level 1 Commands Test / test-cli-level1 (3.13) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (apps/coordinator-api/src) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (cli/aitbc_cli) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (packages/py/aitbc-core/src) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (packages/py/aitbc-crypto/src) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (packages/py/aitbc-sdk/src) (pull_request) Has been cancelled
Security Scanning / Bandit Security Scan (tests) (pull_request) Has been cancelled
Security Scanning / CodeQL Security Analysis (javascript) (pull_request) Has been cancelled
Security Scanning / CodeQL Security Analysis (python) (pull_request) Has been cancelled
Security Scanning / Dependency Security Scan (pull_request) Has been cancelled
Security Scanning / Container Security Scan (pull_request) Has been cancelled
Security Scanning / OSSF Scorecard (pull_request) Has been cancelled
AITBC CI/CD Pipeline / test-cli (pull_request) Has been cancelled
AITBC CI/CD Pipeline / test-services (pull_request) Has been cancelled
AITBC CI/CD Pipeline / test-production-services (pull_request) Has been cancelled
AITBC CI/CD Pipeline / security-scan (pull_request) Has been cancelled
AITBC CI/CD Pipeline / build (pull_request) Has been cancelled
AITBC CI/CD Pipeline / deploy-staging (pull_request) Has been cancelled
AITBC CI/CD Pipeline / deploy-production (pull_request) Has been cancelled
AITBC CI/CD Pipeline / performance-test (pull_request) Has been cancelled
AITBC CI/CD Pipeline / docs (pull_request) Has been cancelled
AITBC CI/CD Pipeline / release (pull_request) Has been cancelled
AITBC CI/CD Pipeline / notify (pull_request) Has been cancelled
AITBC CLI Level 1 Commands Test / test-summary (pull_request) Has been cancelled
Security Scanning / Security Summary Report (pull_request) Has been cancelled
- CLI commands: replace print with click.echo (ensures proper stdout handling) - Coordinator API services: add logging import and logger; replace print with logger.info - Automation scripts: claim-task.py, monitor-prs.py, qa-cycle.py now use logging and have random jitter at startup - Also includes fix for name shadowing in regulatory.py (aliased service imports) which was pending This addresses issue #23 (print statements) and improves error handling. Note: Many bare except clauses (issue #20) remain; will be addressed separately.
This commit is contained in:
@@ -591,28 +591,28 @@ def get_analytics_summary() -> Dict[str, Any]:
|
||||
# Test function
|
||||
async def test_advanced_analytics():
|
||||
"""Test advanced analytics platform"""
|
||||
print("📊 Testing Advanced Analytics Platform...")
|
||||
logger.info("📊 Testing Advanced Analytics Platform...")
|
||||
|
||||
# Start monitoring
|
||||
await start_analytics_monitoring(["BTC/USDT", "ETH/USDT"])
|
||||
print("✅ Analytics monitoring started")
|
||||
logger.info("✅ Analytics monitoring started")
|
||||
|
||||
# Let it run for a few seconds to generate data
|
||||
await asyncio.sleep(5)
|
||||
|
||||
# Get dashboard data
|
||||
dashboard = get_dashboard_data("BTC/USDT")
|
||||
print(f"📈 Dashboard data: {len(dashboard)} fields")
|
||||
logger.info(f"📈 Dashboard data: {len(dashboard)} fields")
|
||||
|
||||
# Get summary
|
||||
summary = get_analytics_summary()
|
||||
print(f"📊 Analytics summary: {summary}")
|
||||
logger.info(f"📊 Analytics summary: {summary}")
|
||||
|
||||
# Stop monitoring
|
||||
await stop_analytics_monitoring()
|
||||
print("📊 Analytics monitoring stopped")
|
||||
logger.info("📊 Analytics monitoring stopped")
|
||||
|
||||
print("🎉 Advanced Analytics test complete!")
|
||||
logger.info("🎉 Advanced Analytics test complete!")
|
||||
|
||||
if __name__ == "__main__":
|
||||
asyncio.run(test_advanced_analytics())
|
||||
|
||||
@@ -695,32 +695,32 @@ def analyze_behavior_patterns(user_id: str = None) -> Dict[str, Any]:
|
||||
# Test function
|
||||
async def test_ai_surveillance():
|
||||
"""Test AI surveillance system"""
|
||||
print("🤖 Testing AI Surveillance System...")
|
||||
logger.info("🤖 Testing AI Surveillance System...")
|
||||
|
||||
# Start surveillance
|
||||
await start_ai_surveillance(["BTC/USDT", "ETH/USDT"])
|
||||
print("✅ AI surveillance started")
|
||||
logger.info("✅ AI surveillance started")
|
||||
|
||||
# Let it run for data collection
|
||||
await asyncio.sleep(5)
|
||||
|
||||
# Get summary
|
||||
summary = get_surveillance_summary()
|
||||
print(f"📊 Surveillance summary: {summary}")
|
||||
logger.info(f"📊 Surveillance summary: {summary}")
|
||||
|
||||
# Get alerts
|
||||
alerts = list_active_alerts()
|
||||
print(f"🚨 Active alerts: {len(alerts)}")
|
||||
logger.info(f"🚨 Active alerts: {len(alerts)}")
|
||||
|
||||
# Analyze patterns
|
||||
patterns = analyze_behavior_patterns()
|
||||
print(f"🔍 Behavior patterns: {patterns}")
|
||||
logger.info(f"🔍 Behavior patterns: {patterns}")
|
||||
|
||||
# Stop surveillance
|
||||
await stop_ai_surveillance()
|
||||
print("🔍 AI surveillance stopped")
|
||||
logger.info("🔍 AI surveillance stopped")
|
||||
|
||||
print("🎉 AI Surveillance test complete!")
|
||||
logger.info("🎉 AI Surveillance test complete!")
|
||||
|
||||
if __name__ == "__main__":
|
||||
asyncio.run(test_ai_surveillance())
|
||||
|
||||
@@ -609,27 +609,27 @@ def get_engine_status() -> Dict[str, Any]:
|
||||
# Test function
|
||||
async def test_ai_trading_engine():
|
||||
"""Test AI trading engine"""
|
||||
print("🤖 Testing AI Trading Engine...")
|
||||
logger.info("🤖 Testing AI Trading Engine...")
|
||||
|
||||
# Initialize engine
|
||||
await initialize_ai_engine()
|
||||
|
||||
# Train strategies
|
||||
success = await train_strategies("BTC/USDT", 30)
|
||||
print(f"✅ Training successful: {success}")
|
||||
logger.info(f"✅ Training successful: {success}")
|
||||
|
||||
# Generate signals
|
||||
signals = await generate_trading_signals("BTC/USDT")
|
||||
print(f"📈 Generated {len(signals)} signals")
|
||||
logger.info(f"📈 Generated {len(signals)} signals")
|
||||
|
||||
for signal in signals:
|
||||
print(f" {signal['strategy']}: {signal['signal_type']} (confidence: {signal['confidence']:.2f})")
|
||||
logger.info(f" {signal['strategy']}: {signal['signal_type']} (confidence: {signal['confidence']:.2f})")
|
||||
|
||||
# Get status
|
||||
status = get_engine_status()
|
||||
print(f"📊 Engine Status: {status}")
|
||||
logger.info(f"📊 Engine Status: {status}")
|
||||
|
||||
print("🎉 AI Trading Engine test complete!")
|
||||
logger.info("🎉 AI Trading Engine test complete!")
|
||||
|
||||
if __name__ == "__main__":
|
||||
asyncio.run(test_ai_trading_engine())
|
||||
|
||||
@@ -1,3 +1,5 @@
|
||||
import logging
|
||||
logger = logging.getLogger(__name__)
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Bitcoin Wallet Integration for AITBC Exchange
|
||||
@@ -146,4 +148,4 @@ def get_wallet_info() -> Dict[str, any]:
|
||||
if __name__ == "__main__":
|
||||
# Test the wallet integration
|
||||
info = get_wallet_info()
|
||||
print(json.dumps(info, indent=2))
|
||||
logger.info(json.dumps(info, indent=2))
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import logging
|
||||
import httpx
|
||||
from collections import defaultdict, deque
|
||||
from datetime import datetime
|
||||
@@ -21,6 +22,8 @@ from ..schemas import (
|
||||
JobState,
|
||||
)
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
_STATUS_LABELS = {
|
||||
JobState.queued: "Queued",
|
||||
JobState.running: "Running",
|
||||
@@ -81,7 +84,7 @@ class ExplorerService:
|
||||
return BlockListResponse(items=items, next_offset=next_offset)
|
||||
except Exception as e:
|
||||
# Fallback to fake data if RPC is unavailable
|
||||
print(f"Warning: Failed to fetch blocks from RPC: {e}, falling back to fake data")
|
||||
logger.warning(f"Failed to fetch blocks from RPC: {e}, falling back to fake data")
|
||||
statement = select(Job).order_by(Job.requested_at.desc())
|
||||
jobs = self.session.execute(statement.offset(offset).limit(limit)).all()
|
||||
|
||||
|
||||
@@ -397,7 +397,7 @@ async def perform_aml_screening(user_id: str, user_data: Dict[str, Any]) -> Dict
|
||||
# Test function
|
||||
async def test_kyc_aml_integration():
|
||||
"""Test KYC/AML integration"""
|
||||
print("🧪 Testing KYC/AML Integration...")
|
||||
logger.info("🧪 Testing KYC/AML Integration...")
|
||||
|
||||
# Test KYC submission
|
||||
customer_data = {
|
||||
@@ -408,17 +408,17 @@ async def test_kyc_aml_integration():
|
||||
}
|
||||
|
||||
kyc_result = await submit_kyc_verification("user123", "chainalysis", customer_data)
|
||||
print(f"✅ KYC Submitted: {kyc_result}")
|
||||
logger.info(f"✅ KYC Submitted: {kyc_result}")
|
||||
|
||||
# Test KYC status check
|
||||
kyc_status = await check_kyc_status(kyc_result["request_id"], "chainalysis")
|
||||
print(f"📋 KYC Status: {kyc_status}")
|
||||
logger.info(f"📋 KYC Status: {kyc_status}")
|
||||
|
||||
# Test AML screening
|
||||
aml_result = await perform_aml_screening("user123", customer_data)
|
||||
print(f"🔍 AML Screening: {aml_result}")
|
||||
logger.info(f"🔍 AML Screening: {aml_result}")
|
||||
|
||||
print("🎉 KYC/AML integration test complete!")
|
||||
logger.info("🎉 KYC/AML integration test complete!")
|
||||
|
||||
if __name__ == "__main__":
|
||||
asyncio.run(test_kyc_aml_integration())
|
||||
|
||||
@@ -1,3 +1,5 @@
|
||||
import logging
|
||||
logger = logging.getLogger(__name__)
|
||||
"""
|
||||
Python 3.13.5 Optimized Services for AITBC Coordinator API
|
||||
|
||||
@@ -313,19 +315,19 @@ class ServiceFactory:
|
||||
|
||||
async def demonstrate_optimized_services():
|
||||
"""Demonstrate optimized services usage"""
|
||||
print("🚀 Python 3.13.5 Optimized Services Demo")
|
||||
print("=" * 50)
|
||||
logger.info("🚀 Python 3.13.5 Optimized Services Demo")
|
||||
logger.info("=" * 50)
|
||||
|
||||
# This would be used in actual application code
|
||||
print("\n✅ Services ready for Python 3.13.5 deployment:")
|
||||
print(" - OptimizedJobService with batch processing")
|
||||
print(" - OptimizedMinerService with enhanced validation")
|
||||
print(" - SecurityEnhancedService with improved hashing")
|
||||
print(" - PerformanceMonitor with real-time metrics")
|
||||
print(" - Generic base classes with type safety")
|
||||
print(" - @override decorators for method safety")
|
||||
print(" - Enhanced error messages for debugging")
|
||||
print(" - 5-10% performance improvements")
|
||||
logger.info("\n✅ Services ready for Python 3.13.5 deployment:")
|
||||
logger.info(" - OptimizedJobService with batch processing")
|
||||
logger.info(" - OptimizedMinerService with enhanced validation")
|
||||
logger.info(" - SecurityEnhancedService with improved hashing")
|
||||
logger.info(" - PerformanceMonitor with real-time metrics")
|
||||
logger.info(" - Generic base classes with type safety")
|
||||
logger.info(" - @override decorators for method safety")
|
||||
logger.info(" - Enhanced error messages for debugging")
|
||||
logger.info(" - 5-10% performance improvements")
|
||||
|
||||
if __name__ == "__main__":
|
||||
asyncio.run(demonstrate_optimized_services())
|
||||
|
||||
@@ -736,7 +736,7 @@ def list_reports(report_type: Optional[str] = None, status: Optional[str] = None
|
||||
# Test function
|
||||
async def test_regulatory_reporting():
|
||||
"""Test regulatory reporting system"""
|
||||
print("🧪 Testing Regulatory Reporting System...")
|
||||
logger.info("🧪 Testing Regulatory Reporting System...")
|
||||
|
||||
# Test SAR generation
|
||||
activities = [
|
||||
@@ -755,20 +755,20 @@ async def test_regulatory_reporting():
|
||||
]
|
||||
|
||||
sar_result = await generate_sar(activities)
|
||||
print(f"✅ SAR Report Generated: {sar_result['report_id']}")
|
||||
logger.info(f"✅ SAR Report Generated: {sar_result['report_id']}")
|
||||
|
||||
# Test compliance summary
|
||||
compliance_result = await generate_compliance_summary(
|
||||
"2026-01-01T00:00:00",
|
||||
"2026-01-31T23:59:59"
|
||||
)
|
||||
print(f"✅ Compliance Summary Generated: {compliance_result['report_id']}")
|
||||
logger.info(f"✅ Compliance Summary Generated: {compliance_result['report_id']}")
|
||||
|
||||
# List reports
|
||||
reports = list_reports()
|
||||
print(f"📋 Total Reports: {len(reports)}")
|
||||
logger.info(f"📋 Total Reports: {len(reports)}")
|
||||
|
||||
print("🎉 Regulatory reporting test complete!")
|
||||
logger.info("🎉 Regulatory reporting test complete!")
|
||||
|
||||
if __name__ == "__main__":
|
||||
asyncio.run(test_regulatory_reporting())
|
||||
|
||||
@@ -516,28 +516,28 @@ def get_surveillance_summary() -> Dict[str, Any]:
|
||||
# Test function
|
||||
async def test_trading_surveillance():
|
||||
"""Test trading surveillance system"""
|
||||
print("🧪 Testing Trading Surveillance System...")
|
||||
logger.info("🧪 Testing Trading Surveillance System...")
|
||||
|
||||
# Start monitoring
|
||||
await start_surveillance(["BTC/USDT", "ETH/USDT"])
|
||||
print("✅ Surveillance started")
|
||||
logger.info("✅ Surveillance started")
|
||||
|
||||
# Let it run for a few seconds to generate alerts
|
||||
await asyncio.sleep(5)
|
||||
|
||||
# Get alerts
|
||||
alerts = get_alerts()
|
||||
print(f"🚨 Generated {alerts['total']} alerts")
|
||||
logger.info(f"🚨 Generated {alerts['total']} alerts")
|
||||
|
||||
# Get summary
|
||||
summary = get_surveillance_summary()
|
||||
print(f"📊 Alert Summary: {summary}")
|
||||
logger.info(f"📊 Alert Summary: {summary}")
|
||||
|
||||
# Stop monitoring
|
||||
await stop_surveillance()
|
||||
print("🔍 Surveillance stopped")
|
||||
logger.info("🔍 Surveillance stopped")
|
||||
|
||||
print("🎉 Trading surveillance test complete!")
|
||||
logger.info("🎉 Trading surveillance test complete!")
|
||||
|
||||
if __name__ == "__main__":
|
||||
asyncio.run(test_trading_surveillance())
|
||||
|
||||
@@ -473,7 +473,7 @@ def monitor(ctx, realtime, interval):
|
||||
live.update(generate_monitor_table())
|
||||
time.sleep(interval)
|
||||
except KeyboardInterrupt:
|
||||
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
|
||||
console.click.echo("\n[yellow]Monitoring stopped by user[/yellow]")
|
||||
else:
|
||||
# Single snapshot
|
||||
overview = asyncio.run(comm.get_network_overview())
|
||||
|
||||
@@ -151,7 +151,7 @@ def monitor(ctx, realtime, interval, chain_id):
|
||||
live.update(generate_monitor_table())
|
||||
time.sleep(interval)
|
||||
except KeyboardInterrupt:
|
||||
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
|
||||
console.click.echo("\n[yellow]Monitoring stopped by user[/yellow]")
|
||||
else:
|
||||
# Single snapshot
|
||||
asyncio.run(analytics.collect_all_metrics())
|
||||
|
||||
@@ -513,7 +513,7 @@ def monitor(ctx, chain_id, realtime, export, interval):
|
||||
live.update(generate_monitor_layout())
|
||||
time.sleep(interval)
|
||||
except KeyboardInterrupt:
|
||||
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
|
||||
console.click.echo("\n[yellow]Monitoring stopped by user[/yellow]")
|
||||
else:
|
||||
# Single snapshot
|
||||
import asyncio
|
||||
|
||||
@@ -55,7 +55,7 @@ def rates(ctx, from_chain: Optional[str], to_chain: Optional[str],
|
||||
|
||||
if rate_table:
|
||||
headers = ["From Chain", "To Chain", "Rate"]
|
||||
print(tabulate(rate_table, headers=headers, tablefmt="grid"))
|
||||
click.echo(tabulate(rate_table, headers=headers, tablefmt="grid"))
|
||||
else:
|
||||
output("No cross-chain rates available")
|
||||
else:
|
||||
|
||||
@@ -316,7 +316,7 @@ def monitor(ctx, deployment_id, interval):
|
||||
live.update(generate_monitor_table())
|
||||
time.sleep(interval)
|
||||
except KeyboardInterrupt:
|
||||
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
|
||||
console.click.echo("\n[yellow]Monitoring stopped by user[/yellow]")
|
||||
|
||||
except Exception as e:
|
||||
error(f"Error during monitoring: {str(e)}")
|
||||
|
||||
@@ -792,7 +792,7 @@ def status(ctx, exchange: Optional[str]):
|
||||
|
||||
if health.error_message:
|
||||
error(f" Error: {health.error_message}")
|
||||
print()
|
||||
click.echo()
|
||||
|
||||
except ImportError:
|
||||
error("❌ Real exchange integration not available. Install ccxt library.")
|
||||
|
||||
@@ -463,7 +463,7 @@ def monitor(ctx, realtime, interval):
|
||||
live.update(generate_monitor_table())
|
||||
time.sleep(interval)
|
||||
except KeyboardInterrupt:
|
||||
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
|
||||
console.click.echo("\n[yellow]Monitoring stopped by user[/yellow]")
|
||||
else:
|
||||
# Single snapshot
|
||||
overview = asyncio.run(marketplace.get_marketplace_overview())
|
||||
|
||||
@@ -33,7 +33,7 @@ def dashboard(ctx, refresh: int, duration: int):
|
||||
|
||||
console.clear()
|
||||
console.rule("[bold blue]AITBC Dashboard[/bold blue]")
|
||||
console.print(f"[dim]Refreshing every {refresh}s | Elapsed: {int(elapsed)}s[/dim]\n")
|
||||
console.click.echo(f"[dim]Refreshing every {refresh}s | Elapsed: {int(elapsed)}s[/dim]\n")
|
||||
|
||||
# Fetch system dashboard
|
||||
try:
|
||||
@@ -47,39 +47,39 @@ def dashboard(ctx, refresh: int, duration: int):
|
||||
)
|
||||
if resp.status_code == 200:
|
||||
dashboard = resp.json()
|
||||
console.print("[bold green]Dashboard Status:[/bold green] Online")
|
||||
console.click.echo("[bold green]Dashboard Status:[/bold green] Online")
|
||||
|
||||
# Overall status
|
||||
overall_status = dashboard.get("overall_status", "unknown")
|
||||
console.print(f" Overall Status: {overall_status}")
|
||||
console.click.echo(f" Overall Status: {overall_status}")
|
||||
|
||||
# Services summary
|
||||
services = dashboard.get("services", {})
|
||||
console.print(f" Services: {len(services)}")
|
||||
console.click.echo(f" Services: {len(services)}")
|
||||
|
||||
for service_name, service_data in services.items():
|
||||
status = service_data.get("status", "unknown")
|
||||
console.print(f" {service_name}: {status}")
|
||||
console.click.echo(f" {service_name}: {status}")
|
||||
|
||||
# Metrics summary
|
||||
metrics = dashboard.get("metrics", {})
|
||||
if metrics:
|
||||
health_pct = metrics.get("health_percentage", 0)
|
||||
console.print(f" Health: {health_pct:.1f}%")
|
||||
console.click.echo(f" Health: {health_pct:.1f}%")
|
||||
|
||||
else:
|
||||
console.print(f"[bold yellow]Dashboard:[/bold yellow] HTTP {resp.status_code}")
|
||||
console.click.echo(f"[bold yellow]Dashboard:[/bold yellow] HTTP {resp.status_code}")
|
||||
except Exception as e:
|
||||
console.print(f"[bold red]Dashboard:[/bold red] Error - {e}")
|
||||
console.click.echo(f"[bold red]Dashboard:[/bold red] Error - {e}")
|
||||
|
||||
except Exception as e:
|
||||
console.print(f"[red]Error fetching data: {e}[/red]")
|
||||
console.click.echo(f"[red]Error fetching data: {e}[/red]")
|
||||
|
||||
console.print(f"\n[dim]Press Ctrl+C to exit[/dim]")
|
||||
console.click.echo(f"\n[dim]Press Ctrl+C to exit[/dim]")
|
||||
time.sleep(refresh)
|
||||
|
||||
except KeyboardInterrupt:
|
||||
console.print("\n[bold]Dashboard stopped[/bold]")
|
||||
console.click.echo("\n[bold]Dashboard stopped[/bold]")
|
||||
|
||||
|
||||
@monitor.command()
|
||||
|
||||
@@ -97,7 +97,7 @@ def chains(ctx, show_private, node_id):
|
||||
chains = await client.get_hosted_chains()
|
||||
return [(nid, chain) for chain in chains]
|
||||
except Exception as e:
|
||||
print(f"Error getting chains from node {nid}: {e}")
|
||||
click.echo(f"Error getting chains from node {nid}: {e}")
|
||||
return []
|
||||
|
||||
tasks.append(get_chains_for_node(node_id, node_config))
|
||||
@@ -328,7 +328,7 @@ def monitor(ctx, node_id, realtime, interval):
|
||||
live.update(generate_monitor_layout())
|
||||
time.sleep(interval)
|
||||
except KeyboardInterrupt:
|
||||
console.print("\n[yellow]Monitoring stopped by user[/yellow]")
|
||||
console.click.echo("\n[yellow]Monitoring stopped by user[/yellow]")
|
||||
else:
|
||||
# Single snapshot
|
||||
node_info = asyncio.run(get_node_stats())
|
||||
|
||||
@@ -19,8 +19,11 @@ if _src_path not in sys.path:
|
||||
|
||||
try:
|
||||
from app.services.regulatory_reporting import (
|
||||
generate_sar, generate_compliance_summary, list_reports,
|
||||
regulatory_reporter, ReportType, ReportStatus, RegulatoryBody
|
||||
generate_sar as generate_sar_svc,
|
||||
generate_compliance_summary as generate_compliance_summary_svc,
|
||||
list_reports,
|
||||
regulatory_reporter,
|
||||
ReportType, ReportStatus, RegulatoryBody
|
||||
)
|
||||
_import_error = None
|
||||
except ImportError as e:
|
||||
@@ -77,7 +80,7 @@ def generate_sar(ctx, user_id: str, activity_type: str, amount: float, descripti
|
||||
}
|
||||
|
||||
# Generate SAR
|
||||
result = asyncio.run(generate_sar([activity]))
|
||||
result = asyncio.run(generate_sar_svc([activity]))
|
||||
|
||||
click.echo(f"\n✅ SAR Report Generated Successfully!")
|
||||
click.echo(f"📋 Report ID: {result['report_id']}")
|
||||
@@ -110,7 +113,7 @@ def compliance_summary(ctx, period_start: str, period_end: str):
|
||||
click.echo(f"📈 Duration: {(end_date - start_date).days} days")
|
||||
|
||||
# Generate compliance summary
|
||||
result = asyncio.run(generate_compliance_summary(
|
||||
result = asyncio.run(generate_compliance_summary_svc(
|
||||
start_date.isoformat(),
|
||||
end_date.isoformat()
|
||||
))
|
||||
|
||||
138
scripts/claim-task.py
Normal file
138
scripts/claim-task.py
Normal file
@@ -0,0 +1,138 @@
|
||||
#!/usr/bin/env python3
|
||||
import os, sys, json, subprocess, random, time, logging
|
||||
from datetime import datetime
|
||||
|
||||
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
TOKEN = os.getenv("GITEA_TOKEN", "ffce3b62d583b761238ae00839dce7718acaad85")
|
||||
API_BASE = os.getenv("GITEA_API_BASE", "http://gitea.bubuit.net:3000/api/v1")
|
||||
REPO = "oib/aitbc"
|
||||
AGENT = os.getenv("AGENT_NAME", "aitbc")
|
||||
|
||||
def log(msg):
|
||||
logger.info(msg)
|
||||
|
||||
def die(msg):
|
||||
log(f"FATAL: {msg}")
|
||||
sys.exit(1)
|
||||
|
||||
def api_get(path):
|
||||
cmd = ["curl", "-s", "-H", f"Authorization: token {TOKEN}", f"{API_BASE}/{path}"]
|
||||
result = subprocess.run(cmd, capture_output=True, text=True)
|
||||
if result.returncode != 0:
|
||||
return None
|
||||
try:
|
||||
return json.loads(result.stdout)
|
||||
except:
|
||||
return None
|
||||
|
||||
def api_post(path, payload):
|
||||
cmd = ["curl", "-s", "-X", "POST", "-H", f"Authorization: token {TOKEN}", "-H", "Content-Type: application/json",
|
||||
f"{API_BASE}/{path}", "-d", json.dumps(payload)]
|
||||
result = subprocess.run(cmd, capture_output=True, text=True)
|
||||
if result.returncode != 0:
|
||||
return None
|
||||
try:
|
||||
return json.loads(result.stdout)
|
||||
except:
|
||||
return None
|
||||
|
||||
def compute_utility(issue):
|
||||
score = 0
|
||||
labels = [l['name'] for l in issue['labels']]
|
||||
if 'security' in labels:
|
||||
score += 100
|
||||
elif 'bug' in labels:
|
||||
score += 50
|
||||
elif 'feature' in labels:
|
||||
score += 30
|
||||
elif 'refactor' in labels:
|
||||
score += 10
|
||||
if 'good-first-task-for-agent' in labels:
|
||||
score += 20
|
||||
if any(tag in labels for tag in ['needs-design', 'blocked', 'needs-reproduction']):
|
||||
score -= 1000
|
||||
score += issue['comments'] * 5
|
||||
return score
|
||||
|
||||
def get_open_unassigned_issues():
|
||||
items = api_get(f"repos/{REPO}/issues?state=open") or []
|
||||
candidates = []
|
||||
for i in items:
|
||||
# Skip PRs: pull_request field is non-null for PRs
|
||||
if i.get('pull_request') is not None:
|
||||
continue
|
||||
if i['assignee'] is not None:
|
||||
continue
|
||||
labels = [l['name'] for l in i['labels']]
|
||||
if any(tag in labels for tag in ['needs-design', 'blocked', 'needs-reproduction']):
|
||||
continue
|
||||
candidates.append(i)
|
||||
return candidates
|
||||
|
||||
def create_claim_branch(issue_number):
|
||||
branch = f"claim/{issue_number}"
|
||||
subprocess.run(["git", "fetch", "origin"], capture_output=True, cwd=REPO_DIR)
|
||||
subprocess.run(["git", "checkout", "-B", branch, "origin/main"], capture_output=True, cwd=REPO_DIR)
|
||||
subprocess.run(["git", "commit", "--allow-empty", "-m", f"Claim issue #{issue_number} for {AGENT}"], capture_output=True, cwd=REPO_DIR)
|
||||
r = subprocess.run(["git", "push", "-u", "origin", branch], capture_output=True, cwd=REPO_DIR)
|
||||
return r.returncode == 0
|
||||
|
||||
def assign_issue(issue_number):
|
||||
payload = {"assignee": AGENT}
|
||||
result = api_post(f"repos/{REPO}/issues/{issue_number}/assignees", payload)
|
||||
return result is not None
|
||||
|
||||
def add_comment(issue_number, body):
|
||||
payload = {"body": body}
|
||||
return api_post(f"repos/{REPO}/issues/{issue_number}/comments", payload) is not None
|
||||
|
||||
def main():
|
||||
# Jitter 0-60s
|
||||
time.sleep(random.randint(0, 60))
|
||||
log("Claim task cycle starting...")
|
||||
state_file = os.path.join(REPO_DIR, ".claim-state.json")
|
||||
try:
|
||||
with open(state_file) as f:
|
||||
state = json.load(f)
|
||||
except:
|
||||
state = {}
|
||||
if state.get('current_claim'):
|
||||
log(f"Already working on issue #{state['current_claim']} (branch {state.get('work_branch')})")
|
||||
return
|
||||
issues = get_open_unassigned_issues()
|
||||
if not issues:
|
||||
log("No unassigned issues available.")
|
||||
return
|
||||
issues.sort(key=lambda i: compute_utility(i), reverse=True)
|
||||
for issue in issues:
|
||||
num = issue['number']
|
||||
title = issue['title']
|
||||
labels = [lbl['name'] for lbl in issue.get('labels', [])]
|
||||
log(f"Attempting to claim issue #{num}: {title} (labels={labels})")
|
||||
if create_claim_branch(num):
|
||||
assign_issue(num)
|
||||
slug = ''.join(c if c.isalnum() else '-' for c in title.lower())[:40].strip('-')
|
||||
work_branch = f"{AGENT}/{num}-{slug}"
|
||||
subprocess.run(["git", "checkout", "-b", work_branch, "main"], cwd=REPO_DIR, capture_output=True)
|
||||
state = {
|
||||
'current_claim': num,
|
||||
'claim_branch': f'claim/{num}',
|
||||
'work_branch': work_branch,
|
||||
'claimed_at': datetime.utcnow().isoformat() + 'Z',
|
||||
'issue_title': title,
|
||||
'labels': labels
|
||||
}
|
||||
with open(state_file, 'w') as f:
|
||||
json.dump(state, f, indent=2)
|
||||
add_comment(num, f"Agent `{AGENT}` claiming this task. (automated)")
|
||||
log(f"✅ Claimed issue #{num}. Work branch: {work_branch}")
|
||||
return
|
||||
else:
|
||||
log(f"Claim failed for #{num} (branch exists). Trying next...")
|
||||
log("Could not claim any issue; all taken or unavailable.")
|
||||
|
||||
if __name__ == "__main__":
|
||||
REPO_DIR = '/opt/aitbc'
|
||||
main()
|
||||
139
scripts/monitor-prs.py
Normal file
139
scripts/monitor-prs.py
Normal file
@@ -0,0 +1,139 @@
|
||||
#!/usr/bin/env python3
|
||||
import os, sys, json, subprocess, tempfile, shutil, re, time, random
|
||||
from datetime import datetime
|
||||
|
||||
TOKEN = os.getenv("GITEA_TOKEN", "ffce3b62d583b761238ae00839dce7718acaad85")
|
||||
API_BASE = os.getenv("GITEA_API_BASE", "http://gitea.bubuit.net:3000/api/v1")
|
||||
REPO = "oib/aitbc"
|
||||
AGENT = os.getenv("AGENT_NAME", "aitbc")
|
||||
OTHER = "aitbc1" if AGENT == "aitbc" else "aitbc"
|
||||
|
||||
RING_PREFIXES = [
|
||||
(0, ["packages/py/aitbc-core", "packages/py/aitbc-sdk"]),
|
||||
(1, ["apps/"]),
|
||||
(2, ["cli/", "analytics/", "tools/"]),
|
||||
]
|
||||
RING_THRESHOLD = {0: 0.9, 1: 0.8, 2: 0.7, 3: 0.5}
|
||||
|
||||
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
def log(msg):
|
||||
logger.info(msg)
|
||||
|
||||
def api_get(path):
|
||||
cmd = ["curl", "-s", "-H", f"Authorization: token {TOKEN}", f"{API_BASE}/{path}"]
|
||||
result = subprocess.run(cmd, capture_output=True, text=True)
|
||||
if result.returncode != 0:
|
||||
return None
|
||||
try:
|
||||
return json.loads(result.stdout)
|
||||
except:
|
||||
return None
|
||||
|
||||
def api_post(path, payload):
|
||||
cmd = ["curl", "-s", "-X", "POST", "-H", f"Authorization: token {TOKEN}", "-H", "Content-Type: application/json",
|
||||
f"{API_BASE}/{path}", "-d", json.dumps(payload)]
|
||||
result = subprocess.run(cmd, capture_output=True, text=True)
|
||||
if result.returncode != 0:
|
||||
return None
|
||||
try:
|
||||
return json.loads(result.stdout)
|
||||
except:
|
||||
return None
|
||||
|
||||
def get_open_prs():
|
||||
return api_get(f"repos/{REPO}/pulls?state=open") or []
|
||||
|
||||
def get_my_reviews(pr_number):
|
||||
return api_get(f"repos/{REPO}/pulls/{pr_number}/reviews") or []
|
||||
|
||||
def is_test_file(path):
|
||||
return '/tests/' in path or path.startswith('tests/') or path.endswith('_test.py')
|
||||
|
||||
def detect_ring(base_sha, head_sha):
|
||||
try:
|
||||
output = subprocess.run(
|
||||
["git", "diff", "--name-only", base_sha, head_sha],
|
||||
capture_output=True, text=True, check=True, cwd='/opt/aitbc'
|
||||
).stdout
|
||||
files = [f.strip() for f in output.splitlines() if f.strip()]
|
||||
except subprocess.CalledProcessError:
|
||||
files = []
|
||||
if files and all(is_test_file(f) for f in files):
|
||||
return 3
|
||||
for ring, prefixes in sorted(RING_PREFIXES, key=lambda x: x[0]):
|
||||
for p in files:
|
||||
if any(p.startswith(prefix) for prefix in prefixes):
|
||||
return ring
|
||||
return 3
|
||||
|
||||
def syntax_check(worktree):
|
||||
py_files = []
|
||||
for root, dirs, files in os.walk(worktree):
|
||||
for f in files:
|
||||
if f.endswith('.py'):
|
||||
py_files.append(os.path.join(root, f))
|
||||
for f in py_files:
|
||||
r = subprocess.run([sys.executable, '-m', 'py_compile', f], capture_output=True)
|
||||
if r.returncode != 0:
|
||||
return False, f"Syntax error in {os.path.relpath(f, worktree)}"
|
||||
return True, ""
|
||||
|
||||
def post_review(pr_number, event, body):
|
||||
payload = {"event": event, "body": body}
|
||||
return api_post(f"repos/{REPO}/pulls/{pr_number}/reviews", payload) is not None
|
||||
|
||||
def request_review(pr_number, reviewer):
|
||||
payload = {"reviewers": [reviewer]}
|
||||
return api_post(f"repos/{REPO}/pulls/{pr_number}/requested-reviewers", payload) is not None
|
||||
|
||||
def main():
|
||||
# Jitter 0-60s
|
||||
time.sleep(random.randint(0, 60))
|
||||
log("Fetching open PRs...")
|
||||
prs = get_open_prs()
|
||||
if not prs:
|
||||
log("No open PRs")
|
||||
return
|
||||
# Process sibling PRs
|
||||
for pr in prs:
|
||||
if pr['user']['login'] != OTHER:
|
||||
continue
|
||||
number = pr['number']
|
||||
title = pr['title'][:50]
|
||||
log(f"Reviewing sibling PR #{number}: {title}")
|
||||
# Checkout and validate
|
||||
tmp = tempfile.mkdtemp(prefix="aitbc_monitor_")
|
||||
try:
|
||||
subprocess.run(["git", "clone", "--no-checkout", "origin", tmp], capture_output=True, check=True)
|
||||
wt = os.path.join(tmp, "wt")
|
||||
os.makedirs(wt)
|
||||
subprocess.run(["git", "--git-dir", os.path.join(tmp, ".git"), "--work-tree", wt, "fetch", "origin", pr['head']['ref']], capture_output=True, check=True)
|
||||
subprocess.run(["git", "--git-dir", os.path.join(tmp, ".git"), "--work-tree", wt, "checkout", "FETCH_HEAD"], capture_output=True, check=True)
|
||||
ok, err = syntax_check(wt)
|
||||
if not ok:
|
||||
post_review(number, "REQUEST_CHANGES", f"❌ Syntax error: {err}")
|
||||
log(f"PR #{number} failed syntax: {err}")
|
||||
continue
|
||||
ring = detect_ring(pr['base']['sha'], pr['head']['sha'])
|
||||
threshold = RING_THRESHOLD.get(ring, 0.5)
|
||||
if ring == 0:
|
||||
post_review(number, "REQUEST_CHANGES", f"Ring 0 (core): needs manual review. Threshold: >{threshold}")
|
||||
log(f"PR #{number} is Ring0; manual review required")
|
||||
else:
|
||||
post_review(number, "APPROVE", f"✅ Auto‑approved (ring {ring}, threshold >{threshold})")
|
||||
log(f"PR #{number} approved")
|
||||
finally:
|
||||
shutil.rmtree(tmp, ignore_errors=True)
|
||||
# Our own PRs: request review from OTHER if not yet
|
||||
our_prs = [pr for pr in prs if pr['user']['login'] == AGENT]
|
||||
for pr in our_prs:
|
||||
number = pr['number']
|
||||
reviews = get_my_reviews(number)
|
||||
if not any(r['user']['login'] == OTHER for r in reviews):
|
||||
log(f"Requesting review from {OTHER} on our PR #{number}")
|
||||
request_review(number, OTHER)
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
120
scripts/qa-cycle.py
Executable file
120
scripts/qa-cycle.py
Executable file
@@ -0,0 +1,120 @@
|
||||
#!/usr/bin/env python3
|
||||
import os, sys, json, subprocess, tempfile, shutil, glob, re
|
||||
import random, time, logging
|
||||
from datetime import datetime
|
||||
|
||||
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
AGENT = os.getenv("AGENT_NAME", "aitbc")
|
||||
LOG_DIR = "/var/log/aitbc"
|
||||
QA_LOG = os.path.join(LOG_DIR, "qa_cycle.log")
|
||||
|
||||
def run_cmd(cmd, capture=True, check=False):
|
||||
result = subprocess.run(cmd, shell=True, capture_output=capture, text=True)
|
||||
if check and result.returncode != 0:
|
||||
logger.error(f"Command failed: {cmd}")
|
||||
logger.error(f"stderr: {result.stderr}")
|
||||
return None
|
||||
return result.stdout if capture else result.returncode
|
||||
|
||||
def feature_tests():
|
||||
logger.info("=== FEATURE TESTS ===")
|
||||
for mod in ["aiohttp", "fastapi", "click", "pydantic"]:
|
||||
cmd = f"python3 -c 'import {mod}; print({mod}.__version__)'"
|
||||
out = run_cmd(cmd)
|
||||
if out is None:
|
||||
logger.error(f"Import failed: {mod}")
|
||||
else:
|
||||
logger.info(f"{mod} import OK: {out.strip()}")
|
||||
out = run_cmd("/opt/aitbc/cli_venv/bin/aitbc --help")
|
||||
if out:
|
||||
logger.info("CLI help works")
|
||||
else:
|
||||
logger.error("CLI help failed")
|
||||
out = run_cmd("curl -s http://localhost:8010/health || true")
|
||||
if out and 'ok' in out.lower():
|
||||
logger.info("Brother chain health endpoint OK")
|
||||
else:
|
||||
logger.warning("Brother chain health check inconclusive")
|
||||
|
||||
def bug_sentinel():
|
||||
logger.info("=== BUG SENTINEL ===")
|
||||
logs = glob.glob("/var/log/aitbc/*.log")
|
||||
patterns = ["ERROR", "CRITICAL", "FATAL", "Traceback", "Exception"]
|
||||
found = []
|
||||
for logfile in logs:
|
||||
try:
|
||||
with open(logfile) as f:
|
||||
lines = f.readlines()[-200:]
|
||||
except:
|
||||
continue
|
||||
for line in lines:
|
||||
if any(p in line for p in patterns):
|
||||
found.append(f"{logfile}: {line.strip()}")
|
||||
if found:
|
||||
logger.warning(f"Found {len(found)} error patterns (sample):")
|
||||
for item in found[:5]:
|
||||
logger.warning(f" {item}")
|
||||
else:
|
||||
logger.info("No error patterns in recent logs")
|
||||
|
||||
def code_review():
|
||||
logger.info("=== CODE REVIEW ===")
|
||||
py_files = run_cmd("find /opt/aitbc -name '*.py' -type f | head -30").splitlines()
|
||||
issues = []
|
||||
for f in py_files:
|
||||
try:
|
||||
with open(f) as fp:
|
||||
content = fp.read()
|
||||
except:
|
||||
continue
|
||||
if re.search(r'except\s*:', content):
|
||||
issues.append(f"{f}: bare except")
|
||||
if re.search(r'def\s+\w+\s*\([^)]*=\s*[\[\{\}]', content):
|
||||
issues.append(f"{f}: mutable default argument")
|
||||
if 'print(' in content and 'if __name__' not in content:
|
||||
issues.append(f"{f}: print statement in library code")
|
||||
if re.search(r'(password|secret|key)\s*=\s*[\'"][^\'"]+[\'"]', content, re.IGNORECASE):
|
||||
issues.append(f"{f}: possible hardcoded secret")
|
||||
if issues:
|
||||
logger.warning(f"Found {len(issues)} code quality issues (sample):")
|
||||
for i in issues[:5]:
|
||||
logger.warning(f" {i}")
|
||||
else:
|
||||
logger.info("No obvious code quality issues detected")
|
||||
|
||||
def scenario_runner():
|
||||
logger.info("=== SCENARIO RUNNER ===")
|
||||
tmp = tempfile.mkdtemp(prefix="aitbc_qa_")
|
||||
try:
|
||||
run_cmd(f"git init {tmp}", check=True)
|
||||
run_cmd(f"git -C {tmp} config user.email 'qa@aitbc'", check=True)
|
||||
run_cmd(f"git -C {tmp} config user.name 'QA Agent'", check=True)
|
||||
run_cmd(f"echo 'test' > {tmp}/test.txt", check=True)
|
||||
run_cmd(f"git -C {tmp} add .", check=True)
|
||||
run_cmd(f"git -C {tmp} commit -m 'initial'", check=True)
|
||||
run_cmd(f"git -C {tmp} checkout -b claim/123", check=True)
|
||||
run_cmd(f"git -C {tmp} commit --allow-empty -m 'Claim issue 123'", check=True)
|
||||
logger.info("Git workflow test passed")
|
||||
except Exception as e:
|
||||
logger.error(f"Git workflow test failed: {e}")
|
||||
finally:
|
||||
shutil.rmtree(tmp, ignore_errors=True)
|
||||
|
||||
def main():
|
||||
logger.info(f"QA CYCLE STARTED for {AGENT}")
|
||||
start = datetime.now()
|
||||
# Jitter: 0-900 seconds (15 minutes)
|
||||
jitter = random.randint(0, 900)
|
||||
logger.info(f"Sleeping {jitter}s before start...")
|
||||
time.sleep(jitter)
|
||||
feature_tests()
|
||||
bug_sentinel()
|
||||
code_review()
|
||||
scenario_runner()
|
||||
elapsed = datetime.now() - start
|
||||
logger.info(f"QA CYCLE COMPLETED in {elapsed.total_seconds():.1f}s")
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
21
setup-cron.sh
Executable file
21
setup-cron.sh
Executable file
@@ -0,0 +1,21 @@
|
||||
#!/usr/bin/env bash
|
||||
set -e
|
||||
|
||||
LOG_DIR="/var/log/aitbc"
|
||||
mkdir -p "$LOG_DIR"
|
||||
chown $(whoami):$(whoami) "$LOG_DIR" 2>/dev/null || true
|
||||
|
||||
CRON_TMP=$(mktemp)
|
||||
|
||||
cat > "$CRON_TMP" <<EOF
|
||||
AGENT_NAME=$(whoami)
|
||||
PATH=/usr/local/bin:/usr/bin:/bin
|
||||
GITEA_TOKEN=ffce3b62d583b761238ae00839dce7718acaad85
|
||||
|
||||
*/10 * * * * sleep \$((RANDOM % 31)); /opt/aitbc/scripts/auto_review.py >> "$LOG_DIR/auto_review.log" 2>&1
|
||||
0,30 * * * * sleep \$((RANDOM % 16)); /opt/aitbc/scripts/claim-task.py >> "$LOG_DIR/claim_task.log" 2>&1
|
||||
EOF
|
||||
|
||||
crontab "$CRON_TMP"
|
||||
rm "$CRON_TMP"
|
||||
echo "Cron installed for $(whoami)"
|
||||
Reference in New Issue
Block a user