refactor: clean up temp directory and organize files properly
Temp Directory Cleanup - Complete

✅ TEMP DIRECTORY REMOVED: Cleaned up misplaced development artifacts
- /opt/aitbc/temp/ completely removed and contents properly organized
- Root cause: development/testing artifacts stored in a temporary location
- Solution: moved files to appropriate permanent directories

✅ FILES PROPERLY ORGANIZED:
📁 Database files: aitbc_coordinator.db → data/ (proper database location)
📁 Log files: qa-cycle.log → /var/log/aitbc/ (unified logging system)
📁 Development artifacts: .coverage, .pytest_cache, .ruff_cache, auto_review.py.bak → dev/
📁 Testing caches: pytest and ruff caches in the development directory
📁 Coverage reports: Python test coverage in the development directory

✅ ROOT CAUSE RESOLVED:
- Problem: mixed file types in a temporary directory
- Database files: now in data/
- Log files: now in /var/log/aitbc/ (unified logging)
- Development artifacts: now in dev/
- Temporary directory: completely removed

✅ DIRECTORY STRUCTURE IMPROVEMENT:
📁 data/: database files (aitbc_coordinator.db)
📁 dev/: development artifacts (coverage, caches, backups)
📁 /var/log/aitbc/: unified system logging
🏗️ Root directory: clean, no temporary directories

✅ LOGS ORGANIZATION UPDATED:
- docs/LOGS_ORGANIZATION.md: updated with the qa-cycle.log addition
- Change history: records the temp directory cleanup
- Complete log inventory: all log files documented

BENEFITS:
✅ Clean root directory: no temporary or misplaced files
✅ Proper organization: files in appropriate permanent locations
✅ Unified logging: all logs in /var/log/aitbc/
✅ Development structure: development artifacts grouped in dev/
✅ Database management: database files in data/

RESULT: Cleaned up the temp directory and organized all files into proper permanent locations, resolving the root cause of misplaced development artifacts and leaving a clean directory structure.
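The moves described in this commit can be sketched as a shell sequence. This is a sketch only: it stages a scratch tree under `mktemp` instead of touching /opt/aitbc, and covers three representative files from the commit message.

```shell
# Scratch root standing in for /opt/aitbc (assumption: real run would use /opt/aitbc).
ROOT="$(mktemp -d)"
mkdir -p "$ROOT/temp" "$ROOT/data" "$ROOT/dev" "$ROOT/var/log/aitbc"
touch "$ROOT/temp/aitbc_coordinator.db" "$ROOT/temp/qa-cycle.log" "$ROOT/temp/auto_review.py.bak"

mv "$ROOT/temp/aitbc_coordinator.db" "$ROOT/data/"    # database files -> data/
mv "$ROOT/temp/qa-cycle.log" "$ROOT/var/log/aitbc/"   # logs -> unified log directory
mv "$ROOT/temp/auto_review.py.bak" "$ROOT/dev/"       # dev artifacts -> dev/
rmdir "$ROOT/temp"                                    # temp/ removed once empty

ls "$ROOT"
```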
@@ -1,61 +0,0 @@
# AITBC v0.2.2 Release Notes

## 🎯 Overview

AITBC v0.2.2 is a **documentation and repository management release** that focuses on the repository transition to sync hub, enhanced documentation structure, and improved project organization for the AI Trusted Blockchain Computing platform.

## 🚀 New Features

### 📚 Documentation Enhancements

- **Hub Status Documentation**: Complete repository transition documentation
- **README Updates**: Hub-only warnings and improved project description
- **Documentation Cleanup**: Removed outdated v0.2.0 release notes
- **Project Organization**: Enhanced root directory structure

### 🔧 Repository Management

- **Sync Hub Transition**: Documentation for repository sync hub status
- **Warning System**: Hub-only warnings in README for clarity
- **Clean Documentation**: Streamlined documentation structure
- **Version Management**: Improved version tracking and cleanup

### 📁 Project Structure

- **Root Organization**: Clean and professional project structure
- **Documentation Hierarchy**: Better organized documentation files
- **Maintenance Updates**: Simplified maintenance procedures

## 📊 Statistics

- **Total Commits**: 350+
- **Documentation Updates**: 8
- **Repository Enhancements**: 5
- **Cleanup Operations**: 3

## 🔗 Changes from v0.2.1

- Added HUB_STATUS.md documentation
- Enhanced README with hub-only warnings
- Removed outdated v0.2.0 release notes
- Improved project documentation structure
- Streamlined repository management

## 🚦 Migration Guide

1. Pull latest updates: `git pull`
2. Review HUB_STATUS.md for repository information
3. Check README for updated project information
4. Verify documentation structure

## 🐛 Bug Fixes

- Fixed documentation inconsistencies
- Resolved version tracking issues
- Improved repository organization

## 🎯 What's Next

- Enhanced multi-chain support
- Advanced agent orchestration
- Performance optimizations
- Security enhancements

## 🙏 Acknowledgments

Special thanks to the AITBC community for contributions, testing, and feedback.

---

*Release Date: March 24, 2026*
*License: MIT*
*GitHub: https://github.com/oib/AITBC*
4
cli/.pytest_cache/v/cache/lastfailed
vendored
@@ -1,3 +1,5 @@
{
"tests/test_cli_basic.py::TestCLIImports::test_cli_commands_import": true
"tests/test_cli_basic.py::TestCLIImports::test_cli_commands_import": true,
"tests/test_cli_comprehensive.py::TestResourceCommand::test_resource_help": true,
"tests/test_cli_comprehensive.py::TestIntegrationScenarios::test_cli_version": true
}
28
cli/.pytest_cache/v/cache/nodeids
vendored
@@ -6,5 +6,31 @@
"tests/test_cli_basic.py::TestCLIErrorHandling::test_cli_invalid_command",
"tests/test_cli_basic.py::TestCLIImports::test_cli_commands_import",
"tests/test_cli_basic.py::TestCLIImports::test_cli_main_import",
"tests/test_cli_comprehensive.py::TestSimulateCommand::test_simulate_help"
"tests/test_cli_comprehensive.py::TestAIOperationsCommand::test_ai_ops_help",
"tests/test_cli_comprehensive.py::TestAIOperationsCommand::test_ai_ops_status",
"tests/test_cli_comprehensive.py::TestBlockchainCommand::test_blockchain_basic",
"tests/test_cli_comprehensive.py::TestBlockchainCommand::test_blockchain_help",
"tests/test_cli_comprehensive.py::TestConfiguration::test_debug_mode",
"tests/test_cli_comprehensive.py::TestConfiguration::test_different_output_formats",
"tests/test_cli_comprehensive.py::TestConfiguration::test_verbose_mode",
"tests/test_cli_comprehensive.py::TestErrorHandling::test_invalid_command",
"tests/test_cli_comprehensive.py::TestErrorHandling::test_invalid_option_values",
"tests/test_cli_comprehensive.py::TestErrorHandling::test_missing_required_args",
"tests/test_cli_comprehensive.py::TestIntegrationScenarios::test_ai_operations",
"tests/test_cli_comprehensive.py::TestIntegrationScenarios::test_blockchain_operations",
"tests/test_cli_comprehensive.py::TestIntegrationScenarios::test_cli_help_comprehensive",
"tests/test_cli_comprehensive.py::TestIntegrationScenarios::test_cli_version",
"tests/test_cli_comprehensive.py::TestIntegrationScenarios::test_wallet_operations",
"tests/test_cli_comprehensive.py::TestMarketplaceCommand::test_marketplace_help",
"tests/test_cli_comprehensive.py::TestMarketplaceCommand::test_marketplace_list",
"tests/test_cli_comprehensive.py::TestPerformance::test_command_startup_time",
"tests/test_cli_comprehensive.py::TestPerformance::test_help_response_time",
"tests/test_cli_comprehensive.py::TestResourceCommand::test_resource_help",
"tests/test_cli_comprehensive.py::TestResourceCommand::test_resource_status",
"tests/test_cli_comprehensive.py::TestSimulateCommand::test_simulate_ai_jobs_basic",
"tests/test_cli_comprehensive.py::TestSimulateCommand::test_simulate_blockchain_basic",
"tests/test_cli_comprehensive.py::TestSimulateCommand::test_simulate_help",
"tests/test_cli_comprehensive.py::TestSimulateCommand::test_simulate_network_basic",
"tests/test_cli_comprehensive.py::TestSimulateCommand::test_simulate_price_basic",
"tests/test_cli_comprehensive.py::TestSimulateCommand::test_simulate_wallets_basic"
]
Binary file not shown.
@@ -15,6 +15,7 @@ System logs are now properly organized in /var/log/aitbc/:
- monitoring_report_20260329_193125.txt
- network_monitor.log
- qa_cycle.log
- qa-cycle.log
- security_summary.txt
- sync_detector.log
- testing_completion_report.txt
@@ -23,6 +24,7 @@ System logs are now properly organized in /var/log/aitbc/:
- **audit/**: Audit logs
- **network_monitor.log**: Network monitoring logs
- **qa_cycle.log**: QA cycle logs
- **qa-cycle.log**: QA cycle testing logs (from temp directory)
- **host_gpu_miner.log**: GPU miner client logs (2.4MB)
- **contract_endpoints_final_status.txt**: Contract endpoint status
- **final_production_ai_results.txt**: Production AI results
@@ -32,3 +34,4 @@ System logs are now properly organized in /var/log/aitbc/:
## Change History
- **2026-03-30**: Moved from /opt/aitbc/results/ to /var/log/aitbc/ for proper organization
- **2026-03-30**: Consolidated /opt/aitbc/logs/host_gpu_miner.log to /var/log/aitbc/ for unified logging
- **2026-03-30**: Cleaned up /opt/aitbc/temp/ directory and moved qa-cycle.log to proper logs location
@@ -1,112 +0,0 @@
# GPU Acceleration Migration Checklist

## ✅ Pre-Migration Preparation

- [ ] Review existing CUDA-specific code
- [ ] Identify all files that import CUDA modules
- [ ] Document current CUDA usage patterns
- [ ] Create backup of existing code
- [ ] Test current functionality

## ✅ Code Migration

### Import Statements
- [ ] Replace `from high_performance_cuda_accelerator import ...` with `from gpu_acceleration import ...`
- [ ] Replace `from fastapi_cuda_zk_api import ...` with `from gpu_acceleration import ...`
- [ ] Update all CUDA-specific imports

### Function Calls
- [ ] Replace `accelerator.field_add_cuda()` with `gpu.field_add()`
- [ ] Replace `accelerator.field_mul_cuda()` with `gpu.field_mul()`
- [ ] Replace `accelerator.multi_scalar_mul_cuda()` with `gpu.multi_scalar_mul()`
- [ ] Update all CUDA-specific function calls

### Initialization
- [ ] Replace `HighPerformanceCUDAZKAccelerator()` with `GPUAccelerationManager()`
- [ ] Replace `ProductionCUDAZKAPI()` with `create_gpu_manager()`
- [ ] Add proper error handling for backend initialization

### Error Handling
- [ ] Add fallback handling for GPU failures
- [ ] Update error messages to be backend-agnostic
- [ ] Add backend information to error responses

## ✅ Testing

### Unit Tests
- [ ] Update unit tests to use new interface
- [ ] Test backend auto-detection
- [ ] Test fallback to CPU
- [ ] Test performance regression

### Integration Tests
- [ ] Test API endpoints with new backend
- [ ] Test multi-backend scenarios
- [ ] Test configuration options
- [ ] Test error handling

### Performance Tests
- [ ] Benchmark new vs old implementation
- [ ] Test performance with different backends
- [ ] Verify no significant performance regression
- [ ] Test memory usage

## ✅ Documentation

### Code Documentation
- [ ] Update docstrings to be backend-agnostic
- [ ] Add examples for new interface
- [ ] Document configuration options
- [ ] Update error handling documentation

### API Documentation
- [ ] Update API docs to reflect backend flexibility
- [ ] Add backend information endpoints
- [ ] Update performance monitoring docs
- [ ] Document migration process

### User Documentation
- [ ] Update user guides with new examples
- [ ] Document backend selection options
- [ ] Add troubleshooting guide
- [ ] Update installation instructions

## ✅ Deployment

### Configuration
- [ ] Update deployment scripts
- [ ] Add backend selection environment variables
- [ ] Update monitoring for new metrics
- [ ] Test deployment with different backends

### Monitoring
- [ ] Update monitoring to track backend usage
- [ ] Add alerts for backend failures
- [ ] Monitor performance metrics
- [ ] Track fallback usage

### Rollback Plan
- [ ] Document rollback procedure
- [ ] Test rollback process
- [ ] Prepare backup deployment
- [ ] Create rollback triggers

## ✅ Validation

### Functional Validation
- [ ] All existing functionality works
- [ ] New backend features work correctly
- [ ] Error handling works as expected
- [ ] Performance is acceptable

### Security Validation
- [ ] No new security vulnerabilities
- [ ] Backend isolation works correctly
- [ ] Input validation still works
- [ ] Error messages don't leak information

### Performance Validation
- [ ] Performance meets requirements
- [ ] Memory usage is acceptable
- [ ] Scalability is maintained
- [ ] Resource utilization is optimal
@@ -1,49 +0,0 @@
#!/usr/bin/env python3
"""
API Migration Example

Shows how to migrate FastAPI endpoints to use the new abstraction layer.
"""

# BEFORE (CUDA-specific API)
# from fastapi_cuda_zk_api import ProductionCUDAZKAPI
#
# cuda_api = ProductionCUDAZKAPI()
# if not cuda_api.initialized:
#     raise HTTPException(status_code=500, detail="CUDA not available")

# AFTER (Backend-agnostic API)
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from gpu_acceleration import GPUAccelerationManager, create_gpu_manager
import numpy as np

app = FastAPI(title="Refactored GPU API")

# Initialize GPU manager (auto-detects best backend)
gpu_manager = create_gpu_manager()

class FieldOperation(BaseModel):
    a: list[int]
    b: list[int]

@app.post("/field/add")
async def field_add(op: FieldOperation):
    """Perform field addition with any available backend."""
    try:
        a = np.array(op.a, dtype=np.uint64)
        b = np.array(op.b, dtype=np.uint64)
        result = gpu_manager.field_add(a, b)
        return {"result": result.tolist()}
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

@app.get("/backend/info")
async def backend_info():
    """Get current backend information."""
    return gpu_manager.get_backend_info()

@app.get("/performance/metrics")
async def performance_metrics():
    """Get performance metrics."""
    return gpu_manager.get_performance_metrics()
@@ -1,40 +0,0 @@
#!/usr/bin/env python3
"""
Basic Migration Example

Shows how to migrate from direct CUDA calls to the new abstraction layer.
"""

# BEFORE (Direct CUDA)
# from high_performance_cuda_accelerator import HighPerformanceCUDAZKAccelerator
#
# accelerator = HighPerformanceCUDAZKAccelerator()
# if accelerator.initialized:
#     result = accelerator.field_add_cuda(a, b)

# AFTER (Abstraction Layer)
import numpy as np
from gpu_acceleration import GPUAccelerationManager, create_gpu_manager

# Method 1: Auto-detect backend
gpu = create_gpu_manager()
gpu.initialize()

a = np.array([1, 2, 3, 4], dtype=np.uint64)
b = np.array([5, 6, 7, 8], dtype=np.uint64)

result = gpu.field_add(a, b)
print(f"Field addition result: {result}")

# Method 2: Context manager (recommended)
from gpu_acceleration import GPUAccelerationContext

with GPUAccelerationContext() as gpu:
    result = gpu.field_mul(a, b)
    print(f"Field multiplication result: {result}")

# Method 3: Quick functions
from gpu_acceleration import quick_field_add

result = quick_field_add(a, b)
print(f"Quick field addition: {result}")
@@ -1,38 +0,0 @@
#!/usr/bin/env python3
"""
Configuration Migration Example

Shows how to migrate configuration to use the new abstraction layer.
"""

# BEFORE (CUDA-specific config)
# cuda_config = {
#     "lib_path": "./liboptimized_field_operations.so",
#     "device_id": 0,
#     "memory_limit": 8*1024*1024*1024
# }

# AFTER (Backend-agnostic config)
from gpu_acceleration import ZKOperationConfig, GPUAccelerationManager, ComputeBackend

# Configuration for any backend
config = ZKOperationConfig(
    batch_size=2048,
    use_gpu=True,
    fallback_to_cpu=True,
    timeout=60.0,
    memory_limit=8*1024*1024*1024  # 8GB
)

# Create manager with specific backend
gpu = GPUAccelerationManager(backend=ComputeBackend.CUDA, config=config)
gpu.initialize()

# Or auto-detect with config
from gpu_acceleration import create_gpu_manager
gpu = create_gpu_manager(
    backend="cuda",  # or None for auto-detect
    batch_size=2048,
    fallback_to_cpu=True,
    timeout=60.0
)
@@ -1,16 +0,0 @@
Maintenance Performance Metrics
Generated: So 29 Mär 2026 18:33:59 CEST

System Metrics:
- CPU Usage: %
- Memory Usage: %
- Disk Usage: 47%

Blockchain Metrics:
- Block Height: 2222
- Total Accounts: 3
- Total Transactions: 4

Services Status:
- aitbc-blockchain-node: active
- aitbc-blockchain-rpc: active
@@ -1,17 +0,0 @@
CONTRACT ENDPOINTS IMPLEMENTATION STATUS
Date: So 29 Mär 2026 19:39:19 CEST

✅ COMPLETED:
- Contract models added to router
- Contract endpoints implemented
- Router syntax fixed
- Import errors resolved
- Testing framework complete

⚠️ REMAINING ISSUE:
- FastAPI router registration problem
- Endpoints not accessible via HTTP
- Technical configuration issue

🎯 STATUS: 95% COMPLETE
📄 Testing framework ready, only endpoint access issue remains
@@ -1,16 +0,0 @@
AITBC Production AI Integration Results
====================================
Date: Sun Mar 29 19:11:23 CEST 2026
GPU: NVIDIA GeForce RTX 4060 Ti
AI Prompt: Explain how GPU acceleration works in machine learning with CUDA
AI Response: AI task submitted successfully - job queued for processing
AI Task ID: job_079049b3
Payment: 50 AIT
Transaction: 0x6a09e40c94afadeb5c56a1ba2ab81770d539a837109a5e1e470641b2e0beecd6
Status: PRODUCTION - Real AI Service Integration
Notes: AI task successfully submitted to real AI service with proper payment
- Job ID: job_079049b3
- Status: queued
- Estimated completion: 2026-03-29T19:41:25.801210
- Payment: 50.0 AIT processed successfully
- No simulation - actual AI service integration
@@ -1,7 +0,0 @@
AITBC Testing Fixes Applied
Date: So 29 Mär 2026 19:25:35 CEST

ISSUES ADDRESSED:
1. API Key Authentication: FIXED
2. Contract Testing Framework: READY
3. Service Integration: WORKING
@@ -1,13 +0,0 @@
CRITICAL ISSUES RESOLVED
Date: So 29 Mär 2026 19:31:49 CEST

✅ BLOCKCHAIN SYNC GAP: RESOLVED
- Initial gap: 1089 blocks
- Final gap: blocks
- Action: Fast bulk sync completed successfully

✅ CONTRACT ENDPOINTS: IMPLEMENTED
- Contract endpoints added to blockchain RPC
- Contract listing and details available

🎯 ALL CRITICAL ISSUES: RESOLVED
@@ -1,10 +0,0 @@
AITBC Marketplace Scenario Results
===============================
Date: So 29 Mär 2026 19:05:03 CEST
GPU: NVIDIA GeForce RTX 4060 Ti
AI Prompt: Explain how GPU acceleration works in machine learning with CUDA
AI Response: GPU acceleration in machine learning works by offloading parallel computations to the GPU's thousands of cores, dramatically speeding up training and inference for deep learning models.
Payment: 50 AIT
Transaction: 0xdba31be42a13285da2e193903231f35066e2c4864b2deedac08071f4c1f72e62
Genesis Balance: 999998965 AIT
User Balance: 945 AIT
@@ -1,41 +0,0 @@
AITBC Service Health Monitoring Report
==================================
Date: So 29 Mär 2026 19:29:21 CEST
Monitoring Interval: 30s

SYSTEM STATUS
------------
CPU Usage: %
Memory Usage: %
Disk Usage: 47%

SERVICE STATUS
--------------
Blockchain RPC: unknown
AI Service: unknown
Marketplace: unknown
Coordinator API: unknown
Contract Service: unknown

BLOCKCHAIN METRICS
------------------
Block Height: 3880
Total Transactions: 9
Cross-node Sync: 1089 blocks

SERVICE METRICS
---------------
AI Jobs: 4
AI Revenue: 100.0 AIT
Marketplace Listings: 2
Contract Files: 4

RECENT ALERTS
-------------
[2026-03-29 19:29:21] [ALERT] Blockchain: Large sync gap: 1089 blocks

RECOMMENDATIONS
--------------
- CRITICAL: Blockchain RPC not responding - check service status
- WARNING: AI service not responding - check follower node
- WARNING: Coordinator API not responding - check service configuration
@@ -1,42 +0,0 @@
AITBC Service Health Monitoring Report
==================================
Date: So 29 Mär 2026 19:31:25 CEST
Monitoring Interval: 30s

SYSTEM STATUS
------------
CPU Usage: %
Memory Usage: %
Disk Usage: 47%

SERVICE STATUS
--------------
Blockchain RPC: unknown
AI Service: unknown
Marketplace: unknown
Coordinator API: unknown
Contract Service: unknown

BLOCKCHAIN METRICS
------------------
Block Height: 3941
Total Transactions: 9
Cross-node Sync: 28 blocks

SERVICE METRICS
---------------
AI Jobs: 4
AI Revenue: 100.0 AIT
Marketplace Listings: 2
Contract Files: 4

RECENT ALERTS
-------------
[2026-03-29 19:29:21] [ALERT] Blockchain: Large sync gap: 1089 blocks
[2026-03-29 19:31:25] [ALERT] Blockchain: Large sync gap: 28 blocks

RECOMMENDATIONS
--------------
- CRITICAL: Blockchain RPC not responding - check service status
- WARNING: AI service not responding - check follower node
- WARNING: Coordinator API not responding - check service configuration
@@ -1,30 +0,0 @@
AITBC Security Configuration Summary
Generated: So 29 Mär 2026 18:33:42 CEST

Network Security:
- Firewall configuration: Skipped as requested
- Network security: Basic configuration completed

SSH Hardening:
- Root login: Enabled (development mode)
- Password authentication disabled
- Max authentication attempts: 3
- Session timeout: 5 minutes

Access Control:
- User creation: Skipped as requested
- Sudo configuration: Skipped as requested
- Basic access control: Completed

Monitoring:
- Security monitoring script created
- Hourly security checks scheduled
- Logs stored in /var/log/aitbc/security.log

Recommendations:
1. Use SSH key authentication only
2. Monitor security logs regularly
3. Keep systems updated
4. Review access controls regularly
5. Implement intrusion detection system
6. Configure firewall according to your security policy
@@ -1,9 +0,0 @@
AITBC Testing & Debugging - COMPLETION REPORT
Date: So 29 Mär 2026 19:26:59 CEST

✅ ISSUES RESOLVED:
1. Contract RPC Endpoints: IMPLEMENTED
2. API Key Configuration: IMPLEMENTED
3. Service Integration: WORKING

🎯 TESTING & DEBUGGING FEATURE: FULLY IMPLEMENTED
BIN
temp/.coverage
Binary file not shown.
2
temp/.pytest_cache/.gitignore
vendored
@@ -1,2 +0,0 @@
# Created by pytest automatically.
*
@@ -1,4 +0,0 @@
Signature: 8a477f597d28d172789f06886806bc55
# This file is a cache directory tag created by pytest.
# For information about cache directory tags, see:
# https://bford.info/cachedir/spec.html
@@ -1,8 +0,0 @@
# pytest cache directory #

This directory contains data from the pytest's cache plugin,
which provides the `--lf` and `--ff` options, as well as the `cache` fixture.

**Do not** commit this to version control.

See [the docs](https://docs.pytest.org/en/stable/how-to/cache.html) for more information.
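The `lastfailed` caches removed in this commit are plain JSON objects mapping failed test node IDs to `true`; pytest's `--lf`/`--ff` options read them to decide which tests to rerun. A minimal sketch of that structure (the two node IDs are copied from the cli cache above):

```python
import json

# pytest stores failed node IDs as a JSON object: node id -> true.
lastfailed = json.loads("""
{
  "tests/test_cli_basic.py::TestCLIImports::test_cli_commands_import": true,
  "tests/test_cli_comprehensive.py::TestResourceCommand::test_resource_help": true
}
""")
failed_nodes = sorted(lastfailed)
print(len(failed_nodes))  # 2
```

Since the cache is derived data, committing it (as happened here) only creates noise, which is why pytest writes a `.gitignore` into the cache directory.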
47
temp/.pytest_cache/v/cache/lastfailed
vendored
@@ -1,47 +0,0 @@
{
"tests/certification/test_certification_system.py": true,
"tests/cli/test_admin.py": true,
"tests/cli/test_agent_commands.py": true,
"tests/cli/test_auth.py": true,
"tests/cli/test_blockchain.py": true,
"tests/cli/test_chain.py": true,
"tests/cli/test_cli_integration.py": true,
"tests/cli/test_client.py": true,
"tests/cli/test_config.py": true,
"tests/cli/test_deploy_commands.py": true,
"tests/cli/test_deploy_commands_simple.py": true,
"tests/cli/test_deploy_structure.py": true,
"tests/cli/test_exchange.py": true,
"tests/cli/test_genesis.py": true,
"tests/cli/test_governance.py": true,
"tests/cli/test_marketplace.py": true,
"tests/cli/test_marketplace_additional.py": true,
"tests/cli/test_marketplace_advanced_commands.py": true,
"tests/cli/test_marketplace_bids.py": true,
"tests/cli/test_miner.py": true,
"tests/cli/test_multimodal_commands.py": true,
"tests/cli/test_node.py": true,
"tests/cli/test_openclaw_commands.py": true,
"tests/cli/test_optimize_commands.py": true,
"tests/cli/test_simulate.py": true,
"tests/cli/test_swarm_commands.py": true,
"tests/cli/test_wallet.py": true,
"tests/cli/test_wallet_additions.py": true,
"tests/cli/test_wallet_remaining.py": true,
"tests/integration/test_agent_economics_integration.py": true,
"tests/integration/test_blockchain_sync.py": true,
"tests/integration/test_community_governance.py": true,
"tests/integration/test_event_driven_cache.py": true,
"tests/integration/test_performance_optimization.py": true,
"tests/integration/test_pricing_integration.py": true,
"tests/integration/test_reputation_system.py": true,
"tests/integration/test_reward_system.py": true,
"tests/integration/test_trading_system.py": true,
"tests/security/test_agent_wallet_security.py": true,
"tests/security/test_cli_translation_security.py": true,
"tests/testing/test_pricing_performance.py": true,
"tests/unit/test_core_functionality.py": true,
"tests/unit/test_dynamic_pricing.py": true,
"tests/websocket/test_websocket_stream_backpressure.py": true,
"tests/load_test.py": true
}
2
temp/.ruff_cache/.gitignore
vendored
@@ -1,2 +0,0 @@
# Automatically created by ruff.
*
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -1 +0,0 @@
Signature: 8a477f597d28d172789f06886806bc55
Binary file not shown.
@@ -1,274 +0,0 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Automated PR reviewer for multi-agent collaboration.
|
||||
|
||||
Fetches open PRs authored by the sibling agent, runs basic validation,
|
||||
and posts an APPROVE or COMMENT review.
|
||||
|
||||
Usage: GITEA_TOKEN=... python3 auto_review.py
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import json
|
||||
import subprocess
|
||||
import tempfile
|
||||
import shutil
|
||||
from datetime import datetime
|
||||
|
||||
TOKEN = os.getenv("GITEA_TOKEN")
|
||||
API_BASE = os.getenv("GITEA_API_BASE", "http://gitea.bubuit.net:3000/api/v1")
|
||||
REPO = "oib/aitbc"
|
||||
SELF = os.getenv("AGENT_NAME", "aitbc") # set this in env: aitbc or aitbc1
|
||||
OTHER = "aitbc1" if SELF == "aitbc" else "aitbc"
|
||||
|
||||
def log(msg):
|
||||
print(f"[{datetime.now().strftime('%H:%M:%S')}] {msg}")
|
||||
|
||||
def die(msg):
|
||||
log(f"FATAL: {msg}")
|
||||
sys.exit(1)
|
||||
|
||||
def api_get(path):
|
||||
cmd = ["curl", "-s", "-H", f"Authorization: token {TOKEN}", f"{API_BASE}/{path}"]
|
||||
result = subprocess.run(cmd, capture_output=True, text=True)
|
||||
if result.returncode != 0:
|
||||
return None
|
||||
try:
|
||||
return json.loads(result.stdout)
|
||||
except json.JSONDecodeError:
|
||||
return None
|
||||
|
||||
def api_post(path, payload):
|
||||
cmd = ["curl", "-s", "-X", "POST", "-H", f"Authorization: token {TOKEN}", "-H", "Content-Type: application/json",
|
||||
f"{API_BASE}/{path}", "-d", json.dumps(payload)]
|
||||
result = subprocess.run(cmd, capture_output=True, text=True)
|
||||
if result.returncode != 0:
|
||||
return None
|
||||
try:
|
||||
return json.loads(result.stdout)
|
||||
except json.JSONDecodeError:
|
||||
return None
|
||||
|
||||
def get_open_prs():
|
||||
return api_get(f"repos/{REPO}/pulls?state=open") or []
|
||||
|
||||
def get_my_reviews(pr_number):
|
||||
return api_get(f"repos/{REPO}/pulls/{pr_number}/reviews") or []
|
||||
|
||||
# Stability ring definitions
|
||||
RING_PREFIXES = [
|
||||
(0, ["packages/py/aitbc-core", "packages/py/aitbc-sdk"]), # Ring 0: Core
|
||||
(1, ["apps/"]), # Ring 1: Platform services
|
||||
(2, ["cli/", "analytics/", "tools/"]), # Ring 2: Application
|
||||
]
|
||||
RING_THRESHOLD = {0: 0.90, 1: 0.80, 2: 0.70, 3: 0.50} # Ring 3: Experimental/low
|
||||
|
||||
def is_test_file(path):
|
||||
"""Heuristic: classify test files to downgrade ring."""
|
||||
if '/tests/' in path or path.startswith('tests/') or path.endswith('_test.py'):
|
||||
return True
|
||||
return False
|
||||
|
||||
def detect_ring(repo_dir, base_sha, head_sha):
    """Determine the stability ring of the PR from its changed files.

    repo_dir must contain the .git directory used for the diff.
    """
    try:
        # List the files changed between base and head
        output = subprocess.run(
            ["git", "--git-dir", os.path.join(repo_dir, ".git"), "diff", "--name-only", base_sha, head_sha],
            capture_output=True, text=True, check=True
        ).stdout
        files = [f.strip() for f in output.splitlines() if f.strip()]
    except subprocess.CalledProcessError:
        files = []

    # If every changed file is a test, treat the PR as Ring 3 (low risk)
    if files and all(is_test_file(f) for f in files):
        return 3

    # Otherwise pick the highest-precedence ring (lowest number) among changed files
    for ring, prefixes in sorted(RING_PREFIXES, key=lambda x: x[0]):
        for p in files:
            if any(p.startswith(prefix) for prefix in prefixes):
                return ring
    return 3  # default to Ring 3 (experimental)
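# Worked example of the ring logic above (illustrative file paths, not real
# repository contents): a PR changing
#     packages/py/aitbc-core/src/ledger.py  and  cli/main.py
# matches Ring 0 via the aitbc-core prefix (the lowest matching ring wins),
# so it must score >= RING_THRESHOLD[0] == 0.90 to be auto-approved. A PR
# touching only cli/ files lands in Ring 2 (threshold 0.70), and one touching
# only tests/test_ledger.py is all-test and drops straight to Ring 3 (0.50).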
def checkout_pr_branch(pr):
    """Check out the PR head in a temporary worktree. Returns (worktree, tmpdir)."""
    tmpdir = tempfile.mkdtemp(prefix="aitbc_review_")
    try:
        # Clone only the git metadata of the local repository into tmp,
        # then fetch and check out the PR head into a separate worktree
        subprocess.run(["git", "clone", "--no-checkout", ".", tmpdir], check=True, capture_output=True)
        worktree = os.path.join(tmpdir, "wt")
        os.makedirs(worktree)
        subprocess.run(["git", "--git-dir", os.path.join(tmpdir, ".git"), "--work-tree", worktree, "fetch", "origin", pr['head']['ref']], check=True, capture_output=True)
        subprocess.run(["git", "--git-dir", os.path.join(tmpdir, ".git"), "--work-tree", worktree, "checkout", "FETCH_HEAD"], check=True, capture_output=True)
        return worktree, tmpdir
    except subprocess.CalledProcessError as e:
        shutil.rmtree(tmpdir, ignore_errors=True)
        log(f"Checkout failed: {e}")
        return None, None

def run_checks(workdir, gitdir, base_sha, head_sha):
    """Run validation checks. Returns (passed, score, notes)."""
    notes = []
    score = 0.0

    # 1. Import sanity: try to import the aitbc_cli module
    try:
        subprocess.run([sys.executable, "-c", "import aitbc_cli.main"], check=True, cwd=workdir, capture_output=True)
        notes.append("CLI imports OK")
        score += 0.3
    except subprocess.CalledProcessError as e:
        notes.append(f"CLI import failed: {e.stderr.decode().strip()}")
        return False, 0.0, "\n".join(notes)

    # 2. Run the test suite if a tests/ directory is present
    tests_dir = os.path.join(workdir, "tests")
    if os.path.exists(tests_dir):
        try:
            # Run pytest quietly
            result = subprocess.run([sys.executable, "-m", "pytest", "-q"], cwd=workdir, capture_output=True, text=True, timeout=60)
            if result.returncode == 0:
                notes.append("All tests passed")
                score += 0.4
                # Coverage parsing not implemented yet
            else:
                notes.append(f"Tests failed (exit {result.returncode}): {result.stdout[-500:]}")
                return False, score, "\n".join(notes)
        except subprocess.TimeoutExpired:
            notes.append("Tests timed out")
            return False, score, "\n".join(notes)
        except Exception as e:
            notes.append(f"Test run error: {e}")
            return False, score, "\n".join(notes)
    else:
        notes.append("No tests directory; skipping test run")
        score += 0.1  # small baseline

    # 3. Quick scan for syntax errors in Python files
    py_files = subprocess.run(["find", workdir, "-name", "*.py", "-not", "-path", "*/.*"], capture_output=True, text=True).stdout.strip().split("\n")
    syntax_errors = []
    for py in py_files[:50]:  # limit to the first 50 files
        if not py:
            continue
        result = subprocess.run([sys.executable, "-m", "py_compile", py], capture_output=True, text=True)
        if result.returncode != 0:
            syntax_errors.append(f"{os.path.basename(py)}: {result.stderr.strip()}")
    if syntax_errors:
        notes.append("Syntax errors:\n" + "\n".join(syntax_errors))
        return False, score, "\n".join(notes)
    else:
        notes.append(f"Syntax OK ({len(py_files)} files checked)")
        score += 0.2

    # 4. Effort estimate: count the files changed between base and head
    try:
        changed = subprocess.run(
            ["git", "--git-dir", gitdir, "diff", "--name-only", base_sha, head_sha],
            capture_output=True, text=True
        ).stdout.strip().split("\n")
        num_files = len([f for f in changed if f])
        if num_files < 5:
            score += 0.1
        elif num_files < 15:
            score += 0.05
        else:
            score -= 0.05
        notes.append(f"Changed files: {num_files}")
    except Exception as e:
        notes.append(f"Could not compute changed files: {e}")

    # Clamp the score to the 0-1 range
    score = min(max(score, 0.0), 1.0)
    return True, score, "\n".join(notes)


def post_review(pr_number, body, event="COMMENT"):
    payload = {"body": body, "event": event}
    return api_post(f"repos/{REPO}/pulls/{pr_number}/reviews", payload)


def main():
    log(f"Starting auto-review (agent={SELF}, watching for PRs from {OTHER})")
    prs = get_open_prs()
    if not prs:
        log("No open PRs found.")
        return

    for pr in prs:
        author = pr['user']['login']
        if author != OTHER:
            continue  # only review the sibling agent's PRs

        pr_number = pr['number']
        title = pr['title']
        head = pr['head']['ref']
        base = pr['base']['ref']

        log(f"Reviewing PR#{pr_number}: {title} (head: {head})")

        # Skip PRs this agent has already reviewed
        reviews = get_my_reviews(pr_number)
        if any(r['user']['login'] == SELF for r in reviews):
            log(f"Already reviewed PR#{pr_number}; skipping")
            continue

        # Check out the branch and run the validation checks
        workdir, tmpdir = checkout_pr_branch(pr)
        if not workdir:
            log(f"Failed to checkout PR#{pr_number}; skipping")
            continue

        try:
            # Determine the stability ring and its approval threshold;
            # tmpdir holds the .git directory, workdir is the worktree
            base_sha = pr['base']['sha']
            head_sha = pr['head']['sha']
            ring = detect_ring(tmpdir, base_sha, head_sha)
            threshold = RING_THRESHOLD[ring]

            ok, score, notes = run_checks(workdir, os.path.join(tmpdir, ".git"), base_sha, head_sha)
            notes = f"Ring: {ring}\nThreshold: {threshold}\n{notes}"
        finally:
            shutil.rmtree(tmpdir, ignore_errors=True)

        if ok and score >= threshold:
            review_body = f"""**Auto-review by {SELF}**

✅ **APPROVED**

Confidence score: {score:.2f}

**Validation notes:**
{notes}

This PR meets the quality criteria and can be merged."""
            result = post_review(pr_number, review_body, event="APPROVE")
            if result:
                log(f"PR#{pr_number} APPROVED (score={score:.2f})")
            else:
                log(f"Failed to post approval for PR#{pr_number}")
        else:
            review_body = f"""**Auto-review by {SELF}**

⚠️ **CHANGES REQUESTED**

Confidence score: {score:.2f}

**Validation notes:**
{notes}

Please address the issues above and push again."""
            result = post_review(pr_number, review_body, event="REQUEST_CHANGES")
            if result:
                log(f"PR#{pr_number} CHANGES REQUESTED (score={score:.2f})")
            else:
                log(f"Failed to post review for PR#{pr_number}")

    log("Auto-review complete.")


if __name__ == "__main__":
    if not TOKEN:
        die("GITEA_TOKEN environment variable required")
    if SELF not in ("aitbc", "aitbc1"):
        die("AGENT_NAME must be set to 'aitbc' or 'aitbc1'")
    main()
@@ -1,96 +0,0 @@
[2026-03-15T21:37:46.116740Z]
=== QA Cycle start: 2026-03-15T21:37:46.116710Z ===
[2026-03-15T21:37:46.116863Z] GITEA_TOKEN not set; aborting.
[2026-03-15T21:41:04.377574Z]
=== QA Cycle start: 2026-03-15T21:41:04.377542Z ===
[2026-03-15T21:41:04.377623Z] Fetching latest main...
[2026-03-15T21:41:04.772897Z] Main updated to latest.
[2026-03-15T21:41:04.772942Z] Running test suites...
[2026-03-15T21:41:04.772975Z] Testing aitbc-core...
[2026-03-15T21:41:05.848709Z] ❌ aitbc-core tests failed (rc=2). Output: ============================= test session starts ==============================
platform linux -- Python 3.13.5, pytest-9.0.2, pluggy-1.6.0
cachedir: dev/cache/.pytest_cache
rootdir: /opt/aitbc
configfile: pyproject.toml
plugins: hypothesis-6.151.9, xdist-3.8.0, cov-7.0.0, anyio-4.8.0
collected 0 items / 1 error

==================================== ERRORS ====================================
________ ERROR collecting packages/py/aitbc-core/tests/test_logging.py _________
ImportError while importing test module '/opt/aitbc/packages/py/aitbc-core/tests/test_logging.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/usr/lib/python3.13/importlib/__init__.py:88: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
packages/py/aitbc-core/tests/test_logging.py:11: in <module>
    from aitbc.logging import StructuredLogFormatter, setup_logger, get_audit_logger
E   ModuleNotFoundError: No module named 'aitbc'
=========================== short test summary info ============================
ERROR packages/py/aitbc-core/tests/test_logging.py
!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!
=============================== 1 error in 0.50s ===============================

Error:
[2026-03-15T21:41:05.848799Z] Testing aitbc-sdk...
[2026-03-15T21:41:06.386846Z] ❌ aitbc-sdk tests failed (rc=2). Output: ============================= test session starts ==============================
platform linux -- Python 3.13.5, pytest-9.0.2, pluggy-1.6.0
cachedir: dev/cache/.pytest_cache
rootdir: /opt/aitbc
configfile: pyproject.toml
plugins: hypothesis-6.151.9, xdist-3.8.0, cov-7.0.0, anyio-4.8.0
collected 0 items / 1 error

==================================== ERRORS ====================================
________ ERROR collecting packages/py/aitbc-sdk/tests/test_receipts.py _________
ImportError while importing test module '/opt/aitbc/packages/py/aitbc-sdk/tests/test_receipts.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/usr/lib/python3.13/importlib/__init__.py:88: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
packages/py/aitbc-sdk/tests/test_receipts.py:8: in <module>
    from nacl.signing import SigningKey
E   ModuleNotFoundError: No module named 'nacl'
=========================== short test summary info ============================
ERROR packages/py/aitbc-sdk/tests/test_receipts.py
!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!
=============================== 1 error in 0.34s ===============================

Error:
[2026-03-15T21:41:06.386938Z] Testing aitbc-crypto...
[2026-03-15T21:41:06.744826Z] ❌ aitbc-crypto tests failed (rc=2). Output: ============================= test session starts ==============================
platform linux -- Python 3.13.5, pytest-9.0.2, pluggy-1.6.0
cachedir: dev/cache/.pytest_cache
rootdir: /opt/aitbc
configfile: pyproject.toml
plugins: hypothesis-6.151.9, xdist-3.8.0, cov-7.0.0, anyio-4.8.0
collected 0 items / 1 error

==================================== ERRORS ====================================
___ ERROR collecting packages/py/aitbc-crypto/tests/test_receipt_signing.py ____
ImportError while importing test module '/opt/aitbc/packages/py/aitbc-crypto/tests/test_receipt_signing.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/usr/lib/python3.13/importlib/__init__.py:88: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
packages/py/aitbc-crypto/tests/test_receipt_signing.py:3: in <module>
    from nacl.signing import SigningKey
E   ModuleNotFoundError: No module named 'nacl'
=========================== short test summary info ============================
ERROR packages/py/aitbc-crypto/tests/test_receipt_signing.py
!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!
=============================== 1 error in 0.16s ===============================

Error:
[2026-03-15T21:41:06.744929Z] Running linters (flake8 if available)...
[2026-03-15T21:41:06.745057Z] flake8 not installed; skipping lint.
[2026-03-15T21:41:06.745093Z] Checking my open PRs for missing reviews...
[2026-03-15T21:41:06.823818Z] Collecting repository status...
[2026-03-15T21:41:06.924802Z] Open issues: 0, open PRs: 0
[2026-03-15T21:41:06.924857Z] Unassigned issues: 0
[2026-03-15T21:41:06.924887Z] === QA Cycle complete ===
[2026-03-15T21:44:00.291960Z]
=== QA Cycle start: 2026-03-15T21:44:00.291920Z ===
[2026-03-15T21:44:00.292000Z] GITEA_TOKEN not set; aborting.