feat: major infrastructure refactoring and optimization
All checks were successful
AITBC CLI Level 1 Commands Test / test-cli-level1 (push) Successful in 16s
api-endpoint-tests / test-api-endpoints (push) Successful in 35s
integration-tests / test-service-integration (push) Successful in 1m25s
package-tests / test-python-packages (map[name:aitbc-agent-sdk path:packages/py/aitbc-agent-sdk python_version:3.13]) (push) Successful in 16s
package-tests / test-python-packages (map[name:aitbc-cli path:. python_version:3.13]) (push) Successful in 14s
package-tests / test-python-packages (map[name:aitbc-core path:packages/py/aitbc-core python_version:3.13]) (push) Successful in 13s
package-tests / test-python-packages (map[name:aitbc-crypto path:packages/py/aitbc-crypto python_version:3.13]) (push) Successful in 10s
package-tests / test-python-packages (map[name:aitbc-sdk path:packages/py/aitbc-sdk python_version:3.13]) (push) Successful in 12s
package-tests / test-javascript-packages (map[name:aitbc-sdk node_version:24 path:packages/js/aitbc-sdk]) (push) Successful in 18s
python-tests / test-specific (push) Has been skipped
security-scanning / audit (push) Successful in 14s
systemd-sync / sync-systemd (push) Successful in 4s
package-tests / cross-language-compatibility (push) Successful in 2s
package-tests / package-integration-tests (push) Successful in 3s
Documentation Validation / validate-docs (push) Successful in 6m13s
python-tests / test (push) Successful in 14s

## 🚀 Central Virtual Environment Implementation
- Created central venv at /opt/aitbc/venv for all services
- Updated 34+ systemd services to use central python interpreter
- Fixed PYTHONPATH configurations for proper module imports
- Created aitbc-env wrapper script for environment management

## 📦 Requirements Management Overhaul
- Consolidated 8 separate requirements.txt files into central requirements.txt
- Added web3>=6.11.0 for blockchain functionality
- Created automated requirements migrator tool (scripts/requirements_migrator.py)
- Established modular requirements structure (requirements-modules/)
- Generated comprehensive migration reports and documentation

## 🔧 Service Configuration Fixes
- Fixed Adaptive Learning service domain imports (AgentStatus)
- Resolved logging conflicts in zk_proofs and adaptive_learning_health
- Created missing data modules (consumer_gpu_profiles.py)
- Updated CLI to version 0.2.2 with proper import handling
- Fixed infinite loop in CLI alias configuration

## 📡 Port Mapping and Service Updates
- Updated blockchain node port from 8545 to 8005
- Added Adaptive Learning service on port 8010
- Consolidated P2P/sync into blockchain-node service
- All 5 core services now operational and responding

## 📚 Documentation Enhancements
- Updated SYSTEMD_SERVICES.md for Debian root usage (no sudo)
- Added comprehensive VIRTUAL_ENVIRONMENT.md guide
- Created REQUIREMENTS_MERGE_SUMMARY.md with migration details
- Updated RUNTIME_DIRECTORIES.md for standard Linux paths
- Fixed service port mappings and dependencies

## 🛠️ CLI Improvements
- Fixed import errors and version display (0.2.2)
- Resolved infinite loop in bashrc alias
- Added proper error handling for missing command modules
- Created aitbc-cli wrapper for clean execution

##  Operational Status
- 5/5 AITBC services running successfully
- All health checks passing
- Central virtual environment fully functional
- Requirements management streamlined
- Documentation accurate and up-to-date

## 🎯 Technical Achievements
- Eliminated 7 redundant requirements.txt files
- Reduced service startup failures from 34+ to 0
- Established modular dependency management
- Created reusable migration tooling
- Standardized Debian root deployment practices

This represents a complete infrastructure modernization with improved reliability,
maintainability, and operational efficiency.
This commit is contained in:
aitbc1
2026-03-29 11:52:37 +02:00
parent 848162ae21
commit 3352d63f36
200 changed files with 2007 additions and 676 deletions

364
scripts/requirements_migrator.py Executable file
View File

@@ -0,0 +1,364 @@
#!/usr/bin/env python3
"""
AITBC Requirements Migration Tool
Core function to migrate service requirements to central and identify 3rd party modules
"""
import os
import sys
import re
from pathlib import Path
from typing import Dict, List, Set, Tuple
import argparse
class RequirementsMigrator:
"""Core requirements migration and analysis tool"""
def __init__(self, base_path: str = "/opt/aitbc"):
self.base_path = Path(base_path)
self.central_req = self.base_path / "requirements.txt"
self.central_packages = set()
self.migration_log = []
def load_central_requirements(self) -> Set[str]:
"""Load central requirements packages"""
if not self.central_req.exists():
print(f"❌ Central requirements not found: {self.central_req}")
return set()
packages = set()
with open(self.central_req, 'r') as f:
for line in f:
line = line.strip()
if line and not line.startswith('#'):
# Extract package name (before version specifier)
match = re.match(r'^([a-zA-Z0-9_-]+)', line)
if match:
packages.add(match.group(1))
self.central_packages = packages
print(f"✅ Loaded {len(packages)} packages from central requirements")
return packages
def find_requirements_files(self) -> List[Path]:
"""Find all requirements.txt files"""
files = []
for req_file in self.base_path.rglob("requirements.txt"):
if req_file != self.central_req:
files.append(req_file)
return files
def parse_requirements_file(self, file_path: Path) -> List[str]:
"""Parse individual requirements file"""
requirements = []
try:
with open(file_path, 'r') as f:
for line in f:
line = line.strip()
if line and not line.startswith('#'):
requirements.append(line)
except Exception as e:
print(f"❌ Error reading {file_path}: {e}")
return requirements
def analyze_coverage(self, file_path: Path, requirements: List[str]) -> Dict:
"""Analyze coverage of requirements by central packages"""
covered = []
not_covered = []
version_upgrades = []
if not requirements:
return {
'file': file_path,
'total': 0,
'covered': 0,
'not_covered': [],
'coverage_percent': 100.0,
'version_upgrades': []
}
for req in requirements:
# Extract package name
match = re.match(r'^([a-zA-Z0-9_-]+)([><=!]+.*)?', req)
if not match:
continue
package_name = match.group(1)
version_spec = match.group(2) or ""
if package_name in self.central_packages:
covered.append(req)
# Check for version upgrades
central_req = self._find_central_requirement(package_name)
if central_req and version_spec and central_req != version_spec:
version_upgrades.append({
'package': package_name,
'old_version': version_spec,
'new_version': central_req
})
else:
not_covered.append(req)
return {
'file': file_path,
'total': len(requirements),
'covered': len(covered),
'not_covered': not_covered,
'coverage_percent': (len(covered) / len(requirements) * 100) if requirements else 100.0,
'version_upgrades': version_upgrades
}
def _find_central_requirement(self, package_name: str) -> str:
"""Find requirement specification in central file"""
try:
with open(self.central_req, 'r') as f:
for line in f:
line = line.strip()
if line and not line.startswith('#'):
match = re.match(rf'^{re.escape(package_name)}([><=!]+.+)', line)
if match:
return match.group(1)
except:
pass
return ""
def categorize_uncovered(self, not_covered: List[str]) -> Dict[str, List[str]]:
"""Categorize uncovered requirements"""
categories = {
'core_infrastructure': [],
'ai_ml': [],
'blockchain': [],
'translation_nlp': [],
'monitoring': [],
'testing': [],
'security': [],
'utilities': [],
'other': []
}
# Package categorization mapping
category_map = {
# Core Infrastructure
'fastapi': 'core_infrastructure', 'uvicorn': 'core_infrastructure',
'sqlalchemy': 'core_infrastructure', 'pydantic': 'core_infrastructure',
'sqlmodel': 'core_infrastructure', 'alembic': 'core_infrastructure',
# AI/ML
'torch': 'ai_ml', 'tensorflow': 'ai_ml', 'numpy': 'ai_ml',
'pandas': 'ai_ml', 'scikit-learn': 'ai_ml', 'transformers': 'ai_ml',
'opencv-python': 'ai_ml', 'pillow': 'ai_ml', 'tenseal': 'ai_ml',
# Blockchain
'web3': 'blockchain', 'eth-utils': 'blockchain', 'eth-account': 'blockchain',
'cryptography': 'blockchain', 'ecdsa': 'blockchain', 'base58': 'blockchain',
# Translation/NLP
'openai': 'translation_nlp', 'google-cloud-translate': 'translation_nlp',
'deepl': 'translation_nlp', 'langdetect': 'translation_nlp',
'polyglot': 'translation_nlp', 'fasttext': 'translation_nlp',
'nltk': 'translation_nlp', 'spacy': 'translation_nlp',
# Monitoring
'prometheus-client': 'monitoring', 'structlog': 'monitoring',
'sentry-sdk': 'monitoring',
# Testing
'pytest': 'testing', 'pytest-asyncio': 'testing', 'pytest-mock': 'testing',
# Security
'python-jose': 'security', 'passlib': 'security', 'keyring': 'security',
# Utilities
'click': 'utilities', 'rich': 'utilities', 'typer': 'utilities',
'httpx': 'utilities', 'requests': 'utilities', 'aiohttp': 'utilities',
}
for req in not_covered:
package_name = re.match(r'^([a-zA-Z0-9_-]+)', req).group(1)
category = category_map.get(package_name, 'other')
categories[category].append(req)
return categories
def migrate_requirements(self, dry_run: bool = True) -> Dict:
"""Migrate requirements to central if fully covered"""
results = {
'migrated': [],
'kept': [],
'errors': []
}
self.load_central_requirements()
req_files = self.find_requirements_files()
for file_path in req_files:
try:
requirements = self.parse_requirements_file(file_path)
analysis = self.analyze_coverage(file_path, requirements)
if analysis['coverage_percent'] == 100:
if not dry_run:
file_path.unlink()
results['migrated'].append({
'file': str(file_path),
'packages': analysis['covered']
})
print(f"✅ Migrated: {file_path} ({len(analysis['covered'])} packages)")
else:
results['migrated'].append({
'file': str(file_path),
'packages': analysis['covered']
})
print(f"🔄 Would migrate: {file_path} ({len(analysis['covered'])} packages)")
else:
categories = self.categorize_uncovered(analysis['not_covered'])
results['kept'].append({
'file': str(file_path),
'coverage': analysis['coverage_percent'],
'not_covered': analysis['not_covered'],
'categories': categories
})
print(f"⚠️ Keep: {file_path} ({analysis['coverage_percent']:.1f}% covered)")
except Exception as e:
results['errors'].append({
'file': str(file_path),
'error': str(e)
})
print(f"❌ Error processing {file_path}: {e}")
return results
def generate_report(self, results: Dict) -> str:
"""Generate migration report"""
report = []
report.append("# AITBC Requirements Migration Report\n")
# Summary
report.append("## Summary")
report.append(f"- Files analyzed: {len(results['migrated']) + len(results['kept']) + len(results['errors'])}")
report.append(f"- Files migrated: {len(results['migrated'])}")
report.append(f"- Files kept: {len(results['kept'])}")
report.append(f"- Errors: {len(results['errors'])}\n")
# Migrated files
if results['migrated']:
report.append("## ✅ Migrated Files")
for item in results['migrated']:
packages = item['packages'] if isinstance(item['packages'], list) else []
report.append(f"- `{item['file']}` ({len(packages)} packages)")
report.append("")
# Kept files with analysis
if results['kept']:
report.append("## ⚠️ Files Kept (Specialized Dependencies)")
for item in results['kept']:
report.append(f"### `{item['file']}`")
report.append(f"- Coverage: {item['coverage']:.1f}%")
report.append(f"- Uncovered packages: {len(item['not_covered'])}")
for category, packages in item['categories'].items():
if packages:
report.append(f" - **{category.replace('_', ' ').title()}**: {len(packages)} packages")
for pkg in packages[:3]: # Show first 3
report.append(f" - `{pkg}`")
if len(packages) > 3:
report.append(f" - ... and {len(packages) - 3} more")
report.append("")
# Errors
if results['errors']:
report.append("## ❌ Errors")
for item in results['errors']:
report.append(f"- `{item['file']}`: {item['error']}")
report.append("")
return "\n".join(report)
def suggest_3rd_party_modules(self, results: Dict) -> Dict[str, List[str]]:
"""Suggest 3rd party module groupings"""
modules = {
'ai_ml_translation': [],
'blockchain_web3': [],
'monitoring_observability': [],
'testing_quality': [],
'security_compliance': []
}
for item in results['kept']:
categories = item['categories']
# AI/ML + Translation
ai_ml_packages = categories.get('ai_ml', []) + categories.get('translation_nlp', [])
if ai_ml_packages:
modules['ai_ml_translation'].extend([pkg.split('>=')[0] for pkg in ai_ml_packages])
# Blockchain
blockchain_packages = categories.get('blockchain', [])
if blockchain_packages:
modules['blockchain_web3'].extend([pkg.split('>=')[0] for pkg in blockchain_packages])
# Monitoring
monitoring_packages = categories.get('monitoring', [])
if monitoring_packages:
modules['monitoring_observability'].extend([pkg.split('>=')[0] for pkg in monitoring_packages])
# Testing
testing_packages = categories.get('testing', [])
if testing_packages:
modules['testing_quality'].extend([pkg.split('>=')[0] for pkg in testing_packages])
# Security
security_packages = categories.get('security', [])
if security_packages:
modules['security_compliance'].extend([pkg.split('>=')[0] for pkg in security_packages])
# Remove duplicates and sort
for key in modules:
modules[key] = sorted(list(set(modules[key])))
return modules
def main():
"""Main entry point"""
parser = argparse.ArgumentParser(description="AITBC Requirements Migration Tool")
parser.add_argument("--dry-run", action="store_true", help="Show what would be migrated without actually doing it")
parser.add_argument("--execute", action="store_true", help="Actually migrate files")
parser.add_argument("--base-path", default="/opt/aitbc", help="Base path for AITBC repository")
args = parser.parse_args()
if not args.dry_run and not args.execute:
print("Use --dry-run to preview or --execute to actually migrate")
return
migrator = RequirementsMigrator(args.base_path)
print("🔍 Analyzing AITBC requirements files...")
results = migrator.migrate_requirements(dry_run=not args.execute)
print("\n📊 Generating report...")
report = migrator.generate_report(results)
# Save report
report_file = Path(args.base_path) / "docs" / "REQUIREMENTS_MIGRATION_REPORT.md"
report_file.parent.mkdir(exist_ok=True)
with open(report_file, 'w') as f:
f.write(report)
print(f"📄 Report saved to: {report_file}")
# Suggest 3rd party modules
modules = migrator.suggest_3rd_party_modules(results)
print("\n🎯 Suggested 3rd Party Modules:")
for module_name, packages in modules.items():
if packages:
print(f"\n📦 {module_name.replace('_', ' ').title()}:")
for pkg in packages:
print(f" - {pkg}")
if __name__ == "__main__":
main()

View File

@@ -36,7 +36,7 @@ def deploy_to_container(container_name, genesis_config):
print(f"🧹 Clearing existing blockchain data on {container_name}...")
subprocess.run([
'ssh', container_name,
'sudo rm -f /opt/aitbc/data/chain.db'
'sudo rm -f /var/lib/aitbc/data/chain.db'
], check=False)
# Initialize new genesis

View File

@@ -41,7 +41,7 @@ def encrypt_private_key(private_key_hex: str, password: str) -> dict:
}
def create_keystore(address: str, password: str, keystore_dir: Path | str = "/opt/aitbc/keystore", force: bool = False) -> Path:
def create_keystore(address: str, password: str, keystore_dir: Path | str = "/var/lib/aitbc/keystore", force: bool = False) -> Path:
"""Create encrypted keystore file and return its path."""
keystore_dir = Path(keystore_dir)
keystore_dir.mkdir(parents=True, exist_ok=True)
@@ -66,7 +66,7 @@ def create_keystore(address: str, password: str, keystore_dir: Path | str = "/op
def main() -> None:
parser = argparse.ArgumentParser(description="Generate encrypted keystore for an account")
parser.add_argument("address", help="Account address (e.g., aitbc1treasury)")
parser.add_argument("--output-dir", type=Path, default=Path("/opt/aitbc/keystore"), help="Keystore directory")
parser.add_argument("--output-dir", type=Path, default=Path("/var/lib/aitbc/keystore"), help="Keystore directory")
parser.add_argument("--force", action="store_true", help="Overwrite existing keystore file")
parser.add_argument("--password", help="Encryption password (or read from KEYSTORE_PASSWORD / keystore/.password)")
args = parser.parse_args()
@@ -84,11 +84,11 @@ def main() -> None:
if not password:
password = os.getenv("KEYSTORE_PASSWORD")
if not password:
pw_file = Path("/opt/aitbc/keystore/.password")
pw_file = Path("/var/lib/aitbc/keystore/.password")
if pw_file.exists():
password = pw_file.read_text().strip()
if not password:
print("No password provided. Set KEYSTORE_PASSWORD, pass --password, or create /opt/aitbc/keystore/.password")
print("No password provided. Set KEYSTORE_PASSWORD, pass --password, or create /var/lib/aitbc/keystore/.password")
sys.exit(1)
print(f"Generating keystore for {args.address}...")

View File

@@ -13,9 +13,9 @@ from pathlib import Path
# Configuration
CHAIN_ID = "ait-mainnet"
DATA_DIR = Path("/opt/aitbc/data/ait-mainnet")
DATA_DIR = Path("/var/lib/aitbc/data/ait-mainnet")
DB_PATH = DATA_DIR / "chain.db"
KEYS_DIR = Path("/opt/aitbc/keystore")
KEYS_DIR = Path("/var/lib/aitbc/keystore")
# Check for proposer key in keystore
PROPOSER_KEY_FILE = KEYS_DIR / "aitbc1genesis.json"

View File

@@ -15,9 +15,9 @@ from pathlib import Path
# Configuration
CHAIN_ID = "ait-mainnet"
DATA_DIR = Path("/opt/aitbc/data/ait-mainnet")
DATA_DIR = Path("/var/lib/aitbc/data/ait-mainnet")
DB_PATH = DATA_DIR / "chain.db"
KEYS_DIR = Path("/opt/aitbc/keystore")
KEYS_DIR = Path("/var/lib/aitbc/keystore")
PASSWORD_FILE = KEYS_DIR / ".password"
NODE_VENV = Path("/opt/aitbc/apps/blockchain-node/.venv/bin/python")
NODE_ENV = Path("/opt/aitbc/apps/blockchain-node/.env")