All checks were successful
AITBC CLI Level 1 Commands Test / test-cli-level1 (push) Successful in 16s
api-endpoint-tests / test-api-endpoints (push) Successful in 35s
integration-tests / test-service-integration (push) Successful in 1m25s
package-tests / test-python-packages (map[name:aitbc-agent-sdk path:packages/py/aitbc-agent-sdk python_version:3.13]) (push) Successful in 16s
package-tests / test-python-packages (map[name:aitbc-cli path:. python_version:3.13]) (push) Successful in 14s
package-tests / test-python-packages (map[name:aitbc-core path:packages/py/aitbc-core python_version:3.13]) (push) Successful in 13s
package-tests / test-python-packages (map[name:aitbc-crypto path:packages/py/aitbc-crypto python_version:3.13]) (push) Successful in 10s
package-tests / test-python-packages (map[name:aitbc-sdk path:packages/py/aitbc-sdk python_version:3.13]) (push) Successful in 12s
package-tests / test-javascript-packages (map[name:aitbc-sdk node_version:24 path:packages/js/aitbc-sdk]) (push) Successful in 18s
python-tests / test-specific (push) Has been skipped
security-scanning / audit (push) Successful in 14s
systemd-sync / sync-systemd (push) Successful in 4s
package-tests / cross-language-compatibility (push) Successful in 2s
package-tests / package-integration-tests (push) Successful in 3s
Documentation Validation / validate-docs (push) Successful in 6m13s
python-tests / test (push) Successful in 14s
## 🚀 Central Virtual Environment Implementation

- Created central venv at /opt/aitbc/venv for all services
- Updated 34+ systemd services to use the central Python interpreter
- Fixed PYTHONPATH configurations for proper module imports
- Created aitbc-env wrapper script for environment management

## 📦 Requirements Management Overhaul

- Consolidated 8 separate requirements.txt files into a central requirements.txt
- Added web3>=6.11.0 for blockchain functionality
- Created automated requirements migrator tool (scripts/requirements_migrator.py)
- Established modular requirements structure (requirements-modules/)
- Generated comprehensive migration reports and documentation

## 🔧 Service Configuration Fixes

- Fixed Adaptive Learning service domain imports (AgentStatus)
- Resolved logging conflicts in zk_proofs and adaptive_learning_health
- Created missing data modules (consumer_gpu_profiles.py)
- Updated CLI to version 0.2.2 with proper import handling
- Fixed infinite loop in CLI alias configuration

## 📡 Port Mapping and Service Updates

- Updated blockchain node port from 8545 to 8005
- Added Adaptive Learning service on port 8010
- Consolidated P2P/sync into the blockchain-node service
- All 5 core services now operational and responding

## 📚 Documentation Enhancements

- Updated SYSTEMD_SERVICES.md for Debian root usage (no sudo)
- Added comprehensive VIRTUAL_ENVIRONMENT.md guide
- Created REQUIREMENTS_MERGE_SUMMARY.md with migration details
- Updated RUNTIME_DIRECTORIES.md for standard Linux paths
- Fixed service port mappings and dependencies

## 🛠️ CLI Improvements

- Fixed import errors and version display (0.2.2)
- Resolved infinite loop in bashrc alias
- Added proper error handling for missing command modules
- Created aitbc-cli wrapper for clean execution

## ✅ Operational Status

- 5/5 AITBC services running successfully
- All health checks passing
- Central virtual environment fully functional
- Requirements management streamlined
- Documentation accurate and up-to-date
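Moving the services onto a central venv usually comes down to pointing each unit's ExecStart at the shared interpreter. A minimal sketch of what such a unit might look like — the service name, module name, and PYTHONPATH value are illustrative assumptions; only the /opt/aitbc/venv path comes from the notes above:

```ini
# Hypothetical unit file sketch; adapt names and paths to the real service.
[Unit]
Description=AITBC Example Service
After=network.target

[Service]
# ExecStart uses the central venv's interpreter instead of a per-service venv
ExecStart=/opt/aitbc/venv/bin/python -m example_service
# PYTHONPATH set so service modules resolve against the deployment layout
Environment=PYTHONPATH=/opt/aitbc
Restart=on-failure

[Install]
WantedBy=multi-user.target
```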
## 🎯 Technical Achievements

- Eliminated 7 redundant requirements.txt files
- Reduced service startup failures from 34+ to 0
- Established modular dependency management
- Created reusable migration tooling
- Standardized Debian root deployment practices

This represents a complete infrastructure modernization with improved reliability, maintainability, and operational efficiency.
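The port remap described above (blockchain node moved to 8005, Adaptive Learning on 8010) can be spot-checked with a small TCP probe. A sketch — the service-to-port mapping below only lists the two ports the notes mention, and a real deployment would extend it to all five core services:

```python
import socket

# Ports taken from the notes above; other core services would be added here
SERVICE_PORTS = {"blockchain-node": 8005, "adaptive-learning": 8010}

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, port in SERVICE_PORTS.items():
    status = "up" if port_open("localhost", port) else "down"
    print(f"{name:20s} :{port} {status}")
```

This only proves a listener exists on the port; an HTTP request against each service's health endpoint would be the stronger check.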
104 lines
3.2 KiB
Python
Executable File
#!/usr/bin/env python3
"""Complete migration script for Coordinator API"""

import json
import sqlite3
from decimal import Decimal

import psycopg2
from psycopg2.extras import Json

# Database configurations
SQLITE_DB = "/var/lib/aitbc/data/coordinator.db"
PG_CONFIG = {
    "host": "localhost",
    "database": "aitbc_coordinator",
    "user": "aitbc_user",
    "password": "aitbc_password",
    "port": 5432,
}

# Columns that need type conversion before they reach PostgreSQL
JSON_COLUMNS = {'payload', 'constraints', 'result', 'receipt', 'capabilities',
                'extra_metadata', 'sla', 'attributes', 'metadata'}
DECIMAL_COLUMNS = {'balance', 'price', 'average_job_duration_ms'}


def migrate_all_data():
    """Migrate all data from SQLite to PostgreSQL"""

    print("\nStarting complete data migration...")

    # Connect to SQLite
    sqlite_conn = sqlite3.connect(SQLITE_DB)
    sqlite_conn.row_factory = sqlite3.Row
    sqlite_cursor = sqlite_conn.cursor()

    # Connect to PostgreSQL
    pg_conn = psycopg2.connect(**PG_CONFIG)
    pg_cursor = pg_conn.cursor()

    # Get all tables
    sqlite_cursor.execute("SELECT name FROM sqlite_master WHERE type='table'")
    tables = [row[0] for row in sqlite_cursor.fetchall()]

    for table_name in tables:
        if table_name == 'sqlite_sequence':
            continue  # SQLite's internal autoincrement bookkeeping table

        print(f"\nMigrating {table_name}...")

        # Get table schema
        sqlite_cursor.execute(f"PRAGMA table_info({table_name})")
        column_names = [col[1] for col in sqlite_cursor.fetchall()]

        # Get data
        sqlite_cursor.execute(f"SELECT * FROM {table_name}")
        rows = sqlite_cursor.fetchall()

        if not rows:
            print(f"  No data in {table_name}")
            continue

        # Build insert query; "user" is a reserved word in PostgreSQL and must be quoted
        target = f'"{table_name}"' if table_name == 'user' else table_name
        insert_sql = (
            f'INSERT INTO {target} ({", ".join(column_names)}) '
            f'VALUES ({", ".join(["%s"] * len(column_names))})'
        )

        # Insert data
        count = 0
        for row in rows:
            values = []
            for col_name, value in zip(column_names, row):
                # Handle special cases
                if col_name in JSON_COLUMNS and value:
                    if isinstance(value, str):
                        try:
                            # Json() lets psycopg2 adapt the parsed document to json/jsonb;
                            # a bare dict from json.loads() cannot be adapted directly
                            value = Json(json.loads(value))
                        except (json.JSONDecodeError, TypeError):
                            pass  # keep malformed JSON as plain text
                elif col_name in DECIMAL_COLUMNS and value is not None:
                    value = Decimal(str(value))
                values.append(value)

            # A savepoint isolates each row: one bad insert no longer aborts
            # the whole transaction and poisons every later execute()
            pg_cursor.execute("SAVEPOINT row_insert")
            try:
                pg_cursor.execute(insert_sql, values)
                pg_cursor.execute("RELEASE SAVEPOINT row_insert")
                count += 1
            except Exception as e:
                pg_cursor.execute("ROLLBACK TO SAVEPOINT row_insert")
                print(f"  Error inserting row: {e}")
                print(f"  Values: {values}")

        print(f"  Migrated {count} rows from {table_name}")

    pg_conn.commit()
    sqlite_conn.close()
    pg_conn.close()

    print("\n✅ Complete migration finished!")


if __name__ == "__main__":
    migrate_all_data()
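Most migration failures come from value conversion rather than connectivity, so the JSON/Decimal handling is worth exercising without either database. A sketch of how that logic can be isolated and unit-tested — `convert_value` and the trimmed-down column sets are illustrative helpers, not part of the script above:

```python
import json
from decimal import Decimal

# Trimmed-down column sets mirroring the script's special cases
JSON_COLUMNS = {'payload', 'metadata'}
DECIMAL_COLUMNS = {'balance', 'price'}

def convert_value(col_name, value):
    """Convert one SQLite value into a form suitable for a PostgreSQL insert."""
    if col_name in JSON_COLUMNS and isinstance(value, str) and value:
        try:
            # Parsed here for inspection; wrap with psycopg2.extras.Json
            # before handing the value to cursor.execute()
            return json.loads(value)
        except json.JSONDecodeError:
            return value  # leave malformed JSON as plain text
    if col_name in DECIMAL_COLUMNS and value is not None:
        return Decimal(str(value))  # avoid float rounding in numeric columns
    return value

row = {'payload': '{"gpu": "rtx4090"}', 'balance': 12.5, 'name': 'node-1'}
print({c: convert_value(c, v) for c, v in row.items()})
```

Keeping the conversion pure like this means the edge cases (malformed JSON, NULLs, float-to-Decimal) can be pinned down in tests before a live migration run.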