Plugin Analytics
Status
✅ Operational
Overview
Analytics plugin for collecting, processing, and analyzing data from various AITBC components and services.
Architecture
Core Components
- Data Collector: Collects data from services and plugins
- Data Processor: Processes and normalizes collected data
- Analytics Engine: Performs analytics and generates insights
- Report Generator: Generates reports and visualizations
- Storage Manager: Manages data storage and retention
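The components above form a simple collect → process → analyze pipeline. A schematic sketch of that flow (class names and record shapes here are illustrative, not the plugin's actual module API):

```python
class DataCollector:
    """Pulls raw records; in the real plugin this reads service metrics endpoints."""
    def collect(self):
        return [{"source": "system", "value": 1.5}]

class DataProcessor:
    """Normalizes collected records before analysis."""
    def process(self, records):
        return [dict(r, normalized=True) for r in records]

class AnalyticsEngine:
    """Derives insights from processed records."""
    def analyze(self, records):
        return {"count": len(records)}

# Wire the stages together end to end.
result = AnalyticsEngine().analyze(DataProcessor().process(DataCollector().collect()))
```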
Quick Start (End Users)
Prerequisites
- Python 3.13+
- PostgreSQL database
- Access to service metrics endpoints
Installation
cd /opt/aitbc/apps/plugin-analytics
.venv/bin/pip install -r requirements.txt
Configuration
Set environment variables in .env:
DATABASE_URL=postgresql://user:pass@localhost/analytics
COLLECTION_INTERVAL=300
RETENTION_DAYS=90
Running the Service
.venv/bin/python main.py
Developer Guide
Development Setup
- Clone the repository
- Create virtual environment:
  python -m venv .venv
- Install dependencies:
  pip install -r requirements.txt
- Set up database
- Configure data sources
- Run tests:
  pytest tests/
Project Structure
plugin-analytics/
├── src/
│ ├── data_collector/ # Data collection
│ ├── data_processor/ # Data processing
│ ├── analytics_engine/ # Analytics engine
│ ├── report_generator/ # Report generation
│ └── storage_manager/ # Storage management
├── tests/ # Test suite
└── pyproject.toml # Project configuration
Testing
# Run all tests
pytest tests/
# Run data collector tests
pytest tests/test_collector.py
# Run analytics engine tests
pytest tests/test_analytics.py
API Reference
Data Collection
Start Collection
POST /api/v1/analytics/collection/start
Content-Type: application/json
{
"data_source": "string",
"interval": 300
}
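A client call to this endpoint can be sketched as a small request builder. The base URL, port, and the `blockchain_metrics` source name are assumptions for illustration; adjust them to your deployment:

```python
import json

# Hypothetical base URL; change host/port to match your deployment.
BASE_URL = "http://localhost:8000"

def build_start_collection_request(data_source: str, interval: int = 300):
    """Build the URL and JSON body for a start-collection call."""
    if interval <= 0:
        raise ValueError("interval must be a positive number of seconds")
    url = f"{BASE_URL}/api/v1/analytics/collection/start"
    body = json.dumps({"data_source": data_source, "interval": interval})
    return url, body

url, body = build_start_collection_request("blockchain_metrics", 300)
```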
Stop Collection
POST /api/v1/analytics/collection/stop
Content-Type: application/json
{
"collection_id": "string"
}
Get Collection Status
GET /api/v1/analytics/collection/status
Analytics
Run Analysis
POST /api/v1/analytics/analyze
Content-Type: application/json
{
"analysis_type": "trend|anomaly|correlation",
"data_source": "string",
"time_range": "1h|1d|1w"
}
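The `time_range` tokens (`1h`, `1d`, `1w`, and `1m` in the reports API) can be parsed into concrete durations on the client side. A minimal sketch; note that `m` is read here as one month of ~30 days, which is an assumption worth confirming against the service:

```python
from datetime import timedelta

# Documented time_range units mapped to durations; "m" as ~30 days is assumed.
_UNITS = {
    "h": timedelta(hours=1),
    "d": timedelta(days=1),
    "w": timedelta(weeks=1),
    "m": timedelta(days=30),
}

def parse_time_range(token: str) -> timedelta:
    """Convert a token like '1h', '1d', '1w' into a timedelta."""
    count, unit = int(token[:-1]), token[-1]
    if unit not in _UNITS:
        raise ValueError(f"unknown time_range unit: {unit!r}")
    return count * _UNITS[unit]
```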
Get Analysis Results
GET /api/v1/analytics/results/{analysis_id}
Reports
Generate Report
POST /api/v1/analytics/reports/generate
Content-Type: application/json
{
"report_type": "summary|detailed|custom",
"data_source": "string",
"time_range": "1d|1w|1m"
}
Get Report
GET /api/v1/analytics/reports/{report_id}
List Reports
GET /api/v1/analytics/reports?limit=10
Data Management
Query Data
POST /api/v1/analytics/data/query
Content-Type: application/json
{
"data_source": "string",
"filters": {},
"time_range": "1h"
}
Export Data
POST /api/v1/analytics/data/export
Content-Type: application/json
{
"data_source": "string",
"format": "csv|json",
"time_range": "1d"
}
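The export endpoint supports `csv` and `json`. Client-side, the same two formats can be produced from queried rows; a minimal sketch, where the row shape is illustrative:

```python
import csv
import io
import json

def serialize_rows(rows, fmt):
    """Serialize a list of dict rows in one of the documented export formats."""
    if fmt == "json":
        return json.dumps(rows)
    if fmt == "csv":
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)
        return buf.getvalue()
    raise ValueError(f"unsupported format: {fmt!r}")
```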
Configuration
Environment Variables
- DATABASE_URL: PostgreSQL connection string
- COLLECTION_INTERVAL: Data collection interval (default: 300s)
- RETENTION_DAYS: Data retention period (default: 90 days)
- MAX_BATCH_SIZE: Maximum batch size for processing
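These variables can be read with the documented defaults applied. A minimal sketch; MAX_BATCH_SIZE has no documented default, so the 1000 used here is an assumption:

```python
import os

def load_settings(env=os.environ):
    """Read plugin settings from the environment, applying documented defaults.

    MAX_BATCH_SIZE's default of 1000 is an assumption, not documented.
    """
    return {
        "database_url": env.get("DATABASE_URL", ""),
        "collection_interval": int(env.get("COLLECTION_INTERVAL", "300")),
        "retention_days": int(env.get("RETENTION_DAYS", "90")),
        "max_batch_size": int(env.get("MAX_BATCH_SIZE", "1000")),
    }
```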
Data Sources
- Blockchain Metrics: Blockchain node metrics
- Exchange Data: Exchange trading data
- Agent Activity: Agent coordination data
- System Metrics: System performance metrics
Analysis Types
- Trend Analysis: Identify trends over time
- Anomaly Detection: Detect unusual patterns
- Correlation Analysis: Find correlations between metrics
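As a concrete illustration of anomaly detection, a z-score filter flags points that deviate sharply from the mean. This is a sketch only; the plugin's actual detector is not specified in this document:

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=3.0):
    """Return indices of points more than `threshold` std deviations from the mean.

    Illustrative only: the plugin's real anomaly detection algorithm
    is not documented here.
    """
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]
```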
Troubleshooting
Data not collecting: Check data source connectivity and configuration.
Analysis not running: Verify data availability and analysis parameters.
Report generation failed: Check data completeness and report configuration.
Storage full: Review retention policy and data growth rate.
Security Notes
- Secure database access credentials
- Implement data encryption at rest
- Validate all data inputs
- Implement access controls for sensitive data
- Regularly audit data access logs
- Comply with data retention policies