feat: implement dynamic batch size for bulk sync
Some checks failed
Blockchain Synchronization Verification / sync-verification (push) Failing after 3s
Integration Tests / test-service-integration (push) Successful in 12s
Multi-Node Blockchain Health Monitoring / health-check (push) Successful in 2s
P2P Network Verification / p2p-verification (push) Successful in 2s
Python Tests / test-python (push) Successful in 9s
Security Scanning / security-scan (push) Successful in 32s
- Add _calculate_dynamic_batch_size method to sync.py
- Scale batch size based on gap size:
  - Small gaps (< 100): 20-50 blocks for precision
  - Medium gaps (100-500): 50-100 blocks
  - Large gaps (> 500): 100-200 blocks for speed
- Add min_bulk_sync_batch_size and max_bulk_sync_batch_size config params
- Update bulk_import_from to use dynamic batch size
- Replaces fixed 50-block batch size with adaptive sizing
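A minimal sketch of the gap-based scaling the commit message describes, assuming the min/max config values from the diff (20 and 200); the actual `_calculate_dynamic_batch_size` in sync.py may divide the gap differently, but any implementation should land in the stated ranges per tier:

```python
def calculate_dynamic_batch_size(gap_size: int,
                                 min_batch: int = 20,
                                 max_batch: int = 200) -> int:
    """Scale bulk-sync batch size with the block gap (hypothetical sketch).

    Small gaps get small batches for precision; large gaps get big
    batches for throughput, clamped to [min_batch, max_batch].
    """
    if gap_size < 100:
        # Small gaps: 20-50 blocks
        batch = min(max(min_batch, gap_size // 2), 50)
    elif gap_size <= 500:
        # Medium gaps: 50-100 blocks
        batch = min(max(50, gap_size // 5), 100)
    else:
        # Large gaps: 100-200 blocks
        batch = min(max(100, gap_size // 10), max_batch)
    return max(min_batch, min(batch, max_batch))
```

The divisors (`// 2`, `// 5`, `// 10`) are illustrative choices that keep each tier inside the ranges listed above while growing smoothly with the gap.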
@@ -74,6 +74,8 @@ class ChainSettings(BaseSettings):
     auto_sync_threshold: int = 10  # blocks gap threshold to trigger bulk sync
     auto_sync_max_retries: int = 3  # max retry attempts for automatic bulk sync
     min_bulk_sync_interval: int = 60  # minimum seconds between bulk sync attempts
+    min_bulk_sync_batch_size: int = 20  # minimum batch size for dynamic bulk sync
+    max_bulk_sync_batch_size: int = 200  # maximum batch size for dynamic bulk sync

     gossip_backend: str = "memory"
     gossip_broadcast_url: Optional[str] = None