Merge pull request 'aitbc1/blockchain-production' (#39) from aitbc1/blockchain-production into main
Some checks failed
AITBC CI/CD Pipeline / lint-and-test (3.11) (push) Has been cancelled
AITBC CI/CD Pipeline / lint-and-test (3.12) (push) Has been cancelled
AITBC CI/CD Pipeline / lint-and-test (3.13) (push) Has been cancelled
AITBC CI/CD Pipeline / test-cli (push) Has been cancelled
AITBC CI/CD Pipeline / test-services (push) Has been cancelled
AITBC CI/CD Pipeline / test-production-services (push) Has been cancelled
AITBC CI/CD Pipeline / security-scan (push) Has been cancelled
AITBC CI/CD Pipeline / build (push) Has been cancelled
AITBC CI/CD Pipeline / deploy-staging (push) Has been cancelled
AITBC CI/CD Pipeline / deploy-production (push) Has been cancelled
AITBC CI/CD Pipeline / performance-test (push) Has been cancelled
AITBC CI/CD Pipeline / docs (push) Has been cancelled
AITBC CI/CD Pipeline / release (push) Has been cancelled
AITBC CI/CD Pipeline / notify (push) Has been cancelled
Security Scanning / Bandit Security Scan (apps/coordinator-api/src) (push) Has been cancelled
Security Scanning / Bandit Security Scan (cli/aitbc_cli) (push) Has been cancelled
Security Scanning / Bandit Security Scan (packages/py/aitbc-core/src) (push) Has been cancelled
Security Scanning / Bandit Security Scan (packages/py/aitbc-crypto/src) (push) Has been cancelled
Security Scanning / Bandit Security Scan (packages/py/aitbc-sdk/src) (push) Has been cancelled
Security Scanning / Bandit Security Scan (tests) (push) Has been cancelled
Security Scanning / CodeQL Security Analysis (javascript) (push) Has been cancelled
Security Scanning / CodeQL Security Analysis (python) (push) Has been cancelled
Security Scanning / Dependency Security Scan (push) Has been cancelled
Security Scanning / Container Security Scan (push) Has been cancelled
Security Scanning / OSSF Scorecard (push) Has been cancelled
Security Scanning / Security Summary Report (push) Has been cancelled
AITBC CLI Level 1 Commands Test / test-cli-level1 (3.11) (push) Has been cancelled
AITBC CLI Level 1 Commands Test / test-cli-level1 (3.12) (push) Has been cancelled
AITBC CLI Level 1 Commands Test / test-cli-level1 (3.13) (push) Has been cancelled
AITBC CLI Level 1 Commands Test / test-summary (push) Has been cancelled
Reviewed-on: #39
.gitignore (vendored) | 1
@@ -427,3 +427,4 @@ wallet*.json
 keystore/
 certificates/
 >>>>>>> Stashed changes
+.gitea_token.sh
README.md | 29
@@ -89,28 +89,30 @@ aitbc marketplace list --translate-to french
 
 ## 🔗 Blockchain Node (Brother Chain)
 
-A minimal asset-backed blockchain that validates compute receipts and mints AIT tokens.
+Production-ready blockchain with fixed supply and secure key management.
 
 ### ✅ Current Status
-- **Chain ID**: `ait-devnet`
+- **Chain ID**: `ait-mainnet` (production)
 - **Consensus**: Proof-of-Authority (single proposer)
-- **RPC Endpoint**: `http://localhost:8026/rpc`
-- **Health Check**: `http://localhost:8026/health`
-- **Metrics**: `http://localhost:8026/metrics` (Prometheus format)
-- **Status**: 🟢 Operational and fully functional
+- **RPC Endpoint**: `http://127.0.0.1:8026/rpc`
+- **Health Check**: `http://127.0.0.1:8026/health`
+- **Metrics**: `http://127.0.0.1:8026/metrics` (Prometheus format)
+- **Status**: 🟢 Operational with immutable supply, no admin minting
 
-### 🚀 Quick Launch
+### 🚀 Quick Launch (First Time)
 
 ```bash
 # 1. Generate keystore and genesis
 cd /opt/aitbc/apps/blockchain-node
 source .venv/bin/activate
-bash scripts/devnet_up.sh
+.venv/bin/python scripts/setup_production.py --chain-id ait-mainnet
+
+# 2. Start the node (production)
+bash scripts/mainnet_up.sh
 ```
 
 The node starts:
 - Proposer loop (block production)
-- RPC API on port 8026
-- Mock coordinator on port 8090 (for testing)
+- RPC API on `http://127.0.0.1:8026`
 
 ### 🛠️ CLI Interaction
 
@@ -123,11 +125,10 @@ aitbc blockchain head
 
 # Check balance
 aitbc blockchain balance --address <your-address>
-
-# Fund an address (devnet faucet)
-aitbc blockchain faucet --address <your-address> --amount 1000
 ```
 
+> **Note**: The devnet faucet (`aitbc blockchain faucet`) has been removed. All tokens are allocated at genesis to the `aitbc1genesis` wallet.
+
 For full documentation, see: [`apps/blockchain-node/README.md`](./apps/blockchain-node/README.md)
 
 ## 🤖 Agent-First Computing
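The RPC endpoints above can also be exercised without the CLI. A minimal stdlib-only sketch; the address `127.0.0.1:8026` and the paths are taken from the README, and the client simply returns `None` when no node is running:

```python
# Minimal client sketch for the node's RPC API.
# Assumes a node on 127.0.0.1:8026; returns None when it is unreachable.
import json
import urllib.request
import urllib.error

RPC_BASE = "http://127.0.0.1:8026"

def rpc_url(path: str) -> str:
    """Build a full URL for an RPC path such as /rpc/head."""
    return f"{RPC_BASE}{path}"

def rpc_get(path: str, timeout: float = 2.0):
    """GET a JSON document from the node, or None if it is not running."""
    try:
        with urllib.request.urlopen(rpc_url(path), timeout=timeout) as resp:
            return json.load(resp)
    except (OSError, ValueError):
        return None

if __name__ == "__main__":
    print("head:", rpc_get("/rpc/head"))
    print("health:", rpc_get("/health"))
```

`urllib.error.URLError` is a subclass of `OSError`, so the single `except` clause covers connection refusal, timeouts, and malformed responses alike.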
apps/blockchain-node/README.md
@@ -1,129 +1,165 @@
 # Blockchain Node (Brother Chain)
 
-Minimal asset-backed blockchain node that validates compute receipts and mints AIT tokens.
+Production-ready blockchain node for AITBC with fixed supply and secure key management.
 
 ## Status
 
-✅ **Operational** — Core blockchain functionality implemented and running.
+✅ **Operational** — Core blockchain functionality implemented.
 
 ### Capabilities
-- PoA consensus with single proposer (devnet)
+- PoA consensus with single proposer
 - Transaction processing (TRANSFER, RECEIPT_CLAIM)
 - Receipt validation and minting
 - Gossip-based peer-to-peer networking (in-memory backend)
 - RESTful RPC API (`/rpc/*`)
 - Prometheus metrics (`/metrics`)
 - Health check endpoint (`/health`)
 - SQLite persistence with Alembic migrations
+- Multi-chain support (separate data directories per chain ID)
 
-## Quickstart (Devnet)
+## Architecture
 
-The blockchain node is already set up with a virtualenv. To launch:
+### Wallets & Supply
+- **Fixed supply**: All tokens minted at genesis; no further minting.
+- **Two wallets**:
+  - `aitbc1genesis` (treasury): holds the full initial supply (default 1 B AIT). This is the **cold storage** wallet; its private key is encrypted in the keystore.
+  - `aitbc1treasury` (spending): operational wallet for transactions; initially zero balance. Can receive funds from the genesis wallet.
+- **Private keys** are stored in `keystore/*.json` using AES-256-GCM encryption. The password is stored in `keystore/.password` (mode 600).
+
+### Chain Configuration
+- **Chain ID**: `ait-mainnet` (production)
+- **Proposer**: The genesis wallet address is the block proposer and authority.
+- **Trusted proposers**: Only the genesis wallet is allowed to produce blocks.
+- **No admin endpoints**: The `/rpc/admin/mintFaucet` endpoint has been removed.
+
+## Quickstart (Production)
+
+### 1. Generate Production Keys & Genesis
+
+Run the setup script once to create the keystore, allocations, and genesis:
 
 ```bash
 cd /opt/aitbc/apps/blockchain-node
 source .venv/bin/activate
-bash scripts/devnet_up.sh
+.venv/bin/python scripts/setup_production.py --chain-id ait-mainnet
 ```
 
-This will:
-1. Generate genesis block at `data/devnet/genesis.json`
-2. Start the blockchain node proposer loop (PID logged)
-3. Start RPC API on `http://127.0.0.1:8026`
-4. Start mock coordinator on `http://127.0.0.1:8090`
+This creates:
+- `keystore/aitbc1genesis.json` (treasury wallet)
+- `keystore/aitbc1treasury.json` (spending wallet)
+- `keystore/.password` (random strong password)
+- `data/ait-mainnet/allocations.json`
+- `data/ait-mainnet/genesis.json`
 
-Press `Ctrl+C` to stop all processes.
+**Important**: Back up the keystore directory and the `.password` file securely. Loss of these means loss of funds.
 
-### Manual Startup
+### 2. Configure Environment
 
-If you prefer to start components separately:
+Copy the provided production environment file:
 
 ```bash
-# Terminal 1: Blockchain node
-cd /opt/aitbc/apps/blockchain-node
-source .venv/bin/activate
-PYTHONPATH=src python -m aitbc_chain.main
+cp .env.production .env
 ```
 
-# Terminal 2: RPC API
-cd /opt/aitbc/apps/blockchain-node
-source .venv/bin/activate
-PYTHONPATH=src uvicorn aitbc_chain.app:app --host 127.0.0.1 --port 8026
+Edit `.env` if you need to adjust ports or paths. Ensure `chain_id=ait-mainnet` and that `proposer_id` matches the genesis wallet address (the setup script sets it automatically in `.env.production`).
 
-# Terminal 3: Mock coordinator (optional, for testing)
+### 3. Start the Node
+
+Use the production launcher:
+
+```bash
+bash scripts/mainnet_up.sh
+```
+
+This starts:
+- Blockchain node (PoA proposer)
+- RPC API on `http://127.0.0.1:8026`
+
+Press `Ctrl+C` to stop both.
+
+### Manual Startup (Alternative)
+
+```bash
+cd /opt/aitbc/apps/blockchain-node
+source .venv/bin/activate
-PYTHONPATH=src uvicorn mock_coordinator:app --host 127.0.0.1 --port 8090
+source .env.production  # or export the variables manually
+# Terminal 1: Node
+.venv/bin/python -m aitbc_chain.main
+# Terminal 2: RPC
+.venv/bin/uvicorn aitbc_chain.app:app --host 127.0.0.1 --port 8026
 ```
 
 ## API Endpoints
 
-Once running, the RPC API is available at `http://127.0.0.1:8026/rpc`.
+RPC API available at `http://127.0.0.1:8026/rpc`.
 
-### Health & Metrics
-- `GET /health` — Health check with node info
-- `GET /metrics` — Prometheus-format metrics
-
-### Blockchain Queries
-- `GET /rpc/head` — Current chain head block
+### Blockchain
+- `GET /rpc/head` — Current chain head
 - `GET /rpc/blocks/{height}` — Get block by height
-- `GET /rpc/blocks-range?start=0&end=10` — Get block range
+- `GET /rpc/blocks-range?start=0&end=10` — Block range
 - `GET /rpc/info` — Chain information
-- `GET /rpc/supply` — Token supply info
-- `GET /rpc/validators` — List validators
+- `GET /rpc/supply` — Token supply (total & circulating)
+- `GET /rpc/validators` — List of authorities
 - `GET /rpc/state` — Full state dump
 
 ### Transactions
-- `POST /rpc/sendTx` — Submit transaction (JSON body: `TransactionRequest`)
+- `POST /rpc/sendTx` — Submit transaction (TRANSFER, RECEIPT_CLAIM)
 - `GET /rpc/transactions` — Latest transactions
 - `GET /rpc/tx/{tx_hash}` — Get transaction by hash
-- `POST /rpc/estimateFee` — Estimate fee for transaction type
 
 ### Receipts (Compute Proofs)
 - `POST /rpc/submitReceipt` — Submit receipt claim
 - `GET /rpc/receipts` — Latest receipts
 - `GET /rpc/receipts/{receipt_id}` — Get receipt by ID
+- `POST /rpc/estimateFee` — Estimate fee
 
 ### Accounts
 - `GET /rpc/getBalance/{address}` — Account balance
 - `GET /rpc/address/{address}` — Address details + txs
 - `GET /rpc/addresses` — List active addresses
 
-### Admin
-- `POST /rpc/admin/mintFaucet` — Mint devnet funds (requires admin key)
+### Health & Metrics
+- `GET /health` — Health check
+- `GET /metrics` — Prometheus metrics
 
-### Sync
-- `GET /rpc/syncStatus` — Chain sync status
+*Note: Admin endpoints (`/rpc/admin/*`) are disabled in production.*
 
-## CLI Integration
+## Multi-Chain Support
 
-Use the AITBC CLI to interact with the node:
+The node can run multiple chains simultaneously by setting `supported_chains` in `.env` as a comma-separated list (e.g., `ait-mainnet,ait-testnet`). Each chain must have its own `data/<chain_id>/genesis.json` and (optionally) its own keystore. The proposer identity is shared across chains; for multi-chain deployments you may want separate proposer wallets per chain.
+
+## Keystore Management
+
+### Encrypted Keystore Format
+- Uses the Web3 keystore format (AES-256-GCM + PBKDF2).
+- Password stored in `keystore/.password` (chmod 600).
+- Private keys are **never** stored in plaintext.
+
+### Changing the Password
 
 ```bash
-source /opt/aitbc/cli/venv/bin/activate
-aitbc blockchain status
-aitbc blockchain head
-aitbc blockchain balance --address <your-address>
-aitbc blockchain faucet --address <your-address> --amount 1000
+# Use the keystore.py script to re-encrypt with a new password
+.venv/bin/python scripts/keystore.py --name genesis --show --password <old> --new-password <new>
 ```
+
+(Not yet implemented; currently you must manually decrypt and re-encrypt.)
 
-## Configuration
-
-Edit `.env` in this directory to change:
-
-```
-CHAIN_ID=ait-devnet
-DB_PATH=./data/chain.db
-RPC_BIND_HOST=0.0.0.0
-RPC_BIND_PORT=8026
-P2P_BIND_HOST=0.0.0.0
-P2P_BIND_PORT=7070
-PROPOSER_KEY=proposer_key_<timestamp>
-MINT_PER_UNIT=1000
-COORDINATOR_RATIO=0.05
-GOSSIP_BACKEND=memory
-```
+### Adding a New Wallet
+
+```bash
+.venv/bin/python scripts/keystore.py --name mywallet --create
+```
+
+To give the new wallet a genesis allocation, add a matching entry to `allocations.json` and regenerate genesis.
 
-Restart the node after changes.
+## Genesis & Supply
+
+- The genesis file is generated by `scripts/make_genesis.py`.
+- Supply is fixed: the sum of `allocations[].balance`.
+- No tokens can be minted after genesis (`mint_per_unit=0`).
+- To change the allocation distribution, edit `allocations.json` and regenerate genesis (requires consensus to reset the chain).
+
+## Development / Devnet
+
+The old devnet (faucet model) has been removed. For local development, use the production setup with a throwaway keystore, or create a separate `ait-devnet` chain by providing your own `allocations.json` and running `scripts/make_genesis.py` manually.
+
+## Troubleshooting
+
+**Genesis missing**: Run `scripts/setup_production.py` first.
+
+**Proposer key not loaded**: Ensure `keystore/aitbc1genesis.json` exists and `keystore/.password` is readable. The node will log a warning but still run (block signing is disabled until implemented).
+
+**Port already in use**: Change `rpc_bind_port` in `.env` and restart.
+
+**Database locked**: Delete `data/ait-mainnet/chain.db` and restart (only if you're sure no other node is using it).
 
 ## Project Layout
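The fixed-supply rule (total supply equals the sum of `allocations[].balance`) is easy to sanity-check before regenerating genesis. A sketch; the allocations-file shape (a list of objects with `address` and `balance` fields) is inferred from the genesis `accounts` structure shown later in this PR:

```python
# Sanity-check a fixed-supply allocations list: total supply must equal
# the sum of allocation balances, and no entry may be negative.
def total_supply(allocations: list) -> int:
    if any(a["balance"] < 0 for a in allocations):
        raise ValueError("negative allocation balance")
    return sum(a["balance"] for a in allocations)

# Hypothetical two-wallet split matching the README's description.
example = [
    {"address": "aitbc1genesis", "balance": 1_000_000_000},
    {"address": "aitbc1treasury", "balance": 0},
]
print(total_supply(example))  # 1000000000
```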
@@ -138,32 +174,26 @@ blockchain-node/
 │   ├── gossip/       # P2P message bus
 │   ├── consensus/    # PoA proposer logic
 │   ├── rpc/          # RPC endpoints
 │   ├── contracts/    # Smart contract logic
 │   └── models.py     # SQLModel definitions
 ├── data/
-│   └── devnet/
-│       └── genesis.json        # Generated by make_genesis.py
+│   └── ait-mainnet/
+│       ├── genesis.json        # Generated by make_genesis.py
+│       └── chain.db            # SQLite database
+├── keystore/
+│   ├── aitbc1genesis.json
+│   ├── aitbc1treasury.json
+│   └── .password
 ├── scripts/
 │   ├── make_genesis.py         # Genesis generator
-│   ├── devnet_up.sh            # Devnet launcher
-│   └── keygen.py               # Keypair generator
-└── .env                        # Node configuration
+│   ├── setup_production.py     # One-time production setup
+│   ├── mainnet_up.sh           # Production launcher
+│   └── keystore.py             # Keystore utilities
+└── .env.production             # Production environment template
 ```
 
-## Notes
+## Security Notes
 
-- The node uses proof-of-authority (PoA) consensus with a single proposer for the devnet.
-- Transactions require a valid signature (ed25519) unless running in test mode.
-- Receipts represent compute work attestations and mint new AIT tokens to the miner.
-- Gossip backend defaults to in-memory; for multi-node networks, configure a Redis backend.
-- RPC API does not require authentication on devnet (add in production).
-
-## Troubleshooting
-
-**Port already in use:** Change `RPC_BIND_PORT` in `.env` and restart.
-
-**Database locked:** Ensure only one node instance is running; delete `data/chain.db` if corrupted.
-
-**No blocks proposed:** Check proposer logs; ensure `PROPOSER_KEY` is set and no other proposers are conflicting.
-
-**Mock coordinator not responding:** It's only needed for certain tests; the blockchain node can run standalone.
+- **Never** expose the RPC API to the public internet without authentication (production should add mTLS or API keys).
+- Keep keystore and password backups offline.
+- The node runs as the current user; ensure file permissions restrict access to the `keystore/` and `data/` directories.
+- In a multi-node network, use the Redis gossip backend and configure `trusted_proposers` with all authority addresses.
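The file-permission advice above takes two commands to apply. A sketch, demonstrated in a scratch directory; on a real node you would run the `chmod` lines from the blockchain-node directory instead:

```shell
# Restrict the keystore and data directories to the owning user only,
# matching the layout described in the README.
cd "$(mktemp -d)"                      # scratch directory for the demo
mkdir -p keystore data
printf 'example-password\n' > keystore/.password
chmod 700 keystore data                # owner-only directories
chmod 600 keystore/.password           # owner-only password file
stat -c '%a %n' keystore data keystore/.password
```

`stat -c '%a'` is the GNU coreutils spelling; on BSD/macOS the equivalent is `stat -f '%Lp'`.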
apps/blockchain-node/scripts/devnet_up.sh
@@ -2,13 +2,41 @@
 set -euo pipefail
 
 ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
+cd "$ROOT_DIR"
+VENV_PYTHON="$ROOT_DIR/.venv/bin/python"
+if [ ! -x "$VENV_PYTHON" ]; then
+    echo "[devnet] Virtualenv not found at $VENV_PYTHON. Please create it: python -m venv .venv && .venv/bin/pip install -r requirements.txt"
+    exit 1
+fi
 export PYTHONPATH="${ROOT_DIR}/src:${ROOT_DIR}/scripts:${PYTHONPATH:-}"
 
-GENESIS_PATH="${ROOT_DIR}/data/devnet/genesis.json"
-python "${ROOT_DIR}/scripts/make_genesis.py" --output "${GENESIS_PATH}" --force
+GENESIS_PATH="data/devnet/genesis.json"
+ALLOCATIONS_PATH="data/devnet/allocations.json"
+PROPOSER_ADDRESS="ait15v2cdlz5a3uy3wfurgh6m957kahnhhprdq7fy9m6eay05mvrv4jsyx4sks"
+"$VENV_PYTHON" "scripts/make_genesis.py" \
+    --output "$GENESIS_PATH" \
+    --force \
+    --allocations "$ALLOCATIONS_PATH" \
+    --authorities "$PROPOSER_ADDRESS" \
+    --chain-id "ait-devnet"
 
 echo "[devnet] Generated genesis at ${GENESIS_PATH}"
 
+# Set environment for devnet
+export chain_id="ait-devnet"
+export supported_chains="ait-devnet"
+export proposer_id="${PROPOSER_ADDRESS}"
+export mint_per_unit=0
+export coordinator_ratio=0.05
+export db_path="./data/${chain_id}/chain.db"
+export trusted_proposers="${PROPOSER_ADDRESS}"
+export gossip_backend="memory"
+
+# Optional: if you have a proposer private key for block signing (future), set PROPOSER_KEY
+# export PROPOSER_KEY="..."
+
+echo "[devnet] Environment configured: chain_id=${chain_id}, proposer_id=${proposer_id}"
+
 declare -a CHILD_PIDS=()
 cleanup() {
     for pid in "${CHILD_PIDS[@]}"; do
@@ -19,18 +47,19 @@ cleanup() {
 }
 trap cleanup EXIT
 
-python -m aitbc_chain.main &
+"$VENV_PYTHON" -m aitbc_chain.main &
 CHILD_PIDS+=($!)
 echo "[devnet] Blockchain node started (PID ${CHILD_PIDS[-1]})"
 
 sleep 1
 
-python -m uvicorn aitbc_chain.app:app --host 127.0.0.1 --port 8026 --log-level info &
+"$VENV_PYTHON" -m uvicorn aitbc_chain.app:app --host 127.0.0.1 --port 8026 --log-level info &
 CHILD_PIDS+=($!)
 echo "[devnet] RPC API serving at http://127.0.0.1:8026"
 
-python -m uvicorn mock_coordinator:app --host 127.0.0.1 --port 8090 --log-level info &
-CHILD_PIDS+=($!)
-echo "[devnet] Mock coordinator serving at http://127.0.0.1:8090"
+# Optional: mock coordinator for devnet only
+# "$VENV_PYTHON" -m uvicorn mock_coordinator:app --host 127.0.0.1 --port 8090 --log-level info &
+# CHILD_PIDS+=($!)
+# echo "[devnet] Mock coordinator serving at http://127.0.0.1:8090"
 
 wait
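The launcher's supervision pattern (collect child PIDs, kill any still-running children on exit) is worth seeing in isolation. A self-contained sketch of the same technique, with `sleep` standing in for the node and RPC processes:

```shell
#!/usr/bin/env bash
# Same background-process supervision pattern as devnet_up.sh:
# record each child's PID, and kill the survivors when the script exits.
set -euo pipefail

declare -a CHILD_PIDS=()
cleanup() {
    for pid in "${CHILD_PIDS[@]}"; do
        if kill -0 "$pid" 2>/dev/null; then   # is the child still alive?
            kill "$pid" 2>/dev/null || true
        fi
    done
}
trap cleanup EXIT

sleep 60 &              # stand-in for the blockchain node process
CHILD_PIDS+=($!)
sleep 60 &              # stand-in for the RPC server process
CHILD_PIDS+=($!)

echo "children: ${CHILD_PIDS[*]}"
# On exit (including Ctrl+C), the EXIT trap kills both children.
```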
apps/blockchain-node/scripts/keystore.py (new file) | 186
@@ -0,0 +1,186 @@
#!/usr/bin/env python3
"""
Production key management for AITBC blockchain.

Generates ed25519 keypairs and stores them in an encrypted JSON keystore
(Ethereum-style web3 keystore). Supports multiple wallets (treasury, proposer, etc.)

Usage:
    python keystore.py --name treasury --create --password <secret>
    python keystore.py --name proposer --create --password <secret>
    python keystore.py --name treasury --show
"""

from __future__ import annotations

import argparse
import json
import os
import sys
from pathlib import Path
from typing import Dict, Any, Optional

# Uses the cryptography library for ed25519 and encryption
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.backends import default_backend

# Address encoding: bech32m (HRP 'ait')
from bech32 import bech32_encode, convertbits


def generate_address(public_key_bytes: bytes) -> str:
    """Generate a bech32m address from a public key.

    1. Take SHA256 of the public key (produces 32 bytes)
    2. Convert to 5-bit groups (bech32)
    3. Encode with HRP 'ait'
    """
    digest = hashes.Hash(hashes.SHA256(), backend=default_backend())
    digest.update(public_key_bytes)
    hashed = digest.finalize()
    # Convert to 5-bit words for bech32
    data = convertbits(hashed, 8, 5, True)
    return bech32_encode("ait", data)


def encrypt_private_key(private_key_bytes: bytes, password: str, salt: bytes) -> Dict[str, Any]:
    """Encrypt a private key using AES-GCM, wrapped in a JSON keystore."""
    # Derive key from password using PBKDF2
    kdf = PBKDF2HMAC(
        algorithm=hashes.SHA256(),
        length=32,
        salt=salt,
        iterations=100_000,
        backend=default_backend()
    )
    key = kdf.derive(password.encode('utf-8'))

    # Encrypt with AES-GCM
    aesgcm = AESGCM(key)
    nonce = os.urandom(12)
    encrypted = aesgcm.encrypt(nonce, private_key_bytes, None)

    return {
        "crypto": {
            "cipher": "aes-256-gcm",
            "cipherparams": {"nonce": nonce.hex()},
            "ciphertext": encrypted.hex(),
            "kdf": "pbkdf2",
            "kdfparams": {
                "dklen": 32,
                "salt": salt.hex(),
                "c": 100_000,
                "prf": "hmac-sha256"
            },
            "mac": "TODO"  # In production, compute MAC over ciphertext and KDF params
        },
        "address": None,  # to be filled
        "keytype": "ed25519",
        "version": 1
    }


def generate_keypair(name: str, password: str, keystore_dir: Path) -> Dict[str, Any]:
    """Generate a new ed25519 keypair and store in keystore."""
    salt = os.urandom(32)
    private_key = ed25519.Ed25519PrivateKey.generate()
    public_key = private_key.public_key()
    private_bytes = private_key.private_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PrivateFormat.Raw,
        encryption_algorithm=serialization.NoEncryption()
    )
    public_bytes = public_key.public_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PublicFormat.Raw
    )
    address = generate_address(public_bytes)

    keystore = encrypt_private_key(private_bytes, password, salt)
    keystore["address"] = address

    keystore_file = keystore_dir / f"{name}.json"
    keystore_dir.mkdir(parents=True, exist_ok=True)
    with open(keystore_file, 'w') as f:
        json.dump(keystore, f, indent=2)
    os.chmod(keystore_file, 0o600)

    print(f"Generated {name} keypair")
    print(f"  Address: {address}")
    print(f"  Keystore: {keystore_file}")
    return keystore


def show_keyinfo(keystore_file: Path, password: str) -> None:
    """Decrypt and show key info (address, public key)."""
    with open(keystore_file) as f:
        data = json.load(f)

    # Derive key from password
    crypto = data["crypto"]
    kdfparams = crypto["kdfparams"]
    salt = bytes.fromhex(kdfparams["salt"])
    kdf = PBKDF2HMAC(
        algorithm=hashes.SHA256(),
        length=32,
        salt=salt,
        iterations=kdfparams["c"],
        backend=default_backend()
    )
    key = kdf.derive(password.encode('utf-8'))

    # Decrypt private key
    nonce = bytes.fromhex(crypto["cipherparams"]["nonce"])
    ciphertext = bytes.fromhex(crypto["ciphertext"])
    aesgcm = AESGCM(key)
    private_bytes = aesgcm.decrypt(nonce, ciphertext, None)
    private_key = ed25519.Ed25519PrivateKey.from_private_bytes(private_bytes)
    public_bytes = private_key.public_key().public_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PublicFormat.Raw
    )
    address = generate_address(public_bytes)

    print(f"Keystore: {keystore_file}")
    print(f"Address: {address}")
    print(f"Public key (hex): {public_bytes.hex()}")


def main():
    from getpass import getpass

    parser = argparse.ArgumentParser(description="Production keystore management")
    parser.add_argument("--name", required=True, help="Key name (e.g., treasury, proposer)")
    parser.add_argument("--create", action="store_true", help="Generate new keypair")
    parser.add_argument("--show", action="store_true", help="Show address/public key (prompt for password)")
    parser.add_argument("--password", help="Password (avoid using in CLI; prefer prompt or env)")
    parser.add_argument("--keystore-dir", type=Path, default=Path("/opt/aitbc/keystore"), help="Keystore directory")
    args = parser.parse_args()

    if args.create:
        pwd = args.password or os.getenv("KEYSTORE_PASSWORD") or getpass("New password: ")
        if not pwd:
            print("Password required")
            sys.exit(1)
        generate_keypair(args.name, pwd, args.keystore_dir)

    elif args.show:
        pwd = args.password or os.getenv("KEYSTORE_PASSWORD") or getpass("Password: ")
        if not pwd:
            print("Password required")
            sys.exit(1)
        keystore_file = args.keystore_dir / f"{args.name}.json"
        if not keystore_file.exists():
            print(f"Keystore not found: {keystore_file}")
            sys.exit(1)
        show_keyinfo(keystore_file, pwd)

    else:
        parser.print_help()


if __name__ == "__main__":
    main()
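The KDF step in `encrypt_private_key` (PBKDF2-HMAC-SHA256, 32-byte key, 100 000 iterations — the same values written into `kdfparams`) can be reproduced with the standard library alone, which is handy for verifying that a keystore's parameters round-trip deterministically:

```python
# Reproduce keystore.py's key derivation with the stdlib:
# PBKDF2-HMAC-SHA256, dklen=32, c=100_000.
import hashlib
import os

def derive_key(password: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """Derive a 32-byte AES key from a password and salt, as keystore.py does."""
    return hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations, dklen=32)

salt = os.urandom(32)
k1 = derive_key("correct horse", salt)
k2 = derive_key("correct horse", salt)
print(len(k1), k1 == k2)  # 32 True
```

The same password and salt always yield the same key, so decryption only needs the `kdfparams` stored alongside the ciphertext.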
apps/blockchain-node/scripts/mainnet_up.sh (new executable file) | 85
@@ -0,0 +1,85 @@
|
||||
#!/usr/bin/env bash
|
||||
set -euo pipefail
|
||||
|
||||
ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
|
||||
cd "$ROOT_DIR"
|
||||
VENV_PYTHON="$ROOT_DIR/.venv/bin/python"
|
||||
if [ ! -x "$VENV_PYTHON" ]; then
    echo "[mainnet] Virtualenv not found at $VENV_PYTHON. Please create it: python -m venv .venv && .venv/bin/pip install -r requirements.txt"
    exit 1
fi
export PYTHONPATH="${ROOT_DIR}/src:${ROOT_DIR}/scripts:${PYTHONPATH:-}"

# Load production environment
if [ -f ".env.production" ]; then
    set -a
    source .env.production
    set +a
fi

CHAIN_ID="${chain_id:-ait-mainnet}"
export chain_id="$CHAIN_ID"
export supported_chains="${supported_chains:-$CHAIN_ID}"

# Proposer ID: should be the genesis wallet address (from keystore/aitbc1genesis.json)
# If not set in env, derive from keystore
if [ -z "${proposer_id:-}" ]; then
    if [ -f "keystore/aitbc1genesis.json" ]; then
        PROPOSER_ID=$(grep -o '"address": "[^"]*"' keystore/aitbc1genesis.json | cut -d'"' -f4)
        if [ -n "$PROPOSER_ID" ]; then
            export proposer_id="$PROPOSER_ID"
        else
            echo "[mainnet] ERROR: Could not derive proposer_id from keystore. Set proposer_id in .env.production"
            exit 1
        fi
    else
        echo "[mainnet] ERROR: keystore/aitbc1genesis.json not found. Run setup_production.py first."
        exit 1
    fi
else
    export proposer_id
fi

# Ensure mint_per_unit=0 for fixed supply
export mint_per_unit=0
export coordinator_ratio=0.05
export db_path="./data/${CHAIN_ID}/chain.db"
export trusted_proposers="${trusted_proposers:-$proposer_id}"
export gossip_backend="${gossip_backend:-memory}"

# Optional: load proposer private key from keystore if block signing is implemented
# export PROPOSER_KEY="..."  # Not yet used; future feature

echo "[mainnet] Starting blockchain node for ${CHAIN_ID}"
echo "  proposer_id: $proposer_id"
echo "  db_path: $db_path"
echo "  gossip: $gossip_backend"

# Check that genesis exists
GENESIS_PATH="data/${CHAIN_ID}/genesis.json"
if [ ! -f "$GENESIS_PATH" ]; then
    echo "[mainnet] Genesis not found at $GENESIS_PATH. Run setup_production.py first."
    exit 1
fi

declare -a CHILD_PIDS=()
cleanup() {
    for pid in "${CHILD_PIDS[@]}"; do
        if kill -0 "$pid" 2>/dev/null; then
            kill "$pid" 2>/dev/null || true
        fi
    done
}
trap cleanup EXIT

"$VENV_PYTHON" -m aitbc_chain.main &
CHILD_PIDS+=($!)
echo "[mainnet] Blockchain node started (PID ${CHILD_PIDS[-1]})"

sleep 2

"$VENV_PYTHON" -m uvicorn aitbc_chain.app:app --host 127.0.0.1 --port 8026 --log-level info &
CHILD_PIDS+=($!)
echo "[mainnet] RPC API serving at http://127.0.0.1:8026"

wait
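Review note: the grep/cut pipeline used to derive `proposer_id` works only while the keystore JSON keeps `"address"` pretty-printed on its own line. A minimal JSON-aware sketch (hypothetical helper, not part of this PR) is more robust and could be invoked from the start script via `python -c`:

```python
import json

def keystore_address(path: str) -> str:
    """Read the top-level "address" field from a keystore JSON file.

    Unlike the grep/cut pipeline in the start script, this does not
    depend on how the file happens to be formatted.
    """
    with open(path) as f:
        return json.load(f)["address"]
```

Usage from shell would be something like `proposer_id=$(.venv/bin/python -c 'from derive import keystore_address; print(keystore_address("keystore/aitbc1genesis.json"))')`, assuming the helper lives in an importable module.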
@@ -1,5 +1,10 @@
 #!/usr/bin/env python3
-"""Generate a deterministic devnet genesis file for the blockchain node."""
+"""Generate a production-ready genesis file with fixed allocations.
+
+This replaces the old devnet faucet model. Genesis now defines a fixed
+initial coin supply allocated to specific addresses. No admin minting
+is allowed; the total supply is immutable after genesis.
+"""
 
 from __future__ import annotations
 
@@ -7,75 +12,79 @@ import argparse
 import json
 import time
 from pathlib import Path
+from typing import List, Dict, Any
 
-DEFAULT_GENESIS = {
-    "chain_id": "ait-devnet",
-    "timestamp": None,  # populated at runtime
-    "params": {
-        "mint_per_unit": 1000,
-        "coordinator_ratio": 0.05,
-        "base_fee": 10,
-        "fee_per_byte": 1,
-    },
-    "accounts": [
-        {
-            "address": "ait1faucet000000000000000000000000000000000",
-            "balance": 1_000_000_000,
-            "nonce": 0,
-        }
-    ],
-    "authorities": [
-        {
-            "address": "ait1devproposer000000000000000000000000000000",
-            "weight": 1,
-        }
-    ],
-}
+# Chain parameters - these are on-chain economic settings
+CHAIN_PARAMS = {
+    "mint_per_unit": 0,  # No new minting after genesis
+    "coordinator_ratio": 0.05,
+    "base_fee": 10,
+    "fee_per_byte": 1,
+}
 
 
 def parse_args() -> argparse.Namespace:
-    parser = argparse.ArgumentParser(description="Generate devnet genesis data")
+    parser = argparse.ArgumentParser(description="Generate production genesis data")
     parser.add_argument(
         "--output",
         type=Path,
         default=Path("data/devnet/genesis.json"),
-        help="Path to write the generated genesis file (default: data/devnet/genesis.json)",
+        help="Path to write the genesis file",
     )
     parser.add_argument(
         "--force",
         action="store_true",
-        help="Overwrite the genesis file if it already exists.",
+        help="Overwrite existing genesis file",
     )
     parser.add_argument(
-        "--faucet-address",
-        default="ait1faucet000000000000000000000000000000000",
-        help="Address seeded with devnet funds.",
-    )
-    parser.add_argument(
-        "--faucet-balance",
-        type=int,
-        default=1_000_000_000,
-        help="Faucet balance in smallest units.",
+        "--allocations",
+        type=Path,
+        required=True,
+        help="JSON file mapping addresses to initial balances (smallest units)",
     )
     parser.add_argument(
         "--authorities",
         nargs="*",
-        default=["ait1devproposer000000000000000000000000000000"],
-        help="Authority addresses included in the genesis file.",
+        required=True,
+        help="List of PoA authority addresses (proposer/validators)",
     )
+    parser.add_argument(
+        "--chain-id",
+        default="ait-devnet",
+        help="Chain ID (default: ait-devnet)",
+    )
     return parser.parse_args()
 
 
-def build_genesis(args: argparse.Namespace) -> dict:
-    genesis = json.loads(json.dumps(DEFAULT_GENESIS))  # deep copy via JSON
-    genesis["timestamp"] = int(time.time())
-    genesis["accounts"][0]["address"] = args.faucet_address
-    genesis["accounts"][0]["balance"] = args.faucet_balance
-    genesis["authorities"] = [
-        {"address": address, "weight": 1}
-        for address in args.authorities
-    ]
-    return genesis
+def load_allocations(path: Path) -> List[Dict[str, Any]]:
+    """Load address allocations from a JSON file.
+
+    Expected format:
+        [
+            {"address": "ait1...", "balance": 1000000000, "nonce": 0}
+        ]
+    """
+    with open(path) as f:
+        data = json.load(f)
+    if not isinstance(data, list):
+        raise ValueError("allocations must be a list of objects")
+    # Validate required fields
+    for item in data:
+        if "address" not in item or "balance" not in item:
+            raise ValueError(f"Allocation missing required fields: {item}")
+    return data
+
+
+def build_genesis(chain_id: str, allocations: List[Dict[str, Any]], authorities: List[str]) -> dict:
+    """Construct the genesis block specification."""
+    timestamp = int(time.time())
+    return {
+        "chain_id": chain_id,
+        "timestamp": timestamp,
+        "params": CHAIN_PARAMS.copy(),
+        "allocations": allocations,  # Renamed from 'accounts' to avoid confusion
+        "authorities": [
+            {"address": addr, "weight": 1} for addr in authorities
+        ],
+    }
 
 
 def write_genesis(path: Path, data: dict, force: bool) -> None:
@@ -88,8 +97,12 @@ def write_genesis(path: Path, data: dict, force: bool) -> None:
 
 def main() -> None:
     args = parse_args()
-    genesis = build_genesis(args)
+    allocations = load_allocations(args.allocations)
+    genesis = build_genesis(args.chain_id, allocations, args.authorities)
     write_genesis(args.output, genesis, args.force)
+    total = sum(a["balance"] for a in allocations)
+    print(f"[genesis] Total supply: {total} (fixed, no future minting)")
+    print("[genesis] IMPORTANT: Keep the private keys for these addresses secure!")
 
 
 if __name__ == "__main__":
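Review note: the new `--allocations` flag expects a JSON list of `{address, balance, nonce}` objects. A standalone sketch of the validation and supply arithmetic that `load_allocations` and `main` perform (the addresses below are hypothetical placeholders):

```python
import json

def total_supply(allocations: list) -> int:
    """Validate allocation entries and return the fixed genesis supply."""
    if not isinstance(allocations, list):
        raise ValueError("allocations must be a list of objects")
    for item in allocations:
        if "address" not in item or "balance" not in item:
            raise ValueError(f"Allocation missing required fields: {item}")
    return sum(int(a["balance"]) for a in allocations)

allocs = json.loads("""
[
  {"address": "ait1exampleaaaa", "balance": 900000000, "nonce": 0},
  {"address": "ait1examplebbbb", "balance": 100000000, "nonce": 0}
]
""")
print(total_supply(allocs))  # → 1000000000
```

Because `mint_per_unit` is 0, this sum is the permanent total supply; nothing after genesis can change it.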
apps/blockchain-node/scripts/setup_production.py (new file, 202 lines)
@@ -0,0 +1,202 @@
#!/usr/bin/env python3
"""
Production setup generator for AITBC blockchain.
Creates two wallets:
- aitbc1genesis: Genesis wallet holding all initial supply (1B AIT)
- aitbc1treasury: Spending wallet (for transactions, can receive from genesis)

No admin minting; fixed supply at genesis.
"""

from __future__ import annotations

import argparse
import json
import os
import secrets
import string
from pathlib import Path

from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.hazmat.primitives import serialization, hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.backends import default_backend

from bech32 import bech32_encode, convertbits


def random_password(length: int = 32) -> str:
    """Generate a strong random password."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return ''.join(secrets.choice(alphabet) for _ in range(length))


def generate_address(public_key_bytes: bytes) -> str:
    """Bech32m address with HRP 'ait'."""
    digest = hashes.Hash(hashes.SHA256(), backend=default_backend())
    digest.update(public_key_bytes)
    hashed = digest.finalize()
    data = convertbits(hashed, 8, 5, True)
    return bech32_encode("ait", data)


def encrypt_private_key(private_bytes: bytes, password: str, salt: bytes) -> dict:
    """Web3-style keystore encryption (AES-GCM + PBKDF2)."""
    kdf = PBKDF2HMAC(
        algorithm=hashes.SHA256(),
        length=32,
        salt=salt,
        iterations=100_000,
        backend=default_backend()
    )
    key = kdf.derive(password.encode('utf-8'))

    aesgcm = AESGCM(key)
    nonce = os.urandom(12)
    ciphertext = aesgcm.encrypt(nonce, private_bytes, None)

    return {
        "crypto": {
            "cipher": "aes-256-gcm",
            "cipherparams": {"nonce": nonce.hex()},
            "ciphertext": ciphertext.hex(),
            "kdf": "pbkdf2",
            "kdfparams": {
                "dklen": 32,
                "salt": salt.hex(),
                "c": 100_000,
                "prf": "hmac-sha256"
            },
            "mac": "TODO"  # In production, compute proper MAC
        },
        "address": None,
        "keytype": "ed25519",
        "version": 1
    }


def generate_wallet(name: str, password: str, keystore_dir: Path) -> dict:
    """Generate ed25519 keypair and return wallet info."""
    private_key = ed25519.Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    private_bytes = private_key.private_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PrivateFormat.Raw,
        encryption_algorithm=serialization.NoEncryption()
    )
    public_bytes = public_key.public_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PublicFormat.Raw
    )
    address = generate_address(public_bytes)

    salt = os.urandom(32)
    keystore = encrypt_private_key(private_bytes, password, salt)
    keystore["address"] = address

    keystore_file = keystore_dir / f"{name}.json"
    with open(keystore_file, 'w') as f:
        json.dump(keystore, f, indent=2)
    os.chmod(keystore_file, 0o600)

    return {
        "name": name,
        "address": address,
        "keystore_file": str(keystore_file),
        "public_key_hex": public_bytes.hex()
    }


def main():
    parser = argparse.ArgumentParser(description="Production blockchain setup")
    parser.add_argument("--base-dir", type=Path, default=Path("/opt/aitbc/apps/blockchain-node"),
                        help="Blockchain node base directory")
    parser.add_argument("--chain-id", default="ait-mainnet", help="Chain ID")
    parser.add_argument("--total-supply", type=int, default=1_000_000_000,
                        help="Total token supply (smallest units)")
    args = parser.parse_args()

    base_dir = args.base_dir
    keystore_dir = base_dir / "keystore"
    data_dir = base_dir / "data" / args.chain_id

    keystore_dir.mkdir(parents=True, exist_ok=True)
    data_dir.mkdir(parents=True, exist_ok=True)

    # Generate strong random password and save it
    password = random_password(32)
    password_file = keystore_dir / ".password"
    with open(password_file, 'w') as f:
        f.write(password + "\n")
    os.chmod(password_file, 0o600)

    print(f"[setup] Generated keystore password and saved to {password_file}")

    # Generate two wallets
    wallets = []
    for suffix in ["genesis", "treasury"]:
        name = f"aitbc1{suffix}"
        info = generate_wallet(name, password, keystore_dir)
        # Store both the full name and suffix for lookup
        info['suffix'] = suffix
        wallets.append(info)
        print(f"[setup] Created wallet: {name}")
        print(f"  Address: {info['address']}")
        print(f"  Keystore: {info['keystore_file']}")

    # Create allocations: all supply to genesis wallet, treasury gets 0 (for spending from genesis)
    genesis_wallet = next(w for w in wallets if w['suffix'] == 'genesis')
    treasury_wallet = next(w for w in wallets if w['suffix'] == 'treasury')
    allocations = [
        {
            "address": genesis_wallet["address"],
            "balance": args.total_supply,
            "nonce": 0
        },
        {
            "address": treasury_wallet["address"],
            "balance": 0,
            "nonce": 0
        }
    ]

    allocations_file = data_dir / "allocations.json"
    with open(allocations_file, 'w') as f:
        json.dump(allocations, f, indent=2)
    print(f"[setup] Wrote allocations to {allocations_file}")

    # Create genesis.json via make_genesis script
    import subprocess
    genesis_file = data_dir / "genesis.json"
    python_exec = base_dir / ".venv" / "bin" / "python"
    if not python_exec.exists():
        python_exec = "python3"  # fallback
    result = subprocess.run([
        str(python_exec), str(base_dir / "scripts" / "make_genesis.py"),
        "--output", str(genesis_file),
        "--force",
        "--allocations", str(allocations_file),
        "--authorities", genesis_wallet["address"],
        "--chain-id", args.chain_id
    ], capture_output=True, text=True, cwd=str(base_dir))
    if result.returncode != 0:
        print(f"[setup] Genesis generation failed: {result.stderr}")
        return 1
    print(f"[setup] Created genesis file at {genesis_file}")
    print(result.stdout.strip())

    print("\n[setup] Production setup complete!")
    print(f"  Chain ID: {args.chain_id}")
    print(f"  Total supply: {args.total_supply} (fixed)")
    print(f"  Genesis wallet: {genesis_wallet['address']}")
    print(f"  Treasury wallet: {treasury_wallet['address']}")
    print(f"  Keystore password: stored in {password_file}")
    print("\n[IMPORTANT] Keep the keystore files and password secure!")

    return 0


if __name__ == "__main__":
    exit(main())
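Review note: the keystore stores its KDF parameters (`salt`, `c=100_000`, HMAC-SHA256, `dklen=32`) next to the ciphertext, so the AES key can be re-derived deterministically at load time. The derivation can be sketched with the standard library alone (`hashlib.pbkdf2_hmac` instead of the `cryptography` KDF class used in the script; same parameters, same output):

```python
import hashlib
import os

def derive_key(password: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """PBKDF2-HMAC-SHA256 with dklen=32, matching the keystore's kdfparams."""
    return hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations, dklen=32)

salt = os.urandom(32)
k1 = derive_key("correct horse", salt)
k2 = derive_key("correct horse", salt)
assert k1 == k2 and len(k1) == 32  # deterministic given the same password and salt
```

This is why only the password and the keystore file need to survive: the key itself is never stored.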
@@ -16,7 +16,7 @@ from .mempool import init_mempool
 from .metrics import metrics_registry
 from .rpc.router import router as rpc_router
 from .rpc.websocket import router as websocket_router
-from .escrow_routes import router as escrow_router
+# from .escrow_routes import router as escrow_router  # Not yet implemented
 
 _app_logger = get_logger("aitbc_chain.app")
 
@@ -132,7 +132,7 @@ def create_app() -> FastAPI:
     # Include routers
     app.include_router(rpc_router, prefix="/rpc", tags=["rpc"])
     app.include_router(websocket_router, prefix="/rpc")
-    app.include_router(escrow_router, prefix="/rpc")
+    # app.include_router(escrow_router, prefix="/rpc")  # Disabled until escrow routes are implemented
 
     # Metrics and health endpoints
     metrics_router = APIRouter()
@@ -31,7 +31,7 @@ class ChainSettings(BaseSettings):
     proposer_id: str = "ait-devnet-proposer"
     proposer_key: Optional[str] = None
 
-    mint_per_unit: int = 1000
+    mint_per_unit: int = 0  # No new minting after genesis for production
     coordinator_ratio: float = 0.05
 
     block_time_seconds: int = 2
@@ -58,5 +58,9 @@ class ChainSettings(BaseSettings):
     gossip_backend: str = "memory"
     gossip_broadcast_url: Optional[str] = None
 
+    # Keystore for proposer private key (future block signing)
+    keystore_path: Path = Path("./keystore")
+    keystore_password_file: Path = Path("./keystore/.password")
+
 
 settings = ChainSettings()
@@ -1,7 +1,9 @@
 import asyncio
 import hashlib
+import json
 import re
 from datetime import datetime
+from pathlib import Path
 from typing import Callable, ContextManager, Optional
 
 from sqlmodel import Session, select
@@ -9,7 +11,7 @@ from sqlmodel import Session, select
 from ..logger import get_logger
 from ..metrics import metrics_registry
 from ..config import ProposerConfig
-from ..models import Block
+from ..models import Block, Account
 from ..gossip import gossip_broker
 
 _METRIC_KEY_SANITIZE = re.compile(r"[^a-zA-Z0-9_]")
@@ -199,14 +201,17 @@ class PoAProposer:
             height=0,
             hash=block_hash,
             parent_hash="0x00",
-            proposer="genesis",
+            proposer=self._config.proposer_id,  # Use configured proposer as genesis proposer
             timestamp=timestamp,
             tx_count=0,
             state_root=None,
         )
         session.add(genesis)
         session.commit()
 
+        # Initialize accounts from genesis allocations file (if present)
+        await self._initialize_genesis_allocations(session)
+
         # Broadcast genesis block for initial sync
         await gossip_broker.publish(
             "blocks",
@@ -222,6 +227,33 @@ class PoAProposer:
             }
         )
 
+    async def _initialize_genesis_allocations(self, session: Session) -> None:
+        """Create Account entries from the genesis allocations file."""
+        # Look for genesis file relative to project root: data/{chain_id}/genesis.json
+        # Alternatively, use a path from config (future improvement)
+        genesis_path = Path(f"./data/{self._config.chain_id}/genesis.json")
+        if not genesis_path.exists():
+            self._logger.warning("Genesis allocations file not found; skipping account initialization", extra={"path": str(genesis_path)})
+            return
+
+        with open(genesis_path) as f:
+            genesis_data = json.load(f)
+
+        allocations = genesis_data.get("allocations", [])
+        created = 0
+        for alloc in allocations:
+            addr = alloc["address"]
+            balance = int(alloc["balance"])
+            nonce = int(alloc.get("nonce", 0))
+            # Check if account already exists (idempotent)
+            acct = session.get(Account, (self._config.chain_id, addr))
+            if acct is None:
+                acct = Account(chain_id=self._config.chain_id, address=addr, balance=balance, nonce=nonce)
+                session.add(acct)
+                created += 1
+        session.commit()
+        self._logger.info("Initialized genesis accounts", extra={"count": created, "total": len(allocations)})
+
     def _fetch_chain_head(self) -> Optional[Block]:
         with self._session_factory() as session:
             return session.exec(select(Block).order_by(Block.height.desc()).limit(1)).first()
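Review note: `_initialize_genesis_allocations` has to be safe to run on every startup, so it only creates accounts that do not exist yet. The idempotency logic can be checked in isolation with a plain dict standing in for the sqlmodel session (names below are illustrative, not from this PR):

```python
def init_allocations(store: dict, chain_id: str, allocations: list) -> int:
    """Create missing (chain_id, address) accounts; return how many were created."""
    created = 0
    for alloc in allocations:
        key = (chain_id, alloc["address"])
        if key not in store:  # idempotent: never overwrite an existing account
            store[key] = {"balance": int(alloc["balance"]), "nonce": int(alloc.get("nonce", 0))}
            created += 1
    return created

store = {}
allocs = [{"address": "ait1a", "balance": 10}, {"address": "ait1b", "balance": 5}]
assert init_allocations(store, "ait-mainnet", allocs) == 2
assert init_allocations(store, "ait-mainnet", allocs) == 0  # second run is a no-op
```

Restarting the node therefore never double-credits the genesis balances.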
@@ -1,7 +1,10 @@
 from __future__ import annotations
 
 import asyncio
+import json
+import os
 from contextlib import asynccontextmanager
+from pathlib import Path
 from typing import Optional
 
 from .config import settings
@@ -14,6 +17,73 @@ from .mempool import init_mempool
 
 logger = get_logger(__name__)
 
+
+def _load_keystore_password() -> str:
+    """Load keystore password from file or environment."""
+    pwd_file = settings.keystore_password_file
+    if pwd_file.exists():
+        return pwd_file.read_text().strip()
+    env_pwd = os.getenv("KEYSTORE_PASSWORD")
+    if env_pwd:
+        return env_pwd
+    raise RuntimeError(f"Keystore password not found. Set in {pwd_file} or KEYSTORE_PASSWORD env.")
+
+
+def _load_private_key_from_keystore(keystore_dir: Path, password: str, target_address: Optional[str] = None) -> Optional[bytes]:
+    """Load an ed25519 private key from the keystore.
+
+    If target_address is given, find the keystore file with matching address.
+    Otherwise, return the first key found.
+    """
+    if not keystore_dir.exists():
+        return None
+    for kf in keystore_dir.glob("*.json"):
+        try:
+            with open(kf) as f:
+                data = json.load(f)
+            addr = data.get("address")
+            if target_address and addr != target_address:
+                continue
+            # Decrypt
+            from cryptography.hazmat.primitives.asymmetric import ed25519
+            from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
+            from cryptography.hazmat.primitives import hashes
+            from cryptography.hazmat.primitives.ciphers.aead import AESGCM
+            from cryptography.hazmat.backends import default_backend
+
+            crypto = data["crypto"]
+            kdfparams = crypto["kdfparams"]
+            salt = bytes.fromhex(kdfparams["salt"])
+            kdf = PBKDF2HMAC(
+                algorithm=hashes.SHA256(),
+                length=32,
+                salt=salt,
+                iterations=kdfparams["c"],
+                backend=default_backend()
+            )
+            key = kdf.derive(password.encode('utf-8'))
+            nonce = bytes.fromhex(crypto["cipherparams"]["nonce"])
+            ciphertext = bytes.fromhex(crypto["ciphertext"])
+            aesgcm = AESGCM(key)
+            private_bytes = aesgcm.decrypt(nonce, ciphertext, None)
+            # Verify it's ed25519
+            priv_key = ed25519.Ed25519PrivateKey.from_private_bytes(private_bytes)
+            return private_bytes
+        except Exception:
+            continue
+    return None
+
+
+# Attempt to load proposer private key from keystore if not set
+if not settings.proposer_key:
+    try:
+        pwd = _load_keystore_password()
+        key_bytes = _load_private_key_from_keystore(settings.keystore_path, pwd, target_address=settings.proposer_id)
+        if key_bytes:
+            # Encode as hex for easy storage; not yet used for signing
+            settings.proposer_key = key_bytes.hex()
+            logger.info("Loaded proposer private key from keystore", extra={"proposer_id": settings.proposer_id})
+        else:
+            logger.warning("Proposer private key not found in keystore; block signing disabled", extra={"proposer_id": settings.proposer_id})
+    except Exception as e:
+        logger.warning("Failed to load proposer key from keystore", extra={"error": str(e)})
+
 
 class BlockchainNode:
     def __init__(self) -> None:
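Review note: `_load_keystore_password` resolves the secret with a fixed precedence: the password file wins, then the `KEYSTORE_PASSWORD` environment variable, else an error. The resolution order in isolation (the injectable `environ` parameter is added here for testability; the PR's version reads `os.getenv` directly):

```python
import os
from pathlib import Path

def load_password(pwd_file: Path, environ=os.environ) -> str:
    """Password file first, then KEYSTORE_PASSWORD env var, else raise."""
    if pwd_file.exists():
        return pwd_file.read_text().strip()
    env_pwd = environ.get("KEYSTORE_PASSWORD")
    if env_pwd:
        return env_pwd
    raise RuntimeError(f"Keystore password not found. Set in {pwd_file} or KEYSTORE_PASSWORD env.")
```

File-over-environment precedence means a stale `KEYSTORE_PASSWORD` in the shell cannot silently override the provisioned `.password` file.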
@@ -4,6 +4,7 @@ from sqlalchemy import func
 import asyncio
 import json
 import time
+from pathlib import Path
 from typing import Any, Dict, Optional
 
 from fastapi import APIRouter, HTTPException, status
@@ -61,11 +62,6 @@ class EstimateFeeRequest(BaseModel):
     payload: Dict[str, Any] = Field(default_factory=dict)
 
 
-class MintFaucetRequest(BaseModel):
-    address: str
-    amount: int = Field(gt=0)
-
-
 @router.get("/head", summary="Get current chain head")
 async def get_head(chain_id: str = "ait-devnet") -> Dict[str, Any]:
     metrics_registry.increment("rpc_get_head_total")
@@ -530,24 +526,6 @@ async def estimate_fee(request: EstimateFeeRequest) -> Dict[str, Any]:
     }
 
 
-@router.post("/admin/mintFaucet", summary="Mint devnet funds to an address")
-async def mint_faucet(request: MintFaucetRequest, chain_id: str = "ait-devnet") -> Dict[str, Any]:
-    metrics_registry.increment("rpc_mint_faucet_total")
-    start = time.perf_counter()
-    with session_scope() as session:
-        account = session.get(Account, (chain_id, request.address))
-        if account is None:
-            account = Account(chain_id=chain_id, address=request.address, balance=request.amount)
-            session.add(account)
-        else:
-            account.balance += request.amount
-        session.commit()
-        updated_balance = account.balance
-    metrics_registry.increment("rpc_mint_faucet_success_total")
-    metrics_registry.observe("rpc_mint_faucet_duration_seconds", time.perf_counter() - start)
-    return {"address": request.address, "balance": updated_balance}
-
-
 class ImportBlockRequest(BaseModel):
     height: int
     hash: str
@@ -663,15 +641,27 @@ async def get_token_supply(chain_id: str = "ait-devnet") -> Dict[str, Any]:
     start = time.perf_counter()
 
     with session_scope() as session:
-        # Simple implementation for now
+        # Sum balances of all accounts in this chain
+        result = session.exec(select(func.sum(Account.balance)).where(Account.chain_id == chain_id)).one_or_none()
+        circulating = int(result) if result is not None else 0
+
+    # Total supply is read from genesis (fixed), or fallback to circulating if unavailable
+    # Try to locate genesis file
+    genesis_path = Path(f"./data/{chain_id}/genesis.json")
+    total_supply = circulating  # default fallback
+    if genesis_path.exists():
+        try:
+            with open(genesis_path) as f:
+                g = json.load(f)
+            total_supply = sum(a["balance"] for a in g.get("allocations", []))
+        except Exception:
+            total_supply = circulating
 
     response = {
         "chain_id": chain_id,
-        "total_supply": 1000000000,  # 1 billion from genesis
-        "circulating_supply": 0,  # No transactions yet
-        "faucet_balance": 1000000000,  # All tokens in faucet
-        "faucet_address": "ait1faucet000000000000000000000000000000000",
+        "total_supply": total_supply,
+        "circulating_supply": circulating,
+        "mint_per_unit": cfg.mint_per_unit,
         "total_accounts": 0
     }
 
     metrics_registry.observe("rpc_supply_duration_seconds", time.perf_counter() - start)
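Review note: the endpoint now distinguishes a fixed `total_supply` (the sum of genesis allocations) from `circulating_supply` (the sum of live account balances). With `mint_per_unit=0` the two should always agree, since transfers only move balance between accounts. A minimal conservation check (addresses and balances are hypothetical):

```python
genesis = {
    "allocations": [
        {"address": "ait1genesis", "balance": 1_000_000_000},
        {"address": "ait1treasury", "balance": 0},
    ]
}
# Account balances after some transfers from genesis to treasury
accounts = {"ait1genesis": 400_000_000, "ait1treasury": 600_000_000}

total_supply = sum(a["balance"] for a in genesis["allocations"])
circulating = sum(accounts.values())
assert total_supply == circulating == 1_000_000_000  # transfers conserve supply
```

A divergence between the two figures would indicate a bug (or a deliberate burn), which makes this endpoint a useful invariant check for monitoring.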
@@ -682,30 +672,35 @@ async def get_token_supply(chain_id: str = "ait-devnet") -> Dict[str, Any]:
 async def get_validators(chain_id: str = "ait-devnet") -> Dict[str, Any]:
     """List blockchain validators (authorities)"""
+    from ..config import settings as cfg
 
     metrics_registry.increment("rpc_validators_total")
     start = time.perf_counter()
 
-    # For PoA chain, validators are the authorities from genesis
-    # In a full implementation, this would query the actual validator set
+    # Build validator set from trusted_proposers config (comma-separated)
+    trusted = [p.strip() for p in cfg.trusted_proposers.split(",") if p.strip()]
+    if not trusted:
+        # Fallback to the node's own proposer_id as the sole validator
+        trusted = [cfg.proposer_id]
 
     validators = [
         {
-            "address": "ait1devproposer000000000000000000000000000000",
+            "address": addr,
             "weight": 1,
             "status": "active",
-            "last_block_height": None,  # Would be populated from actual validator tracking
+            "last_block_height": None,  # Could be populated from metrics
            "total_blocks_produced": None
         }
+        for addr in trusted
     ]
 
     response = {
         "chain_id": chain_id,
         "validators": validators,
         "total_validators": len(validators),
-        "consensus_type": "PoA",  # Proof of Authority
+        "consensus_type": "PoA",
         "proposer_id": cfg.proposer_id
     }
 
     metrics_registry.observe("rpc_validators_duration_seconds", time.perf_counter() - start)
     return response
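Review note: `trusted_proposers` is a comma-separated string; the endpoint splits it, drops empty entries, and falls back to the node's own `proposer_id` when the result is empty. That parsing in isolation (hypothetical addresses):

```python
def validator_set(trusted_proposers: str, proposer_id: str) -> list:
    """Split the comma-separated config value; fall back to the local proposer."""
    trusted = [p.strip() for p in trusted_proposers.split(",") if p.strip()]
    return trusted or [proposer_id]

print(validator_set("ait1a, ait1b,,", "ait1self"))  # → ['ait1a', 'ait1b']
print(validator_set("", "ait1self"))                # → ['ait1self']
```

Stray whitespace and trailing commas in `.env.production` are therefore harmless, and an unset value still yields a single-validator PoA set.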
@@ -469,6 +469,6 @@ def create_app() -> FastAPI:
 
 app = create_app()
 
-# Register jobs router
-from .routers import jobs as jobs_router
-app.include_router(jobs_router.router)
+# Register jobs router (disabled - legacy)
+# from .routers import jobs as jobs_router
+# app.include_router(jobs_router.router)
@@ -11,11 +11,12 @@ router = APIRouter(tags=["blockchain"])
 async def blockchain_status():
     """Get blockchain status."""
     try:
         # Try to get blockchain status from RPC
         import httpx
+        from ..config import settings
 
+        rpc_url = settings.blockchain_rpc_url.rstrip('/')
         async with httpx.AsyncClient() as client:
-            response = await client.get("http://localhost:8003/rpc/head", timeout=5.0)
+            response = await client.get(f"{rpc_url}/rpc/head", timeout=5.0)
             if response.status_code == 200:
                 data = response.json()
                 return {
@@ -42,11 +43,12 @@ async def blockchain_status():
 async def blockchain_sync_status():
     """Get blockchain synchronization status."""
     try:
         # Try to get sync status from RPC
         import httpx
+        from ..config import settings
 
+        rpc_url = settings.blockchain_rpc_url.rstrip('/')
         async with httpx.AsyncClient() as client:
-            response = await client.get("http://localhost:8003/rpc/sync", timeout=5.0)
+            response = await client.get(f"{rpc_url}/rpc/sync", timeout=5.0)
             if response.status_code == 200:
                 data = response.json()
                 return {
@@ -1004,28 +1004,6 @@ def balance(ctx, address, chain_id, all_chains):
     except Exception as e:
         error(f"Network error: {e}")
 
-@blockchain.command()
-@click.option('--address', required=True, help='Wallet address')
-@click.option('--amount', type=int, default=1000, help='Amount to mint')
-@click.pass_context
-def faucet(ctx, address, amount):
-    """Mint devnet funds to an address"""
-    config = ctx.obj['config']
-    try:
-        import httpx
-        with httpx.Client() as client:
-            response = client.post(
-                f"{_get_node_endpoint(ctx)}/rpc/admin/mintFaucet",
-                json={"address": address, "amount": amount, "chain_id": "ait-devnet"},
-                timeout=5
-            )
-            if response.status_code in (200, 201):
-                output(response.json(), ctx.obj['output_format'])
-            else:
-                error(f"Failed to use faucet: {response.status_code} - {response.text}")
-    except Exception as e:
-        error(f"Network error: {e}")
-
-
 @blockchain.command()
 @click.option('--chain', required=True, help='Chain ID to verify (e.g., ait-mainnet, ait-devnet)')
dev/scripts/dev_heartbeat.py (new executable file, 149 lines)
@@ -0,0 +1,149 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Dev Heartbeat: Periodic checks for /opt/aitbc development environment.
|
||||
Outputs concise markdown summary. Exit 0 if clean, 1 if issues detected.
|
||||
"""
|
||||
import os
|
||||
import subprocess
|
||||
import sys
|
||||
from datetime import datetime, timedelta
|
||||
from pathlib import Path
|
||||
|
||||
REPO_ROOT = Path("/opt/aitbc")
|
||||
LOGS_DIR = REPO_ROOT / "logs"
|
||||
|
||||
def sh(cmd, cwd=REPO_ROOT):
|
||||
"""Run shell command, return (returncode, stdout)."""
|
||||
result = subprocess.run(cmd, shell=True, cwd=cwd, capture_output=True, text=True)
|
||||
return result.returncode, result.stdout.strip()
|
||||
|
||||
def check_git_status():
|
||||
"""Return summary of uncommitted changes."""
|
||||
rc, out = sh("git status --porcelain")
|
||||
if rc != 0 or not out:
|
||||
return None
|
||||
lines = out.splitlines()
|
||||
changed = len(lines)
|
||||
# categorize simply
|
||||
modified = sum(1 for l in lines if l.startswith(' M') or l.startswith('M '))
|
||||
added = sum(1 for l in lines if l.startswith('A '))
|
||||
deleted = sum(1 for l in lines if l.startswith(' D') or l.startswith('D '))
|
||||
return {"changed": changed, "modified": modified, "added": added, "deleted": deleted, "preview": lines[:10]}
|
||||
|
||||
def check_build_tests():
|
||||
"""Quick build and test health check."""
|
||||
checks = []
|
||||
# 1) Poetry check (dependency resolution)
|
||||
rc, out = sh("poetry check")
|
||||
checks.append(("poetry check", rc == 0, out))
|
||||
# 2) Fast syntax check of CLI package
|
||||
rc, out = sh("python -m py_compile cli/aitbc_cli/__main__.py")
|
||||
checks.append(("cli syntax", rc == 0, out if rc != 0 else "OK"))
|
||||
# 3) Minimal test run (dry-run or 1 quick test)
|
||||
rc, out = sh("python -m pytest tests/ -v --collect-only 2>&1 | head -20")
|
||||
tests_ok = rc == 0
|
||||
checks.append(("test discovery", tests_ok, out if not tests_ok else f"Collected {out.count('test') if 'test' in out else '?'} tests"))
|
||||
all_ok = all(ok for _, ok, _ in checks)
|
||||
return {"all_ok": all_ok, "details": checks}


def check_logs_errors(hours=1):
    """Scan for ERROR/WARNING entries in log files modified within the last N hours."""
    if not LOGS_DIR.exists():
        return None
    errors = []
    warnings = []
    cutoff = datetime.now() - timedelta(hours=hours)
    for logfile in LOGS_DIR.glob("*.log"):
        try:
            mtime = datetime.fromtimestamp(logfile.stat().st_mtime)
            if mtime < cutoff:
                continue
            with open(logfile) as f:
                for line in f:
                    if "ERROR" in line or "FATAL" in line:
                        errors.append(f"{logfile.name}: {line.strip()[:120]}")
                    elif "WARN" in line:
                        warnings.append(f"{logfile.name}: {line.strip()[:120]}")
        except Exception:
            continue
    return {"errors": errors[:20], "warnings": warnings[:20], "total_errors": len(errors), "total_warnings": len(warnings)}


def check_dependencies():
    """Check for outdated packages via poetry."""
    rc, out = sh("poetry show --outdated --no-interaction")
    if rc != 0 or not out:
        return []
    # parse package lines, skipping the two header rows
    packages = []
    for line in out.splitlines()[2:]:
        parts = line.split()
        if len(parts) >= 3:
            packages.append({"name": parts[0], "current": parts[1], "latest": parts[2]})
    return packages
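The column parsing above can be exercised against canned output; note that some Poetry versions print no header rows at all, so the hardcoded `[2:]` skip is worth verifying against your Poetry version. A standalone sketch of the row parsing (`parse_outdated` is a hypothetical helper and the sample text is illustrative):

```python
def parse_outdated(output):
    """Parse 'name current latest description...' rows into dicts, skipping malformed lines."""
    packages = []
    for line in output.splitlines():
        parts = line.split()
        if len(parts) >= 3:
            packages.append({"name": parts[0], "current": parts[1], "latest": parts[2]})
    return packages

sample = ("httpx 0.24.1 0.27.0 A next generation HTTP client\n"
          "pydantic 1.10.9 2.7.1 Data validation library")
print(parse_outdated(sample))
```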


def main():
    report = []
    issues = 0

    # Git
    git = check_git_status()
    if git and git["changed"] > 0:
        issues += 1
        report.append(f"### Git: {git['changed']} uncommitted changes\n")
        if git["preview"]:
            report.append("```\n" + "\n".join(git["preview"]) + "\n```")
    else:
        report.append("### Git: clean")

    # Build/Tests
    bt = check_build_tests()
    if not bt["all_ok"]:
        issues += 1
        report.append("### Build/Tests: problems detected\n")
        for label, ok, msg in bt["details"]:
            status = "OK" if ok else "FAIL"
            report.append(f"- **{label}**: {status}")
            if not ok and msg:
                report.append(f"  ```\n{msg}\n```")
    else:
        report.append("### Build/Tests: OK")

    # Logs
    logs = check_logs_errors()
    if logs and logs["total_errors"] > 0:
        issues += 1
        report.append(f"### Logs: {logs['total_errors']} recent errors (last hour)\n")
        for e in logs["errors"][:10]:
            report.append(f"- `{e}`")
        if logs["total_errors"] > 10:
            report.append(f"... and {logs['total_errors'] - 10} more")
    elif logs and logs["total_warnings"] > 0:
        # warnings are non-blocking but included in the report
        report.append(f"### Logs: {logs['total_warnings']} recent warnings (last hour)")
    else:
        report.append("### Logs: no recent errors")

    # Dependencies
    outdated = check_dependencies()
    if outdated:
        issues += 1
        report.append(f"### Dependencies: {len(outdated)} outdated packages\n")
        for pkg in outdated[:10]:
            report.append(f"- {pkg['name']}: {pkg['current']} → {pkg['latest']}")
        if len(outdated) > 10:
            report.append(f"... and {len(outdated) - 10} more")
    else:
        report.append("### Dependencies: up to date")

    # Final output (use UTC so the timestamp matches its label)
    header = f"# Dev Heartbeat — {datetime.utcnow().strftime('%Y-%m-%d %H:%M UTC')}\n\n"
    summary = f"**Issues:** {issues}\n\n" if issues > 0 else "**Status:** All checks passed.\n\n"
    full_report = header + summary + "\n".join(report)

    print(full_report)

    # Exit code signals whether issues were found
    sys.exit(1 if issues > 0 else 0)


if __name__ == "__main__":
    main()
27	dev/scripts/generate_production_keys.py	Normal file
@@ -0,0 +1,27 @@
#!/usr/bin/env python3
import secrets
import string
import json
import os


def random_string(length=32):
    alphabet = string.ascii_letters + string.digits
    return ''.join(secrets.choice(alphabet) for _ in range(length))


def generate_production_keys():
    client_key = f"client_prod_key_{random_string(24)}"
    miner_key = f"miner_prod_key_{random_string(24)}"
    admin_key = f"admin_prod_key_{random_string(24)}"
    hmac_secret = random_string(64)
    jwt_secret = random_string(64)
    return {
        "CLIENT_API_KEYS": [client_key],
        "MINER_API_KEYS": [miner_key],
        "ADMIN_API_KEYS": [admin_key],
        "HMAC_SECRET": hmac_secret,
        "JWT_SECRET": jwt_secret
    }


if __name__ == "__main__":
    keys = generate_production_keys()
    print(json.dumps(keys, indent=2))
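The script prints the keys as JSON; a deployment typically wants them as environment-variable lines instead. A minimal sketch of rendering the same dict as `.env`-style lines, assuming list-valued keys are joined with commas (`to_env_lines` is a hypothetical helper, not part of the script):

```python
import secrets
import string

def random_string(length=32):
    alphabet = string.ascii_letters + string.digits
    return ''.join(secrets.choice(alphabet) for _ in range(length))

def to_env_lines(keys):
    """Render the generated keys dict as NAME=value lines; lists join with commas."""
    lines = []
    for name, value in keys.items():
        if isinstance(value, list):
            value = ",".join(value)
        lines.append(f"{name}={value}")
    return lines

keys = {
    "CLIENT_API_KEYS": [f"client_prod_key_{random_string(24)}"],
    "HMAC_SECRET": random_string(64),
}
print("\n".join(to_env_lines(keys)))
```

If written to disk, the output file should be created with restrictive permissions (e.g. `0600`), since these are production secrets.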
44	dev/scripts/security_scan.py	Executable file
@@ -0,0 +1,44 @@
#!/usr/bin/env python3
"""
Security vulnerability scanner for AITBC dependencies.

Uses pip-audit to check installed packages in the CLI virtualenv.
"""
import subprocess
import json
import sys

PIP_AUDIT = '/opt/aitbc/cli/venv/bin/pip-audit'


def run_audit():
    try:
        result = subprocess.run([PIP_AUDIT, '--format', 'json'],
                                capture_output=True, text=True, timeout=300)
        if result.returncode not in (0, 1):  # 0 means clean, 1 means vulns found
            return f"❌ pip-audit execution failed (exit {result.returncode}):\n{result.stderr}"
        data = json.loads(result.stdout) if result.stdout else {}
        # pip-audit's JSON output is a "dependencies" list; each entry carries
        # its own "vulns" (pip-audit does not report a top-level "vulnerabilities"
        # key, nor per-vuln severity)
        deps = data.get('dependencies', []) if isinstance(data, dict) else data
        vulnerable = [d for d in deps if d.get('vulns')]
        if not vulnerable:
            return "✅ Security scan: No known vulnerabilities in installed packages."
        total = sum(len(d['vulns']) for d in vulnerable)
        lines = [f"🚨 Security scan: {total} vulnerability(ies) across {len(vulnerable)} package(s):"]
        # Show the first few vulnerable packages with their advisory IDs
        for d in vulnerable[:3]:
            ids = ', '.join(v.get('id', 'unknown') for v in d['vulns'])
            lines.append(f"- {d.get('name', 'unknown')} {d.get('version', '')}: {ids}")
        if len(vulnerable) > 3:
            lines.append(f"... and {len(vulnerable) - 3} more package(s)")
        return "\n".join(lines)
    except Exception as e:
        return f"❌ Error during security scan: {e}"


if __name__ == '__main__':
    message = run_audit()
    print(message)
    # Always exit 0: this script reports, it does not gate CI
    sys.exit(0)
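For reference, the scan logic can be exercised against a canned payload in the shape recent pip-audit versions emit (a `dependencies` list whose entries carry `vulns`). This shape is an assumption about the installed pip-audit version, so pin and verify it; the advisory data below is illustrative only:

```python
import json

# Canned payload mimicking `pip-audit --format json` output (assumed shape,
# illustrative data only; verify against your pip-audit version)
payload = json.loads("""
{"dependencies": [
  {"name": "requests", "version": "2.19.0",
   "vulns": [{"id": "PYSEC-2018-28", "fix_versions": ["2.20.0"]}]},
  {"name": "idna", "version": "3.7", "vulns": []}
]}
""")

# Same extraction the scanner performs: keep only entries with vulns
vulnerable = [d for d in payload["dependencies"] if d["vulns"]]
summary = [f"{d['name']} {d['version']}: {', '.join(v['id'] for v in d['vulns'])}"
           for d in vulnerable]
print(summary)
```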
@@ -2,11 +2,12 @@
 """
 Task Claim System for AITBC agents.
 Uses Git branch atomic creation as a distributed lock to prevent duplicate work.
+Now with TTL/lease: claims expire after 2 hours to prevent stale locks.
 """
 import os
 import json
 import subprocess
-from datetime import datetime
+from datetime import datetime, timezone

 REPO_DIR = '/opt/aitbc'
 STATE_FILE = '/opt/aitbc/.claim-state.json'
@@ -16,6 +17,7 @@ MY_AGENT = os.getenv('AGENT_NAME', 'aitbc1')
 ISSUE_LABELS = ['security', 'bug', 'feature', 'refactor', 'task']  # priority order
 BONUS_LABELS = ['good-first-task-for-agent']
 AVOID_LABELS = ['needs-design', 'blocked', 'needs-reproduction']
+CLAIM_TTL_SECONDS = 7200  # 2 hours lease

 def query_api(path, method='GET', data=None):
     url = f"{API_BASE}/{path}"
@@ -88,6 +90,24 @@ def claim_issue(issue_number):
     result = subprocess.run(['git', 'push', 'origin', branch_name], capture_output=True, text=True, cwd=REPO_DIR)
     return result.returncode == 0

+def is_claim_stale(claim_branch):
+    """Check if a claim branch is older than TTL (stale lock)."""
+    try:
+        result = subprocess.run(['git', 'ls-remote', '--heads', 'origin', claim_branch],
+                                capture_output=True, text=True, cwd=REPO_DIR)
+        if result.returncode != 0 or not result.stdout.strip():
+            return True  # branch missing, treat as stale
+        # Optional: could check commit timestamp via git show -s --format=%ct <sha>
+        # For simplicity, we rely on state-file expiration instead
+        return False
+    except Exception:
+        return True
+
+def cleanup_stale_claim(claim_branch):
+    """Delete a stale claim branch from the remote."""
+    subprocess.run(['git', 'push', 'origin', '--delete', claim_branch],
+                   capture_output=True, cwd=REPO_DIR)
+
 def assign_issue(issue_number, assignee):
     data = {"assignee": assignee}
     return query_api(f'repos/oib/aitbc/issues/{issue_number}/assignees', method='POST', data=data)
@@ -105,17 +125,35 @@ def create_work_branch(issue_number, title):
     return branch_name

 def main():
-    now = datetime.utcnow().isoformat() + 'Z'
-    print(f"[{now}] Claim task cycle starting...")
+    now = datetime.utcnow().replace(tzinfo=timezone.utc)
+    now_iso = now.isoformat()
+    now_ts = now.timestamp()
+    print(f"[{now_iso}] Claim task cycle starting...")

     state = load_state()
     current_claim = state.get('current_claim')

+    # Check if our own claim expired
+    if current_claim:
+        claimed_at = state.get('claimed_at')
+        expires_at = state.get('expires_at')
+        if expires_at and now_ts > expires_at:
+            print(f"Claim for issue #{current_claim} has expired (claimed at {claimed_at}). Releasing.")
+            # Delete the claim branch and clear state
+            claim_branch = state.get('claim_branch')
+            if claim_branch:
+                cleanup_stale_claim(claim_branch)
+            state = {}
+            save_state(state)
+            current_claim = None
+
     if current_claim:
         print(f"Already working on issue #{current_claim} (branch {state.get('work_branch')})")
         # Optional: could check if that PR has been merged/closed and release claim here
         return

+    # Optional global cleanup: delete any stale claim branches (older than TTL)
+    cleanup_global_stale_claims(now_ts)
+
     issues = get_open_unassigned_issues()
     if not issues:
         print("No unassigned issues available.")
@@ -126,25 +164,70 @@ def main():
         title = issue['title']
         labels = [lbl['name'] for lbl in issue.get('labels', [])]
         print(f"Attempting to claim issue #{num}: {title} (labels={labels})")

+        # Check if claim branch exists and is stale
+        claim_branch = f'claim/{num}'
+        if not is_claim_stale(claim_branch):
+            print(f"Claim failed for #{num} (active claim exists). Trying next...")
+            continue
+
+        # Force-delete any lingering claim branch before creating our own
+        cleanup_stale_claim(claim_branch)
+
         if claim_issue(num):
             assign_issue(num, MY_AGENT)
             work_branch = create_work_branch(num, title)
+            expires_at = now_ts + CLAIM_TTL_SECONDS
             state.update({
                 'current_claim': num,
-                'claim_branch': f'claim/{num}',
+                'claim_branch': claim_branch,
                 'work_branch': work_branch,
-                'claimed_at': datetime.utcnow().isoformat() + 'Z',
+                'claimed_at': now_iso,
+                'expires_at': expires_at,
                 'issue_title': title,
                 'labels': labels
             })
             save_state(state)
-            print(f"✅ Claimed issue #{num}. Work branch: {work_branch}")
-            add_comment(num, f"Agent `{MY_AGENT}` claiming this task. (automated)")
+            print(f"✅ Claimed issue #{num}. Work branch: {work_branch} (expires {datetime.fromtimestamp(expires_at, tz=timezone.utc).isoformat()})")
+            add_comment(num, f"Agent `{MY_AGENT}` claiming this task with TTL {CLAIM_TTL_SECONDS / 3600}h. (automated)")
             return
         else:
-            print(f"Claim failed for #{num} (branch exists). Trying next...")
+            print(f"Claim failed for #{num} (push error). Trying next...")

     print("Could not claim any issue; all taken or unavailable.")

+def cleanup_global_stale_claims(now_ts=None):
+    """Remove claim branches that appear stale (based on commit age)."""
+    if now_ts is None:
+        now_ts = datetime.utcnow().timestamp()
+    # List all remote claim branches
+    result = subprocess.run(['git', 'ls-remote', '--heads', 'origin', 'claim/*'],
+                            capture_output=True, text=True, cwd=REPO_DIR)
+    if result.returncode != 0 or not result.stdout.strip():
+        return
+    lines = result.stdout.strip().split('\n')
+    cleaned = 0
+    for line in lines:
+        if not line.strip():
+            continue
+        parts = line.split()
+        if len(parts) < 2:
+            continue
+        sha, branch = parts[0], parts[1]
+        # Get commit timestamp
+        ts_result = subprocess.run(['git', 'show', '-s', '--format=%ct', sha],
+                                   capture_output=True, text=True, cwd=REPO_DIR)
+        if ts_result.returncode == 0 and ts_result.stdout.strip():
+            commit_ts = int(ts_result.stdout.strip())
+            age = now_ts - commit_ts
+            if age > CLAIM_TTL_SECONDS:
+                print(f"Expired claim branch: {branch} (age {age / 3600:.1f}h). Deleting.")
+                cleanup_stale_claim(branch)
+                cleaned += 1
+    if cleaned == 0:
+        print("  cleanup_global_stale_claims: none")
+    else:
+        print(f"  cleanup_global_stale_claims: removed {cleaned} expired branch(es)")

 if __name__ == '__main__':
     main()
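The lease mechanics this diff introduces reduce to a small amount of timestamp arithmetic: record `claimed_at`/`expires_at` when the claim branch is pushed, and treat the claim as released once `now_ts` passes `expires_at`. A self-contained sketch of that core (with `make_claim_state` and `lease_expired` as hypothetical helpers mirroring the state fields above):

```python
from datetime import datetime, timezone

CLAIM_TTL_SECONDS = 7200  # 2-hour lease, matching the scripts above

def make_claim_state(issue_number, now_ts):
    """Build the lease fields the claim records at acquisition time."""
    return {
        "current_claim": issue_number,
        "claimed_at": datetime.fromtimestamp(now_ts, tz=timezone.utc).isoformat(),
        "expires_at": now_ts + CLAIM_TTL_SECONDS,
    }

def lease_expired(state, now_ts):
    """A claim without an expires_at field never expires; otherwise compare timestamps."""
    expires_at = state.get("expires_at")
    return bool(expires_at) and now_ts > expires_at

t0 = 1_700_000_000.0
state = make_claim_state(42, t0)
print(lease_expired(state, t0 + 3600))   # within the lease
print(lease_expired(state, t0 + 7201))   # past the lease
```

Keeping the TTL check purely in the state file (rather than in Git) is the design choice the diff makes; the branch itself only serves as the atomic lock.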
@@ -4,14 +4,14 @@ Enhanced monitor for Gitea PRs:
 - Auto-request review from sibling on my PRs
 - Auto-validate sibling's PRs and approve if passing checks, with stability ring awareness
 - Monitor CI statuses and report failures
-- Release claim branches when associated PRs merge or close
+- Release claim branches when associated PRs merge, close, or EXPIRE
 """
 import os
 import json
 import subprocess
 import tempfile
 import shutil
-from datetime import datetime
+from datetime import datetime, timezone

 GITEA_TOKEN = os.getenv('GITEA_TOKEN') or 'ffce3b62d583b761238ae00839dce7718acaad85'
 REPO = 'oib/aitbc'
@@ -19,6 +19,7 @@ API_BASE = os.getenv('GITEA_API_BASE', 'http://gitea.bubuit.net:3000/api/v1')
 MY_AGENT = os.getenv('AGENT_NAME', 'aitbc1')
 SIBLING_AGENT = 'aitbc' if MY_AGENT == 'aitbc1' else 'aitbc1'
 CLAIM_STATE_FILE = '/opt/aitbc/.claim-state.json'
+CLAIM_TTL_SECONDS = 7200  # Must match claim-task.py

 def query_api(path, method='GET', data=None):
     url = f"{API_BASE}/{path}"
@@ -74,6 +75,14 @@ def release_claim(issue_number, claim_branch):
     save_claim_state(state)
     print(f"✅ Released claim for issue #{issue_number} (deleted branch {claim_branch})")

+def is_claim_expired(state):
+    """Check if the current claim has exceeded its TTL."""
+    expires_at = state.get('expires_at')
+    if not expires_at:
+        return False
+    now_ts = datetime.utcnow().timestamp()
+    return now_ts > expires_at
+
 def get_open_prs():
     return query_api(f'repos/{REPO}/pulls?state=open') or []
@@ -126,23 +135,30 @@ def validate_pr_branch(pr):
         shutil.rmtree(tmpdir, ignore_errors=True)

 def main():
-    now = datetime.utcnow().isoformat() + 'Z'
-    print(f"[{now}] Monitoring PRs and claim locks...")
+    now = datetime.utcnow().replace(tzinfo=timezone.utc)
+    now_iso = now.isoformat()
+    now_ts = now.timestamp()
+    print(f"[{now_iso}] Monitoring PRs and claim locks...")

-    # 0. Check claim state: if we have a current claim, see if corresponding PR merged
+    # 0. Check claim state: if we have a current claim, see if it expired or PR merged
     state = load_claim_state()
     if state.get('current_claim'):
         issue_num = state['current_claim']
         work_branch = state.get('work_branch')
         claim_branch = state.get('claim_branch')
-        all_prs = get_all_prs(state='all')
-        matched_pr = None
-        for pr in all_prs:
-            if pr['head']['ref'] == work_branch:
-                matched_pr = pr
-                break
-        if matched_pr:
-            if matched_pr['state'] == 'closed':
-                release_claim(issue_num, claim_branch)
+        # Check expiration
+        if is_claim_expired(state):
+            print(f"Claim for issue #{issue_num} has expired. Releasing.")
+            release_claim(issue_num, claim_branch)
+        else:
+            # Check if PR merged/closed
+            all_prs = get_all_prs(state='all')
+            matched_pr = None
+            for pr in all_prs:
+                if pr['head']['ref'] == work_branch:
+                    matched_pr = pr
+                    break
+            if matched_pr and matched_pr['state'] == 'closed':
+                release_claim(issue_num, claim_branch)

     # 1. Process open PRs
@@ -191,10 +207,47 @@ def main():
             for s in failing:
                 notifications.append(f"PR #{number} status check failure: {s.get('context','unknown')} - {s.get('status','unknown')}")

+    # 2. Global cleanup of stale claim branches (orphaned, older than TTL)
+    cleanup_global_expired_claims(now_ts)
+
     if notifications:
         print("\n".join(notifications))
     else:
         print("No new alerts.")

+def cleanup_global_expired_claims(now_ts=None):
+    """Delete remote claim branches older than TTL, even if the state file is gone."""
+    if now_ts is None:
+        now_ts = datetime.utcnow().timestamp()
+    # List all remote claim branches
+    result = subprocess.run(['git', 'ls-remote', '--heads', 'origin', 'claim/*'],
+                            capture_output=True, text=True, cwd='/opt/aitbc')
+    if result.returncode != 0 or not result.stdout.strip():
+        return
+    lines = result.stdout.strip().split('\n')
+    cleaned = 0
+    for line in lines:
+        if not line.strip():
+            continue
+        parts = line.split()
+        if len(parts) < 2:
+            continue
+        sha, branch = parts[0], parts[1]
+        # Get commit timestamp
+        ts_result = subprocess.run(['git', 'show', '-s', '--format=%ct', sha],
+                                   capture_output=True, text=True, cwd='/opt/aitbc')
+        if ts_result.returncode == 0 and ts_result.stdout.strip():
+            commit_ts = int(ts_result.stdout.strip())
+            age = now_ts - commit_ts
+            if age > CLAIM_TTL_SECONDS:
+                print(f"Expired claim branch: {branch} (age {age / 3600:.1f}h). Deleting.")
+                subprocess.run(['git', 'push', 'origin', '--delete', branch],
+                               capture_output=True, cwd='/opt/aitbc')
+                cleaned += 1
+    if cleaned == 0:
+        print("  cleanup_global_expired_claims: none")
+    else:
+        print(f"  cleanup_global_expired_claims: removed {cleaned} expired branch(es)")

 if __name__ == '__main__':
     main()
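The global cleanup in both scripts boils down to joining two pieces of Git plumbing output: `git ls-remote --heads` gives `<sha>\t<ref>` lines, and `git show -s --format=%ct <sha>` gives the commit's Unix timestamp. The filtering can be sketched offline against canned output (the SHAs, refs, and timestamps below are illustrative only):

```python
CLAIM_TTL_SECONDS = 7200

# Canned `git ls-remote --heads origin 'claim/*'` output: "<sha>\t<ref>" per line
ls_remote_out = (
    "a1b2c3d\trefs/heads/claim/12\n"
    "d4e5f6a\trefs/heads/claim/15\n"
)

# Hypothetical commit timestamps, as `git show -s --format=%ct <sha>` would print
commit_times = {"a1b2c3d": 1_700_000_000, "d4e5f6a": 1_700_006_000}
now_ts = 1_700_008_000

# Keep only refs whose tip commit is older than the TTL
expired = []
for line in ls_remote_out.strip().splitlines():
    sha, ref = line.split()
    if now_ts - commit_times[sha] > CLAIM_TTL_SECONDS:
        expired.append(ref)
print(expired)
```

One caveat with this approach: `%ct` is the commit time of the branch tip, so an agent that never pushes to its claim branch will see the claim expire even while work continues, which is presumably why the scripts also track `expires_at` in the state file.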