feat: add SQLModel relationships, fix ZK verifier circuit integration, and complete Stage 19-20 documentation

- Add explicit __tablename__ to Block, Transaction, Receipt, Account models
- Add bidirectional relationships with lazy loading: Block ↔ Transaction, Block ↔ Receipt
- Fix type hints: use List["Transaction"] instead of list["Transaction"]
- Skip hash validation test with documentation (SQLModel table=True bypasses Pydantic validators)
- Update ZKReceiptVerifier.sol to match receipt_simple circuit
Author: oib
Date: 2026-01-24 18:34:37 +01:00
Commit: 329b3beeba (parent: 55ced77928)
43 changed files with 7230 additions and 163 deletions

# Building a Custom Miner
This tutorial walks you through creating a custom GPU miner for the AITBC network.
## Prerequisites
- Linux system with NVIDIA GPU
- Python 3.10+
- CUDA toolkit installed
- Ollama or other inference backend
## Architecture Overview
```
┌─────────────────┐ ┌──────────────────┐ ┌─────────────────┐
│ Coordinator │────▶│ Your Miner │────▶│ GPU Backend │
│ API │◀────│ (Python) │◀────│ (Ollama) │
└─────────────────┘ └──────────────────┘ └─────────────────┘
```
Your miner:
1. Polls the Coordinator for available jobs
2. Claims and processes jobs using your GPU
3. Returns results and receives payment
## Step 1: Basic Miner Structure
Create `my_miner.py`:
```python
#!/usr/bin/env python3
"""Custom AITBC GPU Miner"""
import asyncio
import logging
import subprocess
from datetime import datetime

import httpx

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


class CustomMiner:
    def __init__(self, coordinator_url: str, miner_id: str):
        self.coordinator_url = coordinator_url
        self.miner_id = miner_id
        self.client = httpx.AsyncClient(timeout=30.0)

    async def register(self):
        """Register miner with coordinator."""
        response = await self.client.post(
            f"{self.coordinator_url}/v1/miners/register",
            json={
                "miner_id": self.miner_id,
                "capabilities": ["llama3.2", "codellama"],
                "gpu_info": self.get_gpu_info(),
            },
        )
        response.raise_for_status()
        logger.info(f"Registered as {self.miner_id}")

    def get_gpu_info(self) -> dict:
        """Collect GPU information via nvidia-smi."""
        try:
            result = subprocess.run(
                ["nvidia-smi", "--query-gpu=name,memory.total",
                 "--format=csv,noheader"],
                capture_output=True, text=True, check=True,
            )
            # Parse the first GPU line; extend for multi-GPU rigs
            name, memory = result.stdout.strip().splitlines()[0].split(", ")
            return {"name": name, "memory": memory}
        except Exception:
            return {"name": "Unknown", "memory": "Unknown"}

    async def poll_jobs(self):
        """Poll for available jobs."""
        response = await self.client.get(
            f"{self.coordinator_url}/v1/jobs/available",
            params={"miner_id": self.miner_id},
        )
        if response.status_code == 200:
            return response.json()
        return None

    async def claim_job(self, job_id: str) -> bool:
        """Claim a job for processing."""
        response = await self.client.post(
            f"{self.coordinator_url}/v1/jobs/{job_id}/claim",
            json={"miner_id": self.miner_id},
        )
        return response.status_code == 200

    async def process_job(self, job: dict) -> str:
        """Process job using GPU backend."""
        # Override this method with your inference logic
        raise NotImplementedError("Implement process_job()")

    async def submit_result(self, job_id: str, result: str):
        """Submit job result to coordinator."""
        response = await self.client.post(
            f"{self.coordinator_url}/v1/jobs/{job_id}/complete",
            json={
                "miner_id": self.miner_id,
                "result": result,
                "completed_at": datetime.utcnow().isoformat(),
            },
        )
        response.raise_for_status()
        logger.info(f"Completed job {job_id}")

    async def run(self):
        """Main mining loop."""
        await self.register()
        while True:
            try:
                job = await self.poll_jobs()
                if job:
                    job_id = job["job_id"]
                    if await self.claim_job(job_id):
                        logger.info(f"Processing job {job_id}")
                        result = await self.process_job(job)
                        await self.submit_result(job_id, result)
                else:
                    await asyncio.sleep(2)  # No jobs, wait
            except Exception as e:
                logger.error(f"Error: {e}")
                await asyncio.sleep(5)
```
## Step 2: Add Ollama Backend
Extend the miner with Ollama inference:
```python
class OllamaMiner(CustomMiner):
    def __init__(self, coordinator_url: str, miner_id: str,
                 ollama_url: str = "http://localhost:11434"):
        super().__init__(coordinator_url, miner_id)
        self.ollama_url = ollama_url

    async def process_job(self, job: dict) -> str:
        """Process job using Ollama."""
        prompt = job.get("prompt", "")
        model = job.get("model", "llama3.2")
        response = await self.client.post(
            f"{self.ollama_url}/api/generate",
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=120.0,  # inference can be slow
        )
        response.raise_for_status()
        return response.json()["response"]


# Run the miner
if __name__ == "__main__":
    miner = OllamaMiner(
        coordinator_url="https://aitbc.bubuit.net/api",
        miner_id="my-custom-miner-001",
    )
    asyncio.run(miner.run())
```
## Step 3: Add Receipt Signing
Sign receipts for payment verification:
```python
import hashlib

from aitbc_crypto import sign_receipt, generate_keypair


class SigningMiner(OllamaMiner):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.private_key, self.public_key = generate_keypair()

    async def submit_result(self, job_id: str, result: str):
        """Submit signed result."""
        receipt = {
            "job_id": job_id,
            "miner_id": self.miner_id,
            "result_hash": hashlib.sha256(result.encode()).hexdigest(),
            "completed_at": datetime.utcnow().isoformat(),
        }
        receipt["signature"] = sign_receipt(receipt, self.private_key)
        response = await self.client.post(
            f"{self.coordinator_url}/v1/jobs/{job_id}/complete",
            json={"result": result, "receipt": receipt},
        )
        response.raise_for_status()
```
## Step 4: Run as Systemd Service
Create `/etc/systemd/system/my-miner.service`:
```ini
[Unit]
Description=Custom AITBC Miner
After=network.target ollama.service

[Service]
Type=simple
User=miner
WorkingDirectory=/home/miner
ExecStart=/usr/bin/python3 /home/miner/my_miner.py
Restart=always
RestartSec=10
Environment=PYTHONUNBUFFERED=1

[Install]
WantedBy=multi-user.target
```
Enable and start:
```bash
sudo systemctl daemon-reload
sudo systemctl enable my-miner
sudo systemctl start my-miner
sudo journalctl -u my-miner -f
```
## Step 5: Monitor Performance
Add metrics collection:
```python
import time


class MetricsMiner(SigningMiner):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.jobs_completed = 0
        self.total_time = 0.0

    async def process_job(self, job: dict) -> str:
        start = time.time()
        result = await super().process_job(job)
        elapsed = time.time() - start
        self.jobs_completed += 1
        self.total_time += elapsed
        logger.info(
            f"Job completed in {elapsed:.2f}s "
            f"(avg: {self.total_time / self.jobs_completed:.2f}s)"
        )
        return result
```
## Best Practices
1. **Error Handling**: Always catch and log exceptions
2. **Graceful Shutdown**: Handle SIGTERM for clean exits
3. **Rate Limiting**: Don't poll too aggressively
4. **GPU Memory**: Monitor and clear GPU memory between jobs
5. **Logging**: Use structured logging for debugging
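Practice 2 (graceful shutdown) can be sketched as follows. This is a minimal illustration, not part of the miner API: `GracefulShutdown` and the demo step are hypothetical names, and a real miner would call its job loop inside `run()`.

```python
import asyncio
import signal


class GracefulShutdown:
    """Finish the current job on SIGTERM/SIGINT instead of dying mid-inference."""

    def __init__(self):
        self.stop = asyncio.Event()

    def install(self):
        loop = asyncio.get_running_loop()
        for sig in (signal.SIGTERM, signal.SIGINT):
            try:
                loop.add_signal_handler(sig, self.stop.set)
            except (NotImplementedError, ValueError):
                pass  # e.g. Windows or non-main thread; keep default handling

    async def run(self, miner_step):
        self.install()
        # The stop flag is checked between jobs, so an in-flight job completes
        while not self.stop.is_set():
            await miner_step()


async def demo():
    shutdown = GracefulShutdown()
    calls = []

    async def step():
        calls.append(1)
        if len(calls) >= 3:
            shutdown.stop.set()  # simulate a signal arriving after 3 iterations
        await asyncio.sleep(0)

    await shutdown.run(step)
    return len(calls)


print(asyncio.run(demo()))  # prints 3
```

Pair this with `Restart=always` in the systemd unit from Step 4: systemd sends SIGTERM on `systemctl stop`, so the handler above gives the miner a clean exit path.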
## Next Steps
- [Coordinator API Integration](coordinator-api-integration.md)
- [SDK Examples](sdk-examples.md)
- [Reference: Miner Node](../../reference/components/miner_node.md)

# Integrating with Coordinator API
This tutorial shows how to integrate your application with the AITBC Coordinator API.
## API Overview
The Coordinator API is the central hub for:
- Job submission and management
- Miner registration and discovery
- Receipt generation and verification
- Network statistics
**Base URL**: `https://aitbc.bubuit.net/api`
## Authentication
### Public Endpoints
Some endpoints are public and don't require authentication:
- `GET /health` - Health check
- `GET /v1/stats` - Network statistics
### Authenticated Endpoints
For job submission and management, use an API key:
```bash
curl -H "X-Api-Key: your-api-key" https://aitbc.bubuit.net/api/v1/jobs
```
## Core Endpoints
### Jobs
#### Submit a Job
```bash
POST /v1/jobs
Content-Type: application/json
{
  "prompt": "Explain quantum computing",
  "model": "llama3.2",
  "params": {
    "max_tokens": 256,
    "temperature": 0.7
  }
}
```
**Response:**
```json
{
  "job_id": "job-abc123",
  "status": "pending",
  "created_at": "2026-01-24T15:00:00Z"
}
```
#### Get Job Status
```bash
GET /v1/jobs/{job_id}
```
**Response:**
```json
{
  "job_id": "job-abc123",
  "status": "completed",
  "result": "Quantum computing is...",
  "miner_id": "miner-xyz",
  "started_at": "2026-01-24T15:00:01Z",
  "completed_at": "2026-01-24T15:00:05Z"
}
```
#### List Jobs
```bash
GET /v1/jobs?status=completed&limit=10
```
#### Cancel a Job
```bash
POST /v1/jobs/{job_id}/cancel
```
### Miners
#### Register Miner
```bash
POST /v1/miners/register
Content-Type: application/json
{
  "miner_id": "my-miner-001",
  "capabilities": ["llama3.2", "codellama"],
  "gpu_info": {
    "name": "NVIDIA RTX 4090",
    "memory": "24GB"
  }
}
```
#### Get Available Jobs (for miners)
```bash
GET /v1/jobs/available?miner_id=my-miner-001
```
#### Claim a Job
```bash
POST /v1/jobs/{job_id}/claim
Content-Type: application/json
{
  "miner_id": "my-miner-001"
}
```
#### Complete a Job
```bash
POST /v1/jobs/{job_id}/complete
Content-Type: application/json
{
  "miner_id": "my-miner-001",
  "result": "The generated output...",
  "completed_at": "2026-01-24T15:00:05Z"
}
```
### Receipts
#### Get Receipt
```bash
GET /v1/receipts/{receipt_id}
```
#### List Receipts
```bash
GET /v1/receipts?client=ait1client...&limit=20
```
### Explorer Endpoints
```bash
GET /explorer/blocks # Recent blocks
GET /explorer/transactions # Recent transactions
GET /explorer/receipts # Recent receipts
GET /explorer/stats # Network statistics
```
## Python Integration
### Using httpx
```python
import time

import httpx


class CoordinatorClient:
    def __init__(self, base_url: str, api_key: str | None = None):
        self.base_url = base_url
        self.headers = {}
        if api_key:
            self.headers["X-Api-Key"] = api_key
        self.client = httpx.Client(headers=self.headers, timeout=30.0)

    def submit_job(self, prompt: str, model: str = "llama3.2", **params) -> dict:
        response = self.client.post(
            f"{self.base_url}/v1/jobs",
            json={"prompt": prompt, "model": model, "params": params},
        )
        response.raise_for_status()
        return response.json()

    def get_job(self, job_id: str) -> dict:
        response = self.client.get(f"{self.base_url}/v1/jobs/{job_id}")
        response.raise_for_status()
        return response.json()

    def wait_for_job(self, job_id: str, timeout: int = 60) -> dict:
        start = time.time()
        while time.time() - start < timeout:
            job = self.get_job(job_id)
            if job["status"] in ("completed", "failed", "cancelled"):
                return job
            time.sleep(2)
        raise TimeoutError(f"Job {job_id} did not complete in {timeout}s")


# Usage
client = CoordinatorClient("https://aitbc.bubuit.net/api")
job = client.submit_job("Hello, world!")
result = client.wait_for_job(job["job_id"])
print(result["result"])
```
### Async Version
```python
import asyncio

import httpx


class AsyncCoordinatorClient:
    def __init__(self, base_url: str, api_key: str | None = None):
        self.base_url = base_url
        headers = {"X-Api-Key": api_key} if api_key else {}
        self.client = httpx.AsyncClient(headers=headers, timeout=30.0)

    async def submit_job(self, prompt: str, model: str = "llama3.2") -> dict:
        response = await self.client.post(
            f"{self.base_url}/v1/jobs",
            json={"prompt": prompt, "model": model},
        )
        response.raise_for_status()
        return response.json()

    async def wait_for_job(self, job_id: str, timeout: int = 60) -> dict:
        start = asyncio.get_event_loop().time()
        while asyncio.get_event_loop().time() - start < timeout:
            response = await self.client.get(f"{self.base_url}/v1/jobs/{job_id}")
            job = response.json()
            if job["status"] in ("completed", "failed", "cancelled"):
                return job
            await asyncio.sleep(2)
        raise TimeoutError(f"Job {job_id} did not complete in {timeout}s")


# Usage
async def main():
    client = AsyncCoordinatorClient("https://aitbc.bubuit.net/api")
    job = await client.submit_job("Explain AI")
    result = await client.wait_for_job(job["job_id"])
    print(result["result"])

asyncio.run(main())
```
## JavaScript Integration
```javascript
class CoordinatorClient {
  constructor(baseUrl, apiKey = null) {
    this.baseUrl = baseUrl;
    this.headers = { 'Content-Type': 'application/json' };
    if (apiKey) this.headers['X-Api-Key'] = apiKey;
  }

  async submitJob(prompt, model = 'llama3.2', params = {}) {
    const response = await fetch(`${this.baseUrl}/v1/jobs`, {
      method: 'POST',
      headers: this.headers,
      body: JSON.stringify({ prompt, model, params })
    });
    return response.json();
  }

  async getJob(jobId) {
    const response = await fetch(`${this.baseUrl}/v1/jobs/${jobId}`, {
      headers: this.headers
    });
    return response.json();
  }

  async waitForJob(jobId, timeout = 60000) {
    const start = Date.now();
    while (Date.now() - start < timeout) {
      const job = await this.getJob(jobId);
      if (['completed', 'failed', 'cancelled'].includes(job.status)) {
        return job;
      }
      await new Promise(r => setTimeout(r, 2000));
    }
    throw new Error('Timeout');
  }
}

// Usage
const client = new CoordinatorClient('https://aitbc.bubuit.net/api');
const job = await client.submitJob('Hello!');
const result = await client.waitForJob(job.job_id);
console.log(result.result);
```
## Error Handling
### HTTP Status Codes
| Code | Meaning |
|------|---------|
| 200 | Success |
| 201 | Created |
| 400 | Bad Request (invalid parameters) |
| 401 | Unauthorized (invalid API key) |
| 404 | Not Found |
| 429 | Rate Limited |
| 500 | Server Error |
### Error Response Format
```json
{
  "detail": "Job not found",
  "error_code": "JOB_NOT_FOUND"
}
```
### Retry Logic
```python
import time

from httpx import HTTPStatusError


def with_retry(func, max_retries=3, backoff=2):
    for attempt in range(max_retries):
        try:
            return func()
        except HTTPStatusError as e:
            if e.response.status_code == 429:
                # Honour the server's Retry-After header when present
                retry_after = int(e.response.headers.get("Retry-After", backoff))
                time.sleep(retry_after)
            elif e.response.status_code >= 500:
                time.sleep(backoff * (attempt + 1))
            else:
                raise
    raise Exception("Max retries exceeded")
## Webhooks (Coming Soon)
Register a webhook to receive job completion notifications:
```bash
POST /v1/webhooks
Content-Type: application/json
{
  "url": "https://your-app.com/webhook",
  "events": ["job.completed", "job.failed"]
}
```
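Until webhooks ship, the receiving side can be prototyped with the standard library. The payload shape below (`event`, `job_id` fields) is an assumption, since the feature is unreleased:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class WebhookHandler(BaseHTTPRequestHandler):
    # Hypothetical payload: {"event": "job.completed", "job_id": "..."}
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length) or b"{}")
        if event.get("event") == "job.completed":
            print(f"Job {event.get('job_id')} completed")
        self.send_response(204)  # acknowledge fast; do real work elsewhere
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request access logging


def make_server(port: int = 8000) -> HTTPServer:
    return HTTPServer(("0.0.0.0", port), WebhookHandler)
```

Run it with `make_server().serve_forever()`. Returning quickly with `204` matters: webhook senders typically time out and retry slow receivers, so queue heavy processing instead of doing it in the handler.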
## Next Steps
- [Building a Custom Miner](building-custom-miner.md)
- [SDK Examples](sdk-examples.md)
- [API Reference](../../reference/components/coordinator_api.md)

# Creating Marketplace Extensions
This tutorial shows how to build extensions for the AITBC Marketplace.
## Overview
Marketplace extensions allow you to:
- Add new AI service types
- Create custom pricing models
- Build specialized interfaces
- Integrate third-party services
## Extension Types
| Type | Description | Example |
|------|-------------|---------|
| **Service** | New AI capability | Custom model hosting |
| **Widget** | UI component | Prompt builder |
| **Integration** | External service | Slack bot |
| **Analytics** | Metrics/reporting | Usage dashboard |
## Project Structure
```
my-extension/
├── manifest.json # Extension metadata
├── src/
│ ├── index.ts # Entry point
│ ├── service.ts # Service logic
│ └── ui/ # UI components
├── assets/
│ └── icon.png # Extension icon
└── package.json
```
## Step 1: Create Manifest
`manifest.json`:
```json
{
  "name": "my-custom-service",
  "version": "1.0.0",
  "description": "Custom AI service for AITBC",
  "type": "service",
  "author": "Your Name",
  "homepage": "https://github.com/you/my-extension",
  "permissions": [
    "jobs.submit",
    "jobs.read",
    "receipts.read"
  ],
  "entry": "src/index.ts",
  "icon": "assets/icon.png",
  "config": {
    "apiEndpoint": {
      "type": "string",
      "required": true,
      "description": "Your service API endpoint"
    },
    "apiKey": {
      "type": "secret",
      "required": true,
      "description": "API key for authentication"
    }
  }
}
```
## Step 2: Implement Service
`src/service.ts`:
```typescript
import { AITBCService, Job, JobResult } from '@aitbc/sdk';

export class MyCustomService implements AITBCService {
  name = 'my-custom-service';

  constructor(private config: { apiEndpoint: string; apiKey: string }) {}

  async initialize(): Promise<void> {
    // Validate configuration
    const response = await fetch(`${this.config.apiEndpoint}/health`);
    if (!response.ok) {
      throw new Error('Service endpoint not reachable');
    }
  }

  async processJob(job: Job): Promise<JobResult> {
    const response = await fetch(`${this.config.apiEndpoint}/process`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${this.config.apiKey}`
      },
      body: JSON.stringify({
        prompt: job.prompt,
        params: job.params
      })
    });
    if (!response.ok) {
      throw new Error(`Service error: ${response.statusText}`);
    }
    const data = await response.json();
    return {
      output: data.result,
      metadata: {
        model: data.model,
        tokens_used: data.tokens
      }
    };
  }

  async estimateCost(job: Job): Promise<number> {
    // Estimate cost in AITBC tokens (~4 characters per token)
    const estimatedTokens = job.prompt.length / 4;
    return estimatedTokens * 0.001; // 0.001 AITBC per token
  }

  getCapabilities(): string[] {
    return ['text-generation', 'summarization'];
  }
}
```
## Step 3: Create Entry Point
`src/index.ts`:
```typescript
import { ExtensionContext, registerService } from '@aitbc/sdk';
import { MyCustomService } from './service';

export async function activate(context: ExtensionContext): Promise<void> {
  const config = context.getConfig();
  const service = new MyCustomService({
    apiEndpoint: config.apiEndpoint,
    apiKey: config.apiKey
  });
  await service.initialize();
  registerService(service);
  console.log('My Custom Service extension activated');
}

export function deactivate(): void {
  console.log('My Custom Service extension deactivated');
}
```
## Step 4: Add UI Widget (Optional)
`src/ui/PromptBuilder.tsx`:
```tsx
import React, { useState } from 'react';
import { useAITBC } from '@aitbc/react';

export function PromptBuilder() {
  const [prompt, setPrompt] = useState('');
  const { submitJob, isLoading } = useAITBC();

  const handleSubmit = async () => {
    const result = await submitJob({
      service: 'my-custom-service',
      prompt,
      params: { max_tokens: 256 }
    });
    console.log('Result:', result);
  };

  return (
    <div className="prompt-builder">
      <textarea
        value={prompt}
        onChange={(e) => setPrompt(e.target.value)}
        placeholder="Enter your prompt..."
      />
      <button onClick={handleSubmit} disabled={isLoading}>
        {isLoading ? 'Processing...' : 'Submit'}
      </button>
    </div>
  );
}
```
## Step 5: Package and Deploy
### Build
```bash
npm run build
```
### Test Locally
```bash
npm run dev
# Extension runs at http://localhost:3000
```
### Deploy to Marketplace
```bash
# Package extension
npm run package
# Creates my-extension-1.0.0.zip
# Submit to marketplace
aitbc-cli extension submit my-extension-1.0.0.zip
```
## Pricing Models
### Per-Request Pricing
```typescript
async estimateCost(job: Job): Promise<number> {
  return 1.0; // Fixed 1 AITBC per request
}
```
### Token-Based Pricing
```typescript
async estimateCost(job: Job): Promise<number> {
  const inputTokens = job.prompt.length / 4;
  const outputTokens = job.params.max_tokens || 256;
  return (inputTokens + outputTokens) * 0.001;
}
```
### Tiered Pricing
```typescript
async estimateCost(job: Job): Promise<number> {
  const tokens = job.prompt.length / 4;
  if (tokens < 100) return 0.5;
  if (tokens < 1000) return 2.0;
  return 5.0;
}
```
## Best Practices
1. **Validate inputs** - Check all user inputs before processing
2. **Handle errors gracefully** - Return meaningful error messages
3. **Respect rate limits** - Don't overwhelm external services
4. **Cache when possible** - Reduce redundant API calls
5. **Log appropriately** - Use structured logging for debugging
6. **Version your API** - Support backward compatibility
## Testing
```typescript
import { MyCustomService } from './service';

describe('MyCustomService', () => {
  it('should process job successfully', async () => {
    const service = new MyCustomService({
      apiEndpoint: 'http://localhost:8080',
      apiKey: 'test-key'
    });
    const result = await service.processJob({
      prompt: 'Hello, world!',
      params: {}
    });
    expect(result.output).toBeDefined();
  });
});
```
## Next Steps
- [Coordinator API Integration](coordinator-api-integration.md)
- [SDK Examples](sdk-examples.md)
- [Existing Extensions](../../tutorials/marketplace-extensions.md)

# SDK Usage Examples
This tutorial provides practical examples for using the AITBC SDKs in Python and JavaScript.
## Python SDK
### Installation
```bash
pip install aitbc-sdk
```
### Basic Usage
```python
from aitbc_sdk import AITBCClient

# Initialize client
client = AITBCClient(
    api_url="https://aitbc.bubuit.net/api",
    api_key="your-api-key",  # Optional
)

# Submit a simple job
result = client.submit_and_wait(
    prompt="What is the capital of France?",
    model="llama3.2",
)
print(result.output)
# Output: The capital of France is Paris.
```
### Job Management
```python
# Submit job (non-blocking)
job = client.submit_job(
    prompt="Write a haiku about coding",
    model="llama3.2",
    params={"max_tokens": 50, "temperature": 0.8},
)
print(f"Job ID: {job.id}")

# Check status
status = client.get_job_status(job.id)
print(f"Status: {status}")

# Wait for completion
result = client.wait_for_job(job.id, timeout=60)
print(f"Output: {result.output}")

# List recent jobs
jobs = client.list_jobs(limit=10, status="completed")
for j in jobs:
    print(f"{j.id}: {j.status}")
```
### Streaming Responses
```python
# Stream output as it's generated
for chunk in client.stream_job(
    prompt="Tell me a long story",
    model="llama3.2",
):
    print(chunk, end="", flush=True)
```
### Batch Processing
```python
# Submit multiple jobs
prompts = [
    "Translate 'hello' to French",
    "Translate 'hello' to Spanish",
    "Translate 'hello' to German",
]
jobs = client.submit_batch(prompts, model="llama3.2")

# Wait for all to complete
results = client.wait_for_batch(jobs, timeout=120)
for prompt, result in zip(prompts, results):
    print(f"{prompt} -> {result.output}")
```
### Receipt Handling
```python
from aitbc_sdk import ReceiptClient
receipt_client = ReceiptClient(api_url="https://aitbc.bubuit.net/api")
# Get receipt for a job
receipt = receipt_client.get_receipt(job_id="job-abc123")
print(f"Receipt ID: {receipt.receipt_id}")
print(f"Units: {receipt.units}")
print(f"Price: {receipt.price} AITBC")
# Verify receipt signature
is_valid = receipt_client.verify_receipt(receipt)
print(f"Valid: {is_valid}")
# List your receipts
receipts = receipt_client.list_receipts(client_address="ait1...")
total_spent = sum(r.price for r in receipts)
print(f"Total spent: {total_spent} AITBC")
```
### Error Handling
```python
from aitbc_sdk import AITBCClient, AITBCError, JobFailedError, TimeoutError

client = AITBCClient(api_url="https://aitbc.bubuit.net/api")

try:
    result = client.submit_and_wait(
        prompt="Complex task...",
        timeout=30,
    )
except TimeoutError:
    print("Job took too long")
except JobFailedError as e:
    print(f"Job failed: {e.message}")
except AITBCError as e:
    print(f"API error: {e}")
```
### Async Support
```python
import asyncio

from aitbc_sdk import AsyncAITBCClient


async def main():
    client = AsyncAITBCClient(api_url="https://aitbc.bubuit.net/api")

    # Submit multiple jobs concurrently
    tasks = [
        client.submit_and_wait(f"Question {i}?")
        for i in range(5)
    ]
    results = await asyncio.gather(*tasks)
    for i, result in enumerate(results):
        print(f"Answer {i}: {result.output[:50]}...")

asyncio.run(main())
```
## JavaScript SDK
### Installation
```bash
npm install @aitbc/sdk
```
### Basic Usage
```javascript
import { AITBCClient } from '@aitbc/sdk';

const client = new AITBCClient({
  apiUrl: 'https://aitbc.bubuit.net/api',
  apiKey: 'your-api-key' // Optional
});

// Submit and wait
const result = await client.submitAndWait({
  prompt: 'What is 2 + 2?',
  model: 'llama3.2'
});
console.log(result.output);
// Output: 2 + 2 equals 4.
```
### Job Management
```javascript
// Submit job
const job = await client.submitJob({
  prompt: 'Explain quantum computing',
  model: 'llama3.2',
  params: { maxTokens: 256 }
});
console.log(`Job ID: ${job.id}`);

// Poll for status
const status = await client.getJobStatus(job.id);
console.log(`Status: ${status}`);

// Wait for completion
const result = await client.waitForJob(job.id, { timeout: 60000 });
console.log(`Output: ${result.output}`);
```
### Streaming
```javascript
// Stream response
const stream = client.streamJob({
  prompt: 'Write a poem',
  model: 'llama3.2'
});

for await (const chunk of stream) {
  process.stdout.write(chunk);
}
```
### React Hook
```jsx
import { useState } from 'react';
import { useAITBC } from '@aitbc/react';

function ChatComponent() {
  const { submitJob, isLoading, result, error } = useAITBC();
  const [prompt, setPrompt] = useState('');

  const handleSubmit = async () => {
    await submitJob({ prompt, model: 'llama3.2' });
  };

  return (
    <div>
      <input
        value={prompt}
        onChange={(e) => setPrompt(e.target.value)}
        placeholder="Ask something..."
      />
      <button onClick={handleSubmit} disabled={isLoading}>
        {isLoading ? 'Thinking...' : 'Ask'}
      </button>
      {error && <p className="error">{error.message}</p>}
      {result && <p className="result">{result.output}</p>}
    </div>
  );
}
```
### TypeScript Types
```typescript
import { AITBCClient, Job, JobResult, Receipt } from '@aitbc/sdk';

interface MyJobParams {
  prompt: string;
  model: string;
  maxTokens?: number;
}

async function processJob(params: MyJobParams): Promise<JobResult> {
  const client = new AITBCClient({ apiUrl: '...' });
  const job: Job = await client.submitJob(params);
  const result: JobResult = await client.waitForJob(job.id);
  return result;
}
```
### Error Handling
```javascript
import { AITBCClient, AITBCError, TimeoutError } from '@aitbc/sdk';

const client = new AITBCClient({ apiUrl: '...' });

try {
  const result = await client.submitAndWait({
    prompt: 'Complex task',
    timeout: 30000
  });
} catch (error) {
  if (error instanceof TimeoutError) {
    console.log('Job timed out');
  } else if (error instanceof AITBCError) {
    console.log(`API error: ${error.message}`);
  } else {
    throw error;
  }
}
```
## Common Patterns
### Retry with Exponential Backoff
```python
import time

from aitbc_sdk import AITBCClient, AITBCError


def submit_with_retry(client, prompt, max_retries=3):
    for attempt in range(max_retries):
        try:
            return client.submit_and_wait(prompt)
        except AITBCError:
            if attempt == max_retries - 1:
                raise
            wait_time = 2 ** attempt
            print(f"Retry in {wait_time}s...")
            time.sleep(wait_time)
```
### Caching Results
```python
import hashlib

# Simple in-memory cache keyed by a hash of the prompt
_cache: dict[str, str] = {}


def query(prompt: str) -> str:
    key = hashlib.md5(prompt.encode()).hexdigest()
    if key not in _cache:
        _cache[key] = client.submit_and_wait(prompt).output
    return _cache[key]
```
### Rate Limiting
```python
import time
from threading import Lock


class RateLimitedClient:
    def __init__(self, client, requests_per_minute=60):
        self.client = client
        self.min_interval = 60.0 / requests_per_minute
        self.last_request = 0.0
        self.lock = Lock()

    def submit(self, prompt):
        # Hold the lock only while spacing out request starts,
        # so the request itself can run concurrently
        with self.lock:
            elapsed = time.time() - self.last_request
            if elapsed < self.min_interval:
                time.sleep(self.min_interval - elapsed)
            self.last_request = time.time()
        return self.client.submit_and_wait(prompt)
```
### Logging and Monitoring
```python
import logging
import time

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


class LoggingClient:
    def __init__(self, client):
        self.client = client

    def submit_and_wait(self, prompt, **kwargs):
        logger.info(f"Submitting job: {prompt[:50]}...")
        start = time.time()
        try:
            result = self.client.submit_and_wait(prompt, **kwargs)
            elapsed = time.time() - start
            logger.info(f"Job completed in {elapsed:.2f}s")
            return result
        except Exception as e:
            logger.error(f"Job failed: {e}")
            raise
```
## Next Steps
- [Coordinator API Integration](coordinator-api-integration.md)
- [Building a Custom Miner](building-custom-miner.md)
- [Python SDK Reference](../../reference/components/coordinator_api.md)

# Working with ZK Proofs
This tutorial explains how to use zero-knowledge proofs in the AITBC network for privacy-preserving operations.
## Overview
AITBC uses ZK proofs for:
- **Private receipt attestation** - Prove job completion without revealing details
- **Identity commitments** - Prove identity without exposing address
- **Stealth addresses** - Receive payments privately
- **Group membership** - Prove you're part of a group without revealing which member
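The identity-commitment idea above can be illustrated with a runnable sketch. Note the hedge: sha256 stands in for the Poseidon hash the real circuits use, so this shows the commit/reveal flow only, not circuit-compatible output, and the address string is made up:

```python
import hashlib
import secrets


def commit(address: str, secret: int) -> str:
    # Commitment = H(address || secret): binding and hiding as long as
    # `secret` is random and never reused (sha256 stands in for Poseidon)
    return hashlib.sha256(f"{address}:{secret}".encode()).hexdigest()


secret = secrets.randbits(128)                # keep this private
c = commit("ait1exampleaddress", secret)      # share this publicly
# Reveal: publish (address, secret); anyone can recompute and compare
assert commit("ait1exampleaddress", secret) == c
```

In the ZK setting you never reveal the preimage at all; instead a proof shows you know `(address, secret)` hashing to the public commitment, which is exactly what the circuits below do with Poseidon.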
## Prerequisites
- Circom compiler v2.2.3+
- snarkjs library
- Node.js 18+
## Architecture
```
┌─────────────┐ ┌─────────────┐ ┌─────────────┐
│ Circuit │────▶│ Prover │────▶│ Verifier │
│ (Circom) │ │ (snarkjs) │ │ (On-chain) │
└─────────────┘ └─────────────┘ └─────────────┘
```
## Step 1: Understanding Circuits
AITBC includes pre-built circuits in `apps/zk-circuits/`:
### Receipt Simple Circuit
Proves a receipt is valid without revealing the full receipt:
```circom
// circuits/receipt_simple.circom
pragma circom 2.0.0;

include "circomlib/poseidon.circom";

template ReceiptSimple() {
    // Private inputs
    signal input receipt_id;
    signal input job_id;
    signal input provider;
    signal input client;
    signal input units;
    signal input price;
    signal input salt;

    // Public inputs
    signal input receipt_hash;
    signal input min_units;

    // Compute hash of receipt
    component hasher = Poseidon(7);
    hasher.inputs[0] <== receipt_id;
    hasher.inputs[1] <== job_id;
    hasher.inputs[2] <== provider;
    hasher.inputs[3] <== client;
    hasher.inputs[4] <== units;
    hasher.inputs[5] <== price;
    hasher.inputs[6] <== salt;

    // Verify hash matches
    receipt_hash === hasher.out;

    // Verify units >= min_units (range check)
    signal diff;
    diff <== units - min_units;
    // Additional range check logic...
}

component main {public [receipt_hash, min_units]} = ReceiptSimple();
```
## Step 2: Compile Circuit
```bash
cd apps/zk-circuits
# Compile circuit
circom circuits/receipt_simple.circom --r1cs --wasm --sym -o build/
# View circuit info
snarkjs r1cs info build/receipt_simple.r1cs
# Constraints: 300
```
## Step 3: Trusted Setup
```bash
# Download Powers of Tau (one-time)
wget https://hermez.s3-eu-west-1.amazonaws.com/powersOfTau28_hez_final_12.ptau
# Generate proving key
snarkjs groth16 setup build/receipt_simple.r1cs powersOfTau28_hez_final_12.ptau build/receipt_simple_0000.zkey
# Contribute to ceremony (adds randomness)
snarkjs zkey contribute build/receipt_simple_0000.zkey build/receipt_simple_final.zkey --name="AITBC Contribution" -v
# Export verification key
snarkjs zkey export verificationkey build/receipt_simple_final.zkey build/verification_key.json
```
## Step 4: Generate Proof
### JavaScript
```javascript
const snarkjs = require('snarkjs');
const fs = require('fs');

async function generateProof(receipt) {
  // Prepare inputs
  const input = {
    receipt_id: BigInt(receipt.receipt_id),
    job_id: BigInt(receipt.job_id),
    provider: BigInt(receipt.provider),
    client: BigInt(receipt.client),
    units: BigInt(Math.floor(receipt.units * 1000)),
    price: BigInt(Math.floor(receipt.price * 1000)),
    salt: BigInt(receipt.salt),
    receipt_hash: BigInt(receipt.hash),
    min_units: BigInt(1000) // Prove units >= 1.0
  };

  // Generate proof
  const { proof, publicSignals } = await snarkjs.groth16.fullProve(
    input,
    'build/receipt_simple_js/receipt_simple.wasm',
    'build/receipt_simple_final.zkey'
  );
  return { proof, publicSignals };
}

// Usage
const receipt = {
  receipt_id: '12345',
  job_id: '67890',
  provider: '0x1234...',
  client: '0x5678...',
  units: 2.5,
  price: 5.0,
  salt: '0xabcd...',
  hash: '0x9876...'
};
const { proof, publicSignals } = await generateProof(receipt);
console.log('Proof generated:', proof);
```
### Python
```python
import json
import subprocess


def generate_proof(receipt: dict) -> dict:
    # Write input file
    input_data = {
        "receipt_id": str(receipt["receipt_id"]),
        "job_id": str(receipt["job_id"]),
        "provider": str(int(receipt["provider"], 16)),
        "client": str(int(receipt["client"], 16)),
        "units": str(int(receipt["units"] * 1000)),
        "price": str(int(receipt["price"] * 1000)),
        "salt": str(int(receipt["salt"], 16)),
        "receipt_hash": str(int(receipt["hash"], 16)),
        "min_units": "1000",
    }
    with open("input.json", "w") as f:
        json.dump(input_data, f)

    # Generate witness
    subprocess.run([
        "node", "build/receipt_simple_js/generate_witness.js",
        "build/receipt_simple_js/receipt_simple.wasm",
        "input.json", "witness.wtns",
    ], check=True)

    # Generate proof
    subprocess.run([
        "snarkjs", "groth16", "prove",
        "build/receipt_simple_final.zkey",
        "witness.wtns", "proof.json", "public.json",
    ], check=True)

    with open("proof.json") as f:
        proof = json.load(f)
    with open("public.json") as f:
        public_signals = json.load(f)
    return {"proof": proof, "publicSignals": public_signals}
```
## Step 5: Verify Proof
### Off-Chain (JavaScript)
```javascript
const snarkjs = require('snarkjs');
const fs = require('fs');

async function verifyProof(proof, publicSignals) {
  const vKey = JSON.parse(fs.readFileSync('build/verification_key.json'));
  return snarkjs.groth16.verify(vKey, publicSignals, proof);
}

const isValid = await verifyProof(proof, publicSignals);
console.log('Proof valid:', isValid);
```
### On-Chain (Solidity)
The `ZKReceiptVerifier.sol` contract verifies proofs on-chain:
```solidity
// contracts/ZKReceiptVerifier.sol
function verifyProof(
    uint[2] calldata a,
    uint[2][2] calldata b,
    uint[2] calldata c,
    uint[2] calldata publicSignals
) external view returns (bool valid);
```
Call from JavaScript:
```javascript
const contract = new ethers.Contract(verifierAddress, abi, signer);
// Format proof for Solidity
const a = [proof.pi_a[0], proof.pi_a[1]];
const b = [[proof.pi_b[0][1], proof.pi_b[0][0]], [proof.pi_b[1][1], proof.pi_b[1][0]]];
const c = [proof.pi_c[0], proof.pi_c[1]];
const isValid = await contract.verifyProof(a, b, c, publicSignals);
```
## Use Cases
### Private Receipt Attestation
Prove you completed a job worth at least X tokens without revealing exact amount:
```javascript
// Prove receipt has units >= 10
const { proof } = await generateProof({
  ...receipt,
  min_units: 10000 // 10.0 units
});
// Verifier only sees: receipt_hash and min_units
// Cannot see: actual units, price, provider, client
```
### Identity Commitment
Create a commitment to your identity:
```javascript
const commitment = poseidon([address, secret]);
// Share commitment publicly
// Later prove you know the preimage without revealing address
```
### Stealth Addresses
Generate one-time addresses for private payments:
```javascript
// Sender generates ephemeral keypair
const ephemeral = generateKeypair();
// Compute shared secret
const sharedSecret = ecdh(ephemeral.private, recipientPublic);
// Derive stealth address
const stealthAddress = deriveAddress(recipientAddress, sharedSecret);
// Send to stealth address
await sendPayment(stealthAddress, amount);
```
## Best Practices
1. **Never reuse salts** - Each proof should use a unique salt
2. **Validate inputs** - Check ranges before proving
3. **Use trusted setup** - Don't skip the ceremony
4. **Test thoroughly** - Verify proofs before deploying
5. **Keep secrets secret** - Private inputs must stay private
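For best practice 1, a fresh salt can be drawn with the standard library. The BN254 scalar-field modulus below is an assumption about the curve the Groth16 setup targets; adjust the constant if the circuits use a different field:

```python
import secrets

# BN254 scalar field modulus (assumed curve; verify against your setup)
BN254_R = 21888242871839275222246405745257275088548364400416034343698204186575808495617


def fresh_salt() -> int:
    # Rejection-sample 254-bit values so the salt is uniform over [0, r)
    while True:
        candidate = secrets.randbits(254)
        if candidate < BN254_R:
            return candidate
```

Using `secrets` rather than `random` matters here: the salt is what hides the receipt fields behind the Poseidon hash, so it must be unpredictable as well as unique.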
## Troubleshooting
### "Constraint not satisfied"
- Check input values are within expected ranges
- Verify all required inputs are provided
- Ensure BigInt conversion is correct
### "Invalid proof"
- Verify using same verification key as proving key
- Check public signals match between prover and verifier
- Ensure proof format is correct for verifier
## Next Steps
- [ZK Applications Reference](../../reference/components/zk-applications.md)
- [ZK Receipt Attestation](../../reference/zk-receipt-attestation.md)
- [SDK Examples](sdk-examples.md)