ResonanceNet

A blockchain that trains AI instead of hashing.
Every block makes the model smarter.

The Problem

Bitcoin mining burns $20B+ per year in electricity and produces nothing but heat.
AI training costs $100M+ per model and is controlled by a handful of corporations.
What if mining produced something useful?

Bitcoin vs ResonanceNet

                 Bitcoin              ResonanceNet
Proof            SHA-256 < target     val_loss < prev_val_loss
Output           Heat                 Trained AI model
Useful work      No                   Yes
Block time       10 min               10 min
Supply cap       21M BTC              21M RNET
Inference        n/a                  150K tokens/sec

Why MinGRU?

Based on "Were RNNs All We Needed?" by Feng et al. (2024).

Feature                  MinGRU              Transformer
Parameter efficiency     ~388x               1x
Inference memory         O(1) fixed state    O(seq_len) KV cache
Context length           Infinite            Limited (8K-128K)
Tokens/sec (RTX 5080)    ~150,000            ~3,000-5,000
Runs on phone (INT4)     Yes                 Barely

Model Growth

The model starts small and grows with the network. Growth rate is bounded by real GPU compute.

Timeline    Model Size    Transformer Equivalent
Month 1     ~50M          ~19B
Year 1      ~160M         ~62B
Year 3      ~700M         ~272B
Year 5+     ~4.9B         ~1.9T

Implementation

65,000 lines of C++20 • No PyTorch dependency
4 GPU backends: CUDA, Vulkan, Metal, CPU
Full forward + backward pass
BF16/INT8/INT4 quantization
Lightning Network for instant payments
Ed25519 signatures • Keccak-256d hashing
Genesis: "OpenAI burns $14B in 2026, adds ads to ChatGPT as last resort"
Block 0 • 2025-03-13 00:00:00 UTC