An extensible and modular RPC request router and proxy service built in Rust.
Roxy is a JSON-RPC proxy that sits between clients and upstream RPC backends. It distributes requests across multiple backends using exponential moving average (EMA) based health tracking to route traffic toward healthier endpoints. Responses can be cached in a tiered system with an in-memory LRU cache and optional Redis backing. Rate limiting uses a sliding window algorithm to control request throughput. The server accepts both HTTP and WebSocket connections and exposes Prometheus metrics for observability.
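To make the EMA routing idea concrete, the sketch below keeps a smoothed latency score per backend and routes each request to the backend with the lowest score. The names here (`Backend`, `ema_ms`, `alpha`) are assumptions for illustration only, not Roxy's internal types:

```rust
// Illustrative EMA-based backend selection; names and weighting are
// assumptions for this sketch, not Roxy's actual implementation.
struct Backend {
    name: &'static str,
    ema_ms: f64, // smoothed observed latency in milliseconds
}

impl Backend {
    // Blend a new latency sample into the running average.
    fn observe(&mut self, latency_ms: f64, alpha: f64) {
        self.ema_ms = alpha * latency_ms + (1.0 - alpha) * self.ema_ms;
    }
}

// Pick the backend whose smoothed latency is currently lowest.
fn pick(backends: &[Backend]) -> Option<&Backend> {
    backends
        .iter()
        .min_by(|a, b| a.ema_ms.partial_cmp(&b.ema_ms).unwrap())
}

fn main() {
    let mut backends = vec![
        Backend { name: "primary", ema_ms: 40.0 },
        Backend { name: "fallback", ema_ms: 55.0 },
    ];
    // A slow response from "primary" nudges traffic toward "fallback".
    backends[0].observe(400.0, 0.2);
    println!("route to: {}", pick(&backends).unwrap().name);
}
```

Each new sample is blended in with weight `alpha`, so a backend that slows down quickly loses traffic but recovers as its responses speed back up.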
Install the proxy with cargo:

```sh
cargo install roxy-proxy
```

Run the proxy with a configuration file:
```sh
roxy-proxy --config roxy.toml
```

Validate configuration without starting the server:
```sh
roxy-proxy --config roxy.toml --check
```

Create a TOML configuration file:
```toml
[[backends]]
name = "primary"
url = "https://eth-mainnet.example.com"

[[backends]]
name = "fallback"
url = "https://eth-mainnet-fallback.example.com"

[[groups]]
name = "main"
backends = ["primary", "fallback"]
load_balancer = "ema"

[routing]
default_group = "main"

[cache]
enabled = true
memory_size = 10000

[server]
host = "0.0.0.0"
port = 8545
```

The configuration file supports the following sections:

| Section | Description |
|---|---|
| server | Bind address, port, connection limits |
| backends | Upstream RPC endpoints with timeout and retry |
| groups | Backend groups with load balancing strategy |
| cache | Memory size and TTL settings |
| rate_limit | Requests per second and burst limits |
| routing | Method routing rules and blocked methods |
| metrics | Prometheus metrics endpoint |
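The `rate_limit` section controls the sliding-window limiting described in the overview. As a conceptual sketch (the struct, fields, and defaults below are assumptions for illustration, not Roxy's code or configuration schema), a window of recent request timestamps is pruned and checked against a fixed budget:

```rust
use std::collections::VecDeque;
use std::time::{Duration, Instant};

// Conceptual sliding-window limiter; names and values are illustrative
// assumptions, not Roxy's actual rate_limit implementation.
struct SlidingWindow {
    window: Duration,             // length of the window, e.g. one second
    max_requests: usize,          // requests allowed within the window
    timestamps: VecDeque<Instant>,
}

impl SlidingWindow {
    fn new(window: Duration, max_requests: usize) -> Self {
        Self { window, max_requests, timestamps: VecDeque::new() }
    }

    // Returns true if the request is admitted, false if it should be rejected.
    fn allow(&mut self, now: Instant) -> bool {
        // Drop timestamps that have slid out of the window.
        while let Some(&front) = self.timestamps.front() {
            if now.duration_since(front) > self.window {
                self.timestamps.pop_front();
            } else {
                break;
            }
        }
        if self.timestamps.len() < self.max_requests {
            self.timestamps.push_back(now);
            true
        } else {
            false
        }
    }
}

fn main() {
    let mut limiter = SlidingWindow::new(Duration::from_secs(1), 2);
    let now = Instant::now();
    assert!(limiter.allow(now));
    assert!(limiter.allow(now));
    assert!(!limiter.allow(now)); // third request within the window is rejected
}
```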
Roxy's design draws inspiration from the commonwarexyz/monorepo.
This project is licensed under the MIT License - see the LICENSE file for details.