Stockyard speaks the OpenAI API format. If your app, SDK, or tool talks to OpenAI, point it at Stockyard instead. Same request format, same response format, 16 providers behind it.
Cursor: Settings → Models → Override OpenAI Base URL → http://localhost:4200/v1
Windsurf / Copilot: Set the OpenAI-compatible endpoint in settings. Stockyard handles the rest.
Aider / Cline: Set OPENAI_API_BASE=http://localhost:4200/v1 and they route through Stockyard automatically.
Any OpenAI SDK: Python, Node, Go, Rust — any SDK that lets you set a base URL works out of the box.
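To make the "just change the base URL" idea concrete, here is a minimal sketch in Python using only the standard library. The port comes from the examples above; the `Authorization` header value is an assumption (whatever key handling your Stockyard deployment expects):

```python
import json
import urllib.request

STOCKYARD_BASE = "http://localhost:4200/v1"  # default port from the examples above

def chat_request(model, messages, base=STOCKYARD_BASE):
    """Build an OpenAI-format /chat/completions request aimed at Stockyard."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{base}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            # Placeholder key: how Stockyard authenticates is an assumption here.
            "Authorization": "Bearer sk-placeholder",
        },
        method="POST",
    )

req = chat_request(
    "claude-sonnet-4-20250514",
    [{"role": "user", "content": "hello"}],
)
# urllib.request.urlopen(req) would send it; the response body is OpenAI-format JSON.
```

An SDK does the same thing under the hood: the only Stockyard-specific part is the base URL.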
Your app sends OpenAI-formatted requests. Stockyard translates to the right provider format underneath. Request claude-sonnet-4-20250514 as the model and Stockyard sends it to the Anthropic API. Request gemini-pro and it goes to Google. Your app code stays the same.
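The routing rule described above can be pictured as a prefix match on the model name. This is purely illustrative — Stockyard's real routing table is internal, and these prefixes and the fallback are assumptions:

```python
# Hypothetical model-name → provider mapping, for illustration only.
PROVIDER_PREFIXES = {
    "claude": "anthropic",
    "gemini": "google",
    "gpt": "openai",
}

def provider_for(model: str) -> str:
    """Pick a provider by matching the model name against known prefixes."""
    for prefix, provider in PROVIDER_PREFIXES.items():
        if model.startswith(prefix):
            return provider
    return "openai"  # assumed fallback

provider_for("claude-sonnet-4-20250514")  # → "anthropic"
provider_for("gemini-pro")                # → "google"
```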
Set "stream": true in your request and Stockyard streams server-sent events in OpenAI format, regardless of which provider is behind it. Failover works mid-stream — if a provider drops the connection, Stockyard retries on the next provider.
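Because the stream is OpenAI-format server-sent events regardless of provider, one small parser covers everything. A sketch of consuming those `data:` lines and joining the content deltas:

```python
import json

def iter_deltas(sse_lines):
    """Yield content deltas from OpenAI-format SSE lines ('data: {...}')."""
    for line in sse_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # OpenAI-format end-of-stream sentinel
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

sample = [
    'data: {"choices":[{"delta":{"role":"assistant"}}]}',
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    'data: [DONE]',
]
print("".join(iter_deltas(sample)))  # prints "Hello"
```

Mid-stream failover is invisible at this layer: if Stockyard retries on another provider, your parser just keeps reading deltas.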
Every request through Stockyard is automatically traced with cost, latency, and token count. You get a tamper-proof audit ledger, rate limiting, content filtering, PII redaction, caching, and failover — all without changing your application code beyond that one URL.
All 76 middleware modules run on every request with sensible defaults — no per-module setup required. Toggle specific modules on or off only when you need to.
Install Stockyard, change your base URL, and every LLM request gains tracing, controls, and multi-provider routing. No SDK swap, no code rewrite.
Install Stockyard