# Integrations

Stockyard works with any framework or tool that speaks the OpenAI API.
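Every integration below reduces to the same wire protocol: an HTTP POST to the `/v1/chat/completions` endpoint in the OpenAI request format. As a minimal sketch (the host and key are placeholders for your deployment), here is that call made with nothing but `requests`:

```python
# Minimal sketch: call Stockyard's OpenAI-compatible endpoint directly.
# "https://your-host" and "sk-sy-YOUR_KEY" are placeholders.
import requests

resp = requests.post(
    "https://your-host/v1/chat/completions",
    headers={"Authorization": "Bearer sk-sy-YOUR_KEY"},
    json={
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "Hello over raw HTTP"}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```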
## LangChain (Python)

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://your-host/v1",
    api_key="sk-sy-YOUR_KEY",
    model="gpt-4o",
)

response = llm.invoke("Explain LLM proxies in one sentence.")
print(response.content)
```
## LangChain (TypeScript)

```typescript
import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  configuration: {
    baseURL: "https://your-host/v1",
    apiKey: "sk-sy-YOUR_KEY",
  },
  model: "gpt-4o",
});

const response = await llm.invoke("Hello from LangChain");
console.log(response.content);
```
## Vercel AI SDK

```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const stockyard = createOpenAI({
  baseURL: "https://your-host/v1",
  apiKey: "sk-sy-YOUR_KEY",
});

const { text } = await generateText({
  model: stockyard("gpt-4o"),
  prompt: "Hello from Vercel AI SDK",
});
console.log(text);
```
## LiteLLM

```python
import litellm

response = litellm.completion(
    model="openai/gpt-4o",
    messages=[{"role": "user", "content": "Hello from LiteLLM"}],
    api_key="sk-sy-YOUR_KEY",
    api_base="https://your-host/v1",
)
print(response.choices[0].message.content)
```
## Instructor (Structured Output)

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel

client = instructor.from_openai(OpenAI(
    base_url="https://your-host/v1",
    api_key="sk-sy-YOUR_KEY",
))

class User(BaseModel):
    name: str
    age: int

user = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "John is 30 years old"}],
    response_model=User,
)
print(user)  # User(name='John', age=30)
```
## Continue.dev (IDE)

Add the following to your `~/.continue/config.json`:

```json
{
  "models": [
    {
      "title": "Stockyard GPT-4o",
      "provider": "openai",
      "model": "gpt-4o",
      "apiBase": "https://your-host/v1",
      "apiKey": "sk-sy-YOUR_KEY"
    }
  ]
}
```
## Cursor / Windsurf / Cline

Any AI code editor that supports custom OpenAI endpoints works with Stockyard. Set the base URL to `https://your-host/v1` and use your Stockyard API key. See Editor Setup for per-editor instructions.
## Generic Pattern

For any OpenAI-compatible library or tool:

1. Find the "base URL" or "API base" setting.
2. Set it to `https://your-host/v1`.
3. Use your Stockyard API key (`sk-sy-...`) as the API key.
4. Use any model name your configured providers support.
Stockyard handles provider routing, failover, caching, and all middleware transparently. The framework never knows it's not talking directly to OpenAI.
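As a concrete instance of this pattern, here is a minimal sketch using the official `openai` Python SDK; the host, key, and model name are placeholders for your own deployment:

```python
# Minimal sketch of the generic pattern with the official openai SDK.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-host/v1",  # step 2: the "base URL" setting
    api_key="sk-sy-YOUR_KEY",         # step 3: your Stockyard key
)

# Step 4: any model name your configured providers support.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello through Stockyard"}],
)
print(response.choices[0].message.content)
```

Swapping in the base URL and API key is the entire integration; the rest of the calling code stays exactly as it would be against OpenAI directly.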