API Documentation
junto is fully compatible with the OpenAI API format. Use your API key to get started.
Sign in and go to API Keys to create a new key.
Your API key looks like: sk-junto-xxxxxxxxxxxx
Keep your key safe. It will only be shown once when created.
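Since the key is shown only once, a common pattern is to read it from an environment variable rather than hard-coding it. This is an illustrative sketch, not something junto requires; the `JUNTO_API_KEY` variable name is just a convention chosen here:

```python
import os

def load_junto_key(env_var: str = "JUNTO_API_KEY") -> str:
    """Read the API key from the environment instead of hard-coding it.

    The env var name is a local convention, not something junto requires.
    """
    key = os.environ.get(env_var, "").strip()
    # Keys issued above look like sk-junto-xxxxxxxxxxxx.
    if not key.startswith("sk-junto-"):
        raise RuntimeError(f"Set {env_var} to a valid junto API key")
    return key
```

Export the variable once (`export JUNTO_API_KEY=sk-junto-...`) and use `load_junto_key()` wherever `YOUR_API_KEY` appears below.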
junto uses the same format as the OpenAI API. Just change the base URL and use your junto API key.
```bash
curl https://juntorouter-api.moonshine-studio.net/api/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o-mini",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'
```

Python
```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://juntorouter-api.moonshine-studio.net/api/v1",
)

response = client.chat.completions.create(
    model="openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)

print(response.choices[0].message.content)
```

Node.js / TypeScript
```typescript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "YOUR_API_KEY",
  baseURL: "https://juntorouter-api.moonshine-studio.net/api/v1",
});

const response = await client.chat.completions.create({
  model: "anthropic/claude-sonnet-4-5",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(response.choices[0].message.content);
```

Browse all models at Models. Use the model slug as the `model` parameter.
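If junto also mirrors OpenAI's `GET /models` endpoint (an assumption; the text above only mentions the Models page), the catalog could be fetched from the same base URL. A stdlib sketch that builds the request without sending it:

```python
import urllib.request

# Same base URL as the chat examples above.
BASE_URL = "https://juntorouter-api.moonshine-studio.net/api/v1"

def models_request(api_key: str) -> urllib.request.Request:
    """Build (but do not send) a GET /models request.

    Assumes junto exposes the OpenAI-compatible /models endpoint.
    """
    return urllib.request.Request(
        f"{BASE_URL}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
```

Sending it with `urllib.request.urlopen(models_request(key))` should return the model list as JSON, provided the endpoint exists.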
- openai/gpt-4o
- openai/gpt-4o-mini
- anthropic/claude-sonnet-4-5
- anthropic/claude-haiku-4-5
- google/gemini-2.0-flash
- google/gemini-2.0-pro

Append a suffix to any model to control routing behavior:
- `:nitro` — Route to the fastest provider (highest throughput)
- `:floor` — Route to the cheapest provider (lowest price)
- `:online` — Augment the prompt with web search results
- `:thinking` — Enable extended reasoning (Anthropic models)

```jsonc
// Use the fastest provider for GPT-4o
{ "model": "openai/gpt-4o:nitro" }

// Use the cheapest provider
{ "model": "openai/gpt-4o:floor" }
```

Add `stream: true` to get Server-Sent Events (SSE):
```bash
curl https://juntorouter-api.moonshine-studio.net/api/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-sonnet-4-5",
    "messages": [{"role": "user", "content": "Tell me a story"}],
    "stream": true
  }'
```

Control routing with the `provider` object:
```json
{
  "model": "openai/gpt-4o",
  "messages": [...],
  "provider": {
    "order": ["anthropic", "openai"],
    "allow_fallbacks": true,
    "data_collection": "deny",
    "require_parameters": true
  }
}
```

- `order` — Provider priority order
- `allow_fallbacks` — Allow fallback to other providers
- `data_collection` — `"deny"` = only no-data-collection providers
- `zdr` — `true` = only Zero Data Retention providers
- `require_parameters` — Only route to providers supporting all request params
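Putting the pieces together, here is a stdlib sketch that builds (but does not send) a request carrying the `provider` object; the payload fields are exactly the ones documented in this section:

```python
import json
import urllib.request

BASE_URL = "https://juntorouter-api.moonshine-studio.net/api/v1"

def routed_chat_request(api_key: str, prompt: str) -> urllib.request.Request:
    """Build a chat request that pins provider routing preferences."""
    body = {
        "model": "openai/gpt-4o",
        "messages": [{"role": "user", "content": prompt}],
        "provider": {
            "order": ["anthropic", "openai"],
            "allow_fallbacks": True,
            "data_collection": "deny",
            "require_parameters": True,
        },
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

With the openai Python SDK shown earlier, the same `provider` object can be passed through the `extra_body` keyword of `chat.completions.create`.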

