API Documentation

junto is fully compatible with the OpenAI API format. Use your API key to get started.

1. Get Your API Key

Sign in and go to API Keys to create a new key.

Your API key looks like: sk-junto-xxxxxxxxxxxx

Keep your key safe. It will only be shown once when created.
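
Since the key is shown only once, a common pattern is to store it in an environment variable and sanity-check the format before use. A minimal sketch, assuming the sk-junto- prefix shown above (the variable and helper names are illustrative, not part of the API):

```python
import os

def looks_like_junto_key(key):
    """Cheap format check: junto keys start with 'sk-junto-' (see above)."""
    return key is not None and key.startswith("sk-junto-") and len(key) > len("sk-junto-")

# Read the key from the environment instead of hard-coding it in source.
api_key = os.environ.get("JUNTO_API_KEY")
```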

2. Make Your First Request

junto uses the same format as the OpenAI API. Just change the base URL and use your junto API key.

curl https://juntorouter-api.moonshine-studio.net/api/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o-mini",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'

3. Use with OpenAI SDK

Python

from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://juntorouter-api.moonshine-studio.net/api/v1"
)

response = client.chat.completions.create(
    model="openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)

Node.js / TypeScript

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "YOUR_API_KEY",
  baseURL: "https://juntorouter-api.moonshine-studio.net/api/v1",
});

const response = await client.chat.completions.create({
  model: "anthropic/claude-sonnet-4-5",
  messages: [{ role: "user", content: "Hello!" }],
});
console.log(response.choices[0].message.content);

4. Available Models

Browse all models at Models. Use the model slug as the model parameter.

openai/gpt-4o
openai/gpt-4o-mini
anthropic/claude-sonnet-4-5
anthropic/claude-haiku-4-5
google/gemini-2.0-flash
google/gemini-2.0-pro
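
Each slug pairs a provider with a model name, separated by a slash. A tiny helper (illustrative only) if you need the two parts separately:

```python
def split_slug(slug):
    """Split a model slug like 'openai/gpt-4o-mini' into (provider, model)."""
    provider, _, model = slug.partition("/")
    return provider, model
```
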

5. Dynamic Variants

Append a suffix to any model to control routing behavior:

:nitro — Route to the fastest provider (highest throughput)
:floor — Route to the cheapest provider (lowest price)
:online — Augment the prompt with web search results
:thinking — Enable extended reasoning (Anthropic models)

// Use the fastest provider for GPT-4o
{ "model": "openai/gpt-4o:nitro" }

// Use the cheapest provider
{ "model": "openai/gpt-4o:floor" }
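
Because a variant is just a string suffix, it is easy to typo silently. A small helper that validates the suffix before appending it, assuming the four variants listed above (the helper name is illustrative):

```python
# Routing suffixes from the list above.
VARIANTS = {"nitro", "floor", "online", "thinking"}

def with_variant(model, variant):
    """Append a routing variant, e.g. ('openai/gpt-4o', 'nitro') -> 'openai/gpt-4o:nitro'."""
    if variant not in VARIANTS:
        raise ValueError(f"unknown variant: {variant}")
    return f"{model}:{variant}"
```
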

6. Streaming

Add stream: true to get Server-Sent Events (SSE):

curl https://juntorouter-api.moonshine-studio.net/api/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-sonnet-4-5",
    "messages": [{"role": "user", "content": "Tell me a story"}],
    "stream": true
  }'
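
Each SSE event arrives as a line of the form data: {json chunk}, and the stream ends with data: [DONE]. A minimal sketch of extracting the text deltas, assuming the OpenAI-style chunk shape (choices[0].delta.content):

```python
import json

def delta_from_sse_line(line):
    """Extract the text delta from one SSE line, or None if there isn't one."""
    if not line.startswith("data: "):
        return None  # blank keep-alive lines, comments, etc.
    payload = line[len("data: "):].strip()
    if payload == "[DONE]":
        return None  # end-of-stream sentinel
    chunk = json.loads(payload)
    return chunk["choices"][0]["delta"].get("content")
```

With the OpenAI SDK you rarely need this by hand: pass stream=True to client.chat.completions.create(...) and iterate the returned chunks directly.
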

7. Provider Preferences

Control routing with the provider object:

{
  "model": "openai/gpt-4o",
  "messages": [...],
  "provider": {
    "order": ["anthropic", "openai"],
    "allow_fallbacks": true,
    "data_collection": "deny",
    "require_parameters": true
  }
}

order — Provider priority order

allow_fallbacks — Allow fallback to other providers

data_collection — "deny" = only no-data-collection providers

zdr — true = only Zero Data Retention providers

require_parameters — Only route to providers supporting all request params
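
A small helper for assembling the provider object above, so only the fields you actually set are sent (the helper name is illustrative):

```python
def provider_prefs(order=None, allow_fallbacks=None,
                   data_collection=None, require_parameters=None):
    """Build the 'provider' routing object described above, skipping unset fields."""
    prefs = {}
    if order is not None:
        prefs["order"] = order
    if allow_fallbacks is not None:
        prefs["allow_fallbacks"] = allow_fallbacks
    if data_collection is not None:
        prefs["data_collection"] = data_collection
    if require_parameters is not None:
        prefs["require_parameters"] = require_parameters
    return prefs
```

With the OpenAI Python SDK, fields outside the standard schema can be passed through the extra_body escape hatch, e.g. client.chat.completions.create(..., extra_body={"provider": provider_prefs(order=["openai"])}) — assuming junto accepts the object exactly as shown above.
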