Docs
Build, then ship.
MixerBox Cloud speaks OpenAI. If your code already calls OpenAI, you're 30 seconds away from calling every other model too.
Quick start
1. Sign in at /dashboard with your MixerBox ID.
2. Create an API key from the dashboard. Copy it now; it's shown only once.
3. Top up at least $5 of credit (Stripe).
4. Point your OpenAI client at https://cloud.mixerbox.com/v1.
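Step 4 is the whole integration. As a minimal sketch using only the Python standard library, here is what a request to the gateway looks like on the wire (the model id and key are placeholders; with the official openai-python client you would pass the same URL as `base_url` instead of building requests by hand):

```python
import json
import urllib.request

BASE_URL = "https://cloud.mixerbox.com/v1"
API_KEY = "mb-..."  # placeholder: paste the key from your dashboard

# Build (but do not send) a chat-completion request in the OpenAI wire format.
payload = {
    "model": "gpt-4o-mini",  # assumed id; GET /v1/models lists what's actually served
    "messages": [{"role": "user", "content": "Hello!"}],
}
req = urllib.request.Request(
    url=f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url)  # https://cloud.mixerbox.com/v1/chat/completions
```

Sending it with `urllib.request.urlopen(req)` returns the familiar OpenAI-style JSON response.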
API reference
Endpoints exposed under /v1:
| Method | Path | Purpose |
|---|---|---|
| POST | /v1/chat/completions | Chat-style generation, streaming or non-streaming |
| POST | /v1/embeddings | Embedding vectors |
| POST | /v1/audio/transcriptions | Audio → text (whisper / aai) |
| POST | /v1/audio/speech | Text → speech (TTS) |
| GET | /v1/models | List models with pricing & capabilities |
| GET | /v1/usage | Current key spend, remaining budget |
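When `/v1/chat/completions` is called with `"stream": true`, the response follows the OpenAI server-sent-events convention: each event line is `data: <json chunk>` carrying a content delta, terminated by `data: [DONE]`. A minimal parser sketch (the sample chunks below are hand-written in that documented shape, not captured from the service):

```python
import json

def iter_stream_text(lines):
    """Yield content deltas from OpenAI-style SSE 'data:' lines."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data: "):
            continue  # skip blanks, comments, keep-alives
        data = line[len("data: "):]
        if data == "[DONE]":
            return  # end-of-stream sentinel
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Two hand-written chunks in the OpenAI streaming wire shape:
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
print("".join(iter_stream_text(sample)))  # Hello
```

The same loop works over a real HTTP response body read line by line.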
SDKs
No bespoke SDK needed. The official OpenAI client libraries work out of the box — just override the base URL.
- openai-python · base_url=
- openai-node · baseURL:
- LangChain · ChatOpenAI(base_url=...)
- LlamaIndex · api_base
- Vercel AI SDK · openai({ baseURL })
- Cursor / Cline · custom provider
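Every entry above is the same override: hand the SDK https://cloud.mixerbox.com/v1 wherever it expects an OpenAI base URL. SDKs differ on whether the /v1 suffix belongs in the value (openai-python's base_url expects it included; some wrappers append it themselves), so a small normalizer like this hypothetical helper avoids double-/v1 bugs in shared config:

```python
def with_v1(base: str) -> str:
    """Normalize a gateway URL so it ends with exactly one /v1 segment.

    Hypothetical convenience helper, not part of any SDK: trims trailing
    slashes, then appends /v1 only if it is not already there.
    """
    base = base.rstrip("/")
    return base if base.endswith("/v1") else base + "/v1"

print(with_v1("https://cloud.mixerbox.com"))      # https://cloud.mixerbox.com/v1
print(with_v1("https://cloud.mixerbox.com/v1/"))  # https://cloud.mixerbox.com/v1
```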
OpenAPI spec
The full machine-readable spec is at /openapi.yaml (coming soon). Suitable for generating client SDKs, importing into Postman/Insomnia, or feeding to an AI agent.