Documentation Index
Fetch the complete documentation index at: https://docs.darkbloom.dev/llms.txt
Use this file to discover all available pages before exploring further.

Darkbloom exposes an OpenAI-compatible API at https://api.darkbloom.dev/v1. If you’ve used the OpenAI SDK before, the only change is the base URL and your API key; everything else works the same way. This guide walks you through getting a key, making your first request, and exploring what’s available.
Get an API key
Go to darkbloom.dev and sign up for an account. Once you’re logged in, open the console and create an API key. Your key will start with eigeninference-.

Send your first chat completion
Point any OpenAI-compatible SDK at https://api.darkbloom.dev/v1 and pass your API key. The request format is identical to the OpenAI Chat Completions API.
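With the base URL and key in hand, a first request needs nothing beyond the standard library. The model ID and key below are placeholders (any model ID from the table further down works), and the network call is left commented so the sketch runs offline:

```python
import json
import urllib.request

API_BASE = "https://api.darkbloom.dev/v1"
API_KEY = "eigeninference-your-key-here"  # placeholder; use your real key

# Request body in the OpenAI Chat Completions format.
payload = {
    "model": "mlx-community/Trinity-Mini-8bit",
    "messages": [
        {"role": "user", "content": "Say hello in one sentence."},
    ],
}

req = urllib.request.Request(
    f"{API_BASE}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Uncomment to actually send the request:
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)
#     print(reply["choices"][0]["message"]["content"])
```

The response follows the OpenAI shape, so the assistant’s text sits at choices[0].message.content.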
List available models
To see which models are currently online and accepting requests, call GET /v1/models. The response lists only models that have at least one provider currently online. If a model isn’t in the list, no provider is serving it right now.

| Model ID | Best for |
|---|---|
| qwen3.5-27b-claude-opus-8bit | Frontier reasoning, Claude Opus distilled |
| mlx-community/gemma-4-26b-a4b-it-8bit | Fast multimodal requests |
| mlx-community/Trinity-Mini-8bit | Fast agentic inference |
| mlx-community/Qwen3.5-122B-A10B-8bit | Highest quality reasoning |
| mlx-community/MiniMax-M2.5-8bit | State-of-the-art coding, ~100 tok/s |
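A minimal sketch of the models call, again with the standard library. The response shape (a "data" array of objects with an "id" field) is assumed to match the OpenAI list-models format, and the request itself is commented out so the sketch runs offline:

```python
import json
import urllib.request

API_BASE = "https://api.darkbloom.dev/v1"
API_KEY = "eigeninference-your-key-here"  # placeholder; use your real key

# GET /v1/models returns only models with at least one provider online.
req = urllib.request.Request(
    f"{API_BASE}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)

# Uncomment to actually query the endpoint:
# with urllib.request.urlopen(req) as resp:
#     for model in json.load(resp)["data"]:
#         print(model["id"])
```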
Stream a response
Add stream=True (Python) or stream: true (Node.js) to receive tokens as they’re generated instead of waiting for the full response. The streaming response uses server-sent events, the same format as the OpenAI API. Each line is prefixed with data: and the stream ends with data: [DONE].
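The wire format above can be parsed by hand. The sample lines below are hardcoded stand-ins for a real stream (the chunk payloads assume the standard OpenAI delta shape), so the sketch runs offline:

```python
import json

# Illustrative SSE lines, as they would arrive from /chat/completions
# with "stream": true set in the request body.
sample_stream = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo!"}}]}',
    "data: [DONE]",
]

def collect_text(lines):
    """Assemble the full reply from data:-prefixed SSE lines."""
    parts = []
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip SSE comments and keep-alives
        data = line[len("data: "):]
        if data == "[DONE]":  # end-of-stream sentinel
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"]
        parts.append(delta.get("content", ""))
    return "".join(parts)

print(collect_text(sample_stream))  # → Hello!
```

The OpenAI SDKs do this parsing for you; iterating over the response with stream=True yields the same delta chunks.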
Using the Anthropic Messages API
Darkbloom also supports the Anthropic Messages API format at /v1/messages. Use this if your code is already written for the Anthropic SDK or if you prefer the system parameter as a top-level field.
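A sketch of the same request in the Messages format, with system as a top-level field. The Bearer authorization scheme is an assumption carried over from the rest of this guide (Anthropic’s own API uses an x-api-key header instead), and the call is commented out so the sketch runs offline:

```python
import json
import urllib.request

API_KEY = "eigeninference-your-key-here"  # placeholder; use your real key

# Anthropic-style body: "system" is a top-level field, not a message role,
# and "max_tokens" is required by the Messages format.
payload = {
    "model": "mlx-community/Trinity-Mini-8bit",
    "max_tokens": 256,
    "system": "You are a concise assistant.",
    "messages": [{"role": "user", "content": "Say hello."}],
}

req = urllib.request.Request(
    "https://api.darkbloom.dev/v1/messages",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",  # assumed; matches the other endpoints
        "Content-Type": "application/json",
    },
    method="POST",
)

# Uncomment to actually send the request:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["content"][0]["text"])
```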
Darkbloom is an experimental research prototype. Do not use it in production applications.