
Available Models

Access all major LLM providers through a single API. Use any model from Anthropic, OpenAI, DeepSeek, Google, and more.

Model Format

Models are specified in the format `provider/model-name`:

{
  "model": "anthropic/claude-sonnet-4-5-20250929"
}
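If you need to pull the provider prefix out of a full identifier, a single split on the first `/` is enough. A minimal sketch (the helper name is illustrative, not part of the API):

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split a "provider/model-name" identifier into its two parts.

    Only the first "/" separates provider from model, so model names
    that themselves contain "/" remain intact.
    """
    provider, sep, name = model_id.partition("/")
    if not sep or not provider or not name:
        raise ValueError(f"expected 'provider/model-name', got {model_id!r}")
    return provider, name

# Example: extract the provider from a full model identifier.
provider, name = split_model_id("anthropic/claude-sonnet-4-5-20250929")
```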

Anthropic Models

| Model | Context | Input Cost | Output Cost | Best For |
|---|---|---|---|---|
| claude-opus-4-20250514 | 200k | $15.00/1M | $75.00/1M | Complex reasoning |
| claude-sonnet-4-5-20250929 | 200k | $3.00/1M | $15.00/1M | Balance of quality/speed |
| claude-haiku-4-5-20251001 | 200k | $0.25/1M | $1.25/1M | Fast, cost-effective |
response = client.chat.completions.create(
    model="anthropic/claude-sonnet-4-5-20250929",
    messages=[...]
)
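The per-million-token prices in the tables on this page can be turned into a per-request cost estimate. A small sketch (the function is illustrative, not part of the API):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_per_m: float, output_per_m: float) -> float:
    """Estimate request cost in dollars from per-1M-token prices."""
    return ((input_tokens / 1_000_000) * input_per_m
            + (output_tokens / 1_000_000) * output_per_m)

# claude-sonnet-4-5-20250929: $3.00/1M input, $15.00/1M output
cost = estimate_cost(1_000, 500, 3.00, 15.00)  # roughly $0.0105
```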

OpenAI Models

| Model | Context | Input Cost | Output Cost | Best For |
|---|---|---|---|---|
| gpt-4o | 128k | $2.50/1M | $10.00/1M | Multimodal, general purpose |
| gpt-4o-mini | 128k | $0.15/1M | $0.60/1M | Fast, cost-effective |
| gpt-4-turbo | 128k | $10.00/1M | $30.00/1M | Complex tasks |
response = client.chat.completions.create(
    model="openai/gpt-4o",
    messages=[...]
)

DeepSeek Models

| Model | Context | Input Cost | Output Cost | Best For |
|---|---|---|---|---|
| deepseek-chat | 128k | $0.14/1M | $0.28/1M | General chat |
| deepseek-reasoner | 64k | $0.55/1M | $2.19/1M | Complex reasoning |
response = client.chat.completions.create(
    model="deepseek/deepseek-chat",
    messages=[...]
)

Google Gemini Models

| Model | Context | Input Cost | Output Cost | Best For |
|---|---|---|---|---|
| gemini-2.5-pro | 1M | $1.25/1M | $10.00/1M | Complex tasks |
| gemini-2.5-flash | 1M | $0.075/1M | $0.30/1M | Fast, general purpose |
| gemini-2.0-flash | 1M | $0.075/1M | $0.30/1M | Fast, cost-effective |
response = client.chat.completions.create(
    model="google-gemini/gemini-2.5-flash",
    messages=[...]
)

Choosing the Right Model

For Quality​

  • claude-opus-4-20250514 - Best for complex reasoning
  • gpt-4o - Best for multimodal tasks
  • gemini-2.5-pro - Best for very large contexts

For Value​

  • claude-haiku-4-5-20251001 - Fast and cheap
  • gpt-4o-mini - Good balance
  • deepseek-chat - Lowest cost

For Speed​

  • claude-haiku-4-5-20251001 - Fastest Anthropic model
  • gemini-2.0-flash - Fastest overall
  • deepseek-chat - Quick responses
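The guidance above can be captured in a small lookup so application code states its priority rather than hard-coding a model id. A sketch; the mapping, dictionary, and helper name are illustrative, not part of the API:

```python
# Map a priority to a recommended model id, following the guidance above.
RECOMMENDED = {
    "quality": "anthropic/claude-opus-4-20250514",   # complex reasoning
    "value": "deepseek/deepseek-chat",               # lowest cost
    "speed": "google-gemini/gemini-2.0-flash",       # fastest overall
}

def pick_model(priority: str) -> str:
    """Return a recommended model id for a given priority."""
    try:
        return RECOMMENDED[priority]
    except KeyError:
        raise ValueError(
            f"unknown priority {priority!r}; expected one of {sorted(RECOMMENDED)}"
        )
```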

Model Availability

Models are subject to:

  • Provider uptime
  • Your Virtual Key configuration
  • Regional availability
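Because availability can change for any of these reasons, one option is to keep an ordered fallback list and pick the first model the gateway currently reports. A minimal sketch, assuming the set of available ids comes from the List Models endpoint described below (the helper name is illustrative):

```python
def first_available(preferred: list[str], available: set[str]) -> str:
    """Return the first preferred model id that is currently available."""
    for model_id in preferred:
        if model_id in available:
            return model_id
    raise RuntimeError("none of the preferred models are available")

# `available` would normally be built from the /v1/models response.
choice = first_available(
    ["anthropic/claude-sonnet-4-5-20250929", "openai/gpt-4o"],
    {"openai/gpt-4o", "openai/gpt-4o-mini"},
)
```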

List Models Endpoint

Get the list of available models:

curl http://localhost:8084/v1/models \
  -H "x-bf-vk: sk-bf-YOUR_VIRTUAL_KEY"

Response:

{
  "object": "list",
  "data": [
    {
      "id": "anthropic/claude-opus-4-20250514",
      "object": "model",
      "owned_by": "anthropic"
    },
    {
      "id": "anthropic/claude-sonnet-4-5-20250929",
      "object": "model",
      "owned_by": "anthropic"
    }
  ]
}
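The response can be grouped by provider for display or filtering. A sketch that assumes only the payload shape shown in the example above (the helper name is illustrative):

```python
from collections import defaultdict

def models_by_owner(payload: dict) -> dict[str, list[str]]:
    """Group model ids from a /v1/models response by their owner."""
    grouped: dict[str, list[str]] = defaultdict(list)
    for entry in payload["data"]:
        grouped[entry["owned_by"]].append(entry["id"])
    return dict(grouped)

# Same shape as the example response above.
payload = {
    "object": "list",
    "data": [
        {"id": "anthropic/claude-opus-4-20250514", "object": "model",
         "owned_by": "anthropic"},
        {"id": "anthropic/claude-sonnet-4-5-20250929", "object": "model",
         "owned_by": "anthropic"},
    ],
}
grouped = models_by_owner(payload)
```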

Use any model from any provider through a single API.