Anthropic API Compatibility
Use the Anthropic-compatible endpoint to access any LLM model through the familiar Anthropic API format.
LLMGateway provides a native Anthropic-compatible endpoint at /v1/messages that lets you use any model in our catalog while keeping the familiar Anthropic API format. This is especially useful for applications designed for Claude that you want to extend to other models.
Overview
The Anthropic endpoint transforms requests from Anthropic's message format to the OpenAI-compatible format used by LLMGateway, then transforms the responses back to Anthropic's format. This means you can:
- Use any model available in LLMGateway with Anthropic's API format
- Maintain existing code that uses Anthropic's SDK or API format
- Access models from OpenAI, Google, Cohere, and other providers through the Anthropic interface
- Leverage LLMGateway's routing, caching, and cost optimization features
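The request translation described above can be pictured with a short sketch. This is illustrative only: field names mirror the two public API formats, but LLMGateway's actual internal mapping covers far more (tool use, images, streaming, stop sequences):

```python
# Illustrative sketch of the Anthropic -> OpenAI request translation the
# gateway performs. The real implementation handles many more fields.

def anthropic_to_openai(request: dict) -> dict:
    """Map an Anthropic /v1/messages body to an OpenAI chat-completions body."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI-style APIs expect it as the first message.
    if "system" in request:
        messages.append({"role": "system", "content": request["system"]})
    messages.extend(request["messages"])
    return {
        "model": request["model"],
        "messages": messages,
        # Anthropic's required max_tokens maps onto OpenAI's optional one.
        "max_tokens": request["max_tokens"],
    }

body = anthropic_to_openai({
    "model": "gpt-5",
    "system": "You are terse.",
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 100,
})
print(body["messages"][0]["role"])  # the system prompt becomes the first message
```

The response is translated the same way in reverse, so existing Anthropic-format clients keep working unchanged.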
Basic Usage
Configuration for Claude Code
This endpoint is perfect for configuring Claude Code to use any model available in LLMGateway:
export ANTHROPIC_BASE_URL=https://api.llmgateway.io
export ANTHROPIC_AUTH_TOKEN=llmgtwy_your_api_key_here
# optional: specify a model, otherwise it uses the default Claude model
export ANTHROPIC_MODEL=gpt-5 # or any model from our catalog
# now run claude!
claude
Choosing Models
You can use any model from the models page. Popular options for Claude Code include:
# Use OpenAI's latest model
export ANTHROPIC_MODEL=gpt-5
# Use a cost-effective alternative
export ANTHROPIC_MODEL=gpt-5-mini
# Use Google's Gemini
export ANTHROPIC_MODEL=google/gemini-2.5-pro
# Use Anthropic's actual Claude models
export ANTHROPIC_MODEL=anthropic/claude-3-5-sonnet-20241022
Environment Variables
When configuring Claude Code or other Anthropic-compatible applications, you can use these environment variables:
ANTHROPIC_MODEL
Specifies the main model to use for primary requests.
- Default: claude-sonnet-4-20250514
- Example: export ANTHROPIC_MODEL=gpt-5
ANTHROPIC_SMALL_FAST_MODEL
Specifies a smaller, faster model used for background functionality and internal operations.
- Default: claude-3-5-haiku-20241022
- Example: export ANTHROPIC_SMALL_FAST_MODEL=gpt-5-nano
# Example configuration
export ANTHROPIC_BASE_URL=https://api.llmgateway.io
export ANTHROPIC_AUTH_TOKEN=llmgtwy_your_api_key_here
export ANTHROPIC_MODEL=gpt-5
export ANTHROPIC_SMALL_FAST_MODEL=gpt-5-nano
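The fallback behavior described above can be sketched as follows. The helper name is hypothetical (this is not Claude Code's internal code), but the defaults are the documented ones:

```python
import os

# Hypothetical sketch of the documented fallback behavior: each variable is
# read from the environment, defaulting to the documented Claude model.

DEFAULTS = {
    "ANTHROPIC_MODEL": "claude-sonnet-4-20250514",
    "ANTHROPIC_SMALL_FAST_MODEL": "claude-3-5-haiku-20241022",
}

def resolve_model(var: str) -> str:
    """Return the configured model name, falling back to the documented default."""
    return os.environ.get(var, DEFAULTS[var])

os.environ["ANTHROPIC_MODEL"] = "gpt-5"             # user override
os.environ.pop("ANTHROPIC_SMALL_FAST_MODEL", None)  # unset -> default

print(resolve_model("ANTHROPIC_MODEL"))             # gpt-5
print(resolve_model("ANTHROPIC_SMALL_FAST_MODEL"))  # claude-3-5-haiku-20241022
```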
Advanced Features
Making a manual request
curl -X POST "https://api.llmgateway.io/v1/messages" \
-H "Authorization: Bearer $LLM_GATEWAY_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "gpt-5",
"messages": [
{"role": "user", "content": "Hello, how are you?"}
],
"max_tokens": 100
}'
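The same request can be issued from Python. This sketch builds the URL, headers, and body with the standard library only; the endpoint path and Bearer auth scheme come from the curl example above, and the API key is read from the same environment variable. Pass the result to any HTTP client (urllib, requests, httpx):

```python
import json
import os

# Build the same request as the curl example above; actually sending it is
# left to whatever HTTP client the application already uses.

def build_messages_request(model: str, user_text: str, max_tokens: int):
    """Return (url, headers, body) for a /v1/messages call."""
    url = "https://api.llmgateway.io/v1/messages"
    headers = {
        "Authorization": f"Bearer {os.environ.get('LLM_GATEWAY_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
        "max_tokens": max_tokens,
    })
    return url, headers, body

url, headers, body = build_messages_request("gpt-5", "Hello, how are you?", 100)
print(url)  # https://api.llmgateway.io/v1/messages
```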
Response Format
The endpoint returns responses in Anthropic's message format:
{
"id": "msg_abc123",
"type": "message",
"role": "assistant",
"model": "gpt-5",
"content": [
{
"type": "text",
"text": "Hello! I'm doing well, thank you for asking. How can I help you today?"
}
],
"stop_reason": "end_turn",
"stop_sequence": null,
"usage": {
"input_tokens": 13,
"output_tokens": 20
}
}
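A small helper for consuming this shape, e.g. joining the text blocks and reading token usage, can look like this (a sketch written against the example response above):

```python
import json

# Parse the Anthropic-format response shown above: "content" is a list of
# blocks, and blocks of type "text" carry the assistant's reply.

response = json.loads("""{
  "id": "msg_abc123",
  "type": "message",
  "role": "assistant",
  "model": "gpt-5",
  "content": [
    {"type": "text",
     "text": "Hello! I'm doing well, thank you for asking. How can I help you today?"}
  ],
  "stop_reason": "end_turn",
  "stop_sequence": null,
  "usage": {"input_tokens": 13, "output_tokens": 20}
}""")

def message_text(msg: dict) -> str:
    """Join all text blocks in an Anthropic-format message."""
    return "".join(b["text"] for b in msg["content"] if b["type"] == "text")

total_tokens = response["usage"]["input_tokens"] + response["usage"]["output_tokens"]
print(message_text(response))
print(total_tokens)  # 33
```

Because the gateway always returns this format regardless of the upstream provider, the same parsing code works whether the model behind it is from OpenAI, Google, or Anthropic.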