LLM Gateway

Anthropic Messages

Create a message using Anthropic's API format

POST /v1/messages
Authorization: Bearer <token>

Bearer token authentication using API keys

In: header
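
For reference, a minimal authenticated call in Python. This is a sketch, assuming the base URL from the curl example below and an API key supplied via an environment variable (the LLM_GATEWAY_API_KEY name is illustrative, not part of the spec).

import os
import requests

# Illustrative helper: sends a minimal Messages request through the gateway.
def create_message(prompt: str) -> dict:
    response = requests.post(
        "https://api.llmgateway.io/v1/messages",
        headers={
            "Content-Type": "application/json",
            # Bearer token goes in the Authorization header, as described above.
            "Authorization": f"Bearer {os.environ['LLM_GATEWAY_API_KEY']}",
        },
        json={
            "model": "claude-3-5-sonnet-20241022",
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 1024,
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()

print(create_message("Say hello in one sentence.")["content"][0]["text"])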

model: string

The model to use for completion

messages: array<object>

Array of message objects

max_tokens: number

Maximum number of tokens to generate

Range: 1 <= value
system?: string | array<object>

System prompt to provide context
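
Both accepted shapes, sketched as request payloads (assuming the gateway mirrors Anthropic's format, where a plain string and an array of text content blocks are interchangeable):

# System prompt as a plain string.
payload_string_system = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "system": "You are a concise technical assistant.",
    "messages": [{"role": "user", "content": "Summarize HTTP caching."}],
}

# The same instruction as an array of text content blocks.
payload_block_system = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "system": [{"type": "text", "text": "You are a concise technical assistant."}],
    "messages": [{"role": "user", "content": "Summarize HTTP caching."}],
}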

temperature?: number

Sampling temperature between 0 and 1

Range: 0 <= value <= 1
tools?: array<object>

Available tools for the model to use
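
A sketch of a request carrying one tool definition, assuming Anthropic's tool schema (name, description, and a JSON Schema input_schema); the get_weather tool is purely illustrative:

payload_with_tools = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "tools": [
        {
            # If the model decides to call this tool, the response contains a
            # tool_use content block with this name and a matching "input" object.
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "input_schema": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        }
    ],
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
}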

stream?: boolean

Whether to stream the response

Default: false
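
When stream is true the response arrives as server-sent events rather than a single JSON body. A consumption sketch, assuming Anthropic-style event payloads (content_block_delta events carrying text_delta fragments) and the same illustrative environment variable as above:

import json
import os
import requests

payload = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "stream": True,
    "messages": [{"role": "user", "content": "Write a haiku about gateways."}],
}

with requests.post(
    "https://api.llmgateway.io/v1/messages",
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {os.environ['LLM_GATEWAY_API_KEY']}",
    },
    json=payload,
    stream=True,
    timeout=60,
) as response:
    response.raise_for_status()
    for line in response.iter_lines(decode_unicode=True):
        # Server-sent events prefix each JSON payload with "data: ".
        if not line or not line.startswith("data: "):
            continue
        event = json.loads(line[len("data: "):])
        # Incremental text arrives in content_block_delta / text_delta events.
        if event.get("type") == "content_block_delta":
            delta = event.get("delta", {})
            if delta.get("type") == "text_delta":
                print(delta.get("text", ""), end="", flush=True)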

Example Request

curl -X POST "https://api.llmgateway.io/v1/messages" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <token>" \
  -d '{
    "model": "claude-3-5-sonnet-20241022",
    "messages": [
      {
        "role": "user",
        "content": "string"
      }
    ],
    "max_tokens": 1024
  }'

Response Body

{
  "id": "string",
  "type": "message",
  "role": "assistant",
  "model": "string",
  "content": [
    {
      "type": "text",
      "text": "string",
      "id": "string",
      "name": "string",
      "input": {
        "property1": null,
        "property2": null
      }
    }
  ],
  "stop_reason": "end_turn",
  "stop_sequence": "string",
  "usage": {
    "input_tokens": 0,
    "output_tokens": 0
  }
}
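
A small sketch of pulling the commonly used fields out of a parsed response shaped like the body above; summarize_response is an illustrative helper, not part of the API:

def summarize_response(resp: dict) -> None:
    # Concatenate text blocks; tool_use blocks carry "name"/"input" instead of "text".
    text = "".join(
        block.get("text", "")
        for block in resp["content"]
        if block.get("type") == "text"
    )
    print("assistant:", text)
    print("stop_reason:", resp["stop_reason"])
    usage = resp["usage"]
    print(f"tokens: {usage['input_tokens']} in / {usage['output_tokens']} out")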