LLM Gateway

Create chat completion

Create a completion for the chat conversation.

POST /v1/chat/completions

Request Body

Content type: application/json

- model (string, required)
- messages (array<object>, required)
- temperature (number, optional)
- max_tokens (number, optional)
- top_p (number, optional)
- frequency_penalty (number, optional)
- presence_penalty (number, optional)
- stream (boolean, optional, default: false)
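The request body parameters above can be expressed as a TypeScript interface. This is a sketch: the interface names and the set of role values are assumptions, while the field names and types come directly from the parameter list.

```typescript
// Sketch of the request body schema; interface names are illustrative.
interface ChatMessage {
  role: "system" | "user" | "assistant"; // assumed role values
  content: string;
}

interface ChatCompletionRequest {
  model: string;              // required
  messages: ChatMessage[];    // required
  temperature?: number;
  max_tokens?: number;
  top_p?: number;
  frequency_penalty?: number;
  presence_penalty?: number;
  stream?: boolean;           // default: false
}

// A minimal valid payload: only the two required fields are set.
const body: ChatCompletionRequest = {
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
};
```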

Response Body

Returns a response object, or a streaming response when stream is true.

TypeScript Definitions

Use the response body type in TypeScript.

- message (string, required)
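A minimal sketch of the non-streaming response body type, with a runtime guard for the single required message field (the type and function names are illustrative):

```typescript
// Sketch of the response body type; only `message` is documented above.
interface ChatCompletionResponse {
  message: string; // required
}

// Runtime guard: narrows unknown JSON to the response type.
function isChatCompletionResponse(value: unknown): value is ChatCompletionResponse {
  return (
    typeof value === "object" &&
    value !== null &&
    typeof (value as { message?: unknown }).message === "string"
  );
}

const parsed: unknown = JSON.parse('{"message": "Hello there!"}');
const ok = isChatCompletionResponse(parsed);
```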
curl -X POST "https://example.com/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {
        "role": "user",
        "content": "Hello!"
      }
    ],
    "temperature": 0.7,
    "max_tokens": 256,
    "top_p": 1,
    "frequency_penalty": 0,
    "presence_penalty": 0,
    "stream": false
  }'
{
  "message": "string"
}
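For callers working in TypeScript, the same request can be assembled programmatically. This is a sketch: buildChatRequest and the optional Bearer authentication header are assumptions (the documentation above does not specify an auth scheme), while the URL path and payload mirror the curl example.

```typescript
// Shape of the fetch init object we build; kept explicit so the
// sketch does not depend on DOM lib types.
type ChatRequestInit = {
  method: string;
  headers: Record<string, string>;
  body: string;
};

// Build the POST /v1/chat/completions request for a given base URL.
// apiKey is a hypothetical credential; omit it if the gateway is open.
function buildChatRequest(
  baseUrl: string,
  apiKey?: string
): { url: string; init: ChatRequestInit } {
  const headers: Record<string, string> = {
    "Content-Type": "application/json",
  };
  if (apiKey) headers["Authorization"] = `Bearer ${apiKey}`;
  return {
    url: `${baseUrl}/v1/chat/completions`,
    init: {
      method: "POST",
      headers,
      body: JSON.stringify({
        model: "gpt-4o",
        messages: [{ role: "user", content: "Hello!" }],
        stream: false,
      }),
    },
  };
}
```

Sending the request is then a single call: `const { url, init } = buildChatRequest("https://example.com"); const data = await (await fetch(url, init)).json();`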