Introduction
LLM Gateway is an open-source API gateway for Large Language Models: route requests across multiple providers, manage API keys, track usage, and optimize costs.
It sits between your applications and LLM providers such as OpenAI, Anthropic, and Google AI Studio, exposing a unified, OpenAI-compatible API with built-in cost tracking, caching, and intelligent routing.
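Because the API is OpenAI-compatible, an existing OpenAI-style request works unchanged once it points at the gateway. A minimal sketch of building such a request with only the Python standard library; the base URL and API key below are placeholders, not a real deployment:

```python
import json
from urllib import request

# Hypothetical gateway deployment; the only change from calling a provider
# directly is the base URL and the gateway-issued API key.
BASE_URL = "https://gateway.example.local/v1"  # placeholder
API_KEY = "llmgw-example-key"                  # placeholder

# A standard OpenAI-style chat completion payload; the gateway forwards it
# to whichever provider serves the requested model.
payload = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}],
}

# Build (but do not send) the POST request to the chat completions endpoint.
req = request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` (or any OpenAI client SDK configured with the gateway's base URL) returns a standard chat completion response.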
Features
Routing
Intelligently route requests to the best available models and providers.
Caching
Reduce costs and latency by caching identical requests.
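Caching "identical requests" implies a deterministic cache key derived from the request body. One plausible way to do that, shown purely for illustration (this is not necessarily how LLM Gateway computes its keys), is to hash a canonical serialization of the payload:

```python
import hashlib
import json

def cache_key(payload: dict) -> str:
    # Serialize with sorted keys so logically identical requests
    # produce byte-identical JSON, and therefore the same hash.
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Same request with fields in a different order -> same cache key.
a = cache_key({"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hi"}]})
b = cache_key({"messages": [{"role": "user", "content": "Hi"}], "model": "gpt-4o-mini"})
print(a == b)  # True
```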
Image Generation
Generate images using AI models through the OpenAI-compatible API.
Web Search
Enable real-time web search capabilities for up-to-date information.
Reasoning
Use reasoning-capable models that show their step-by-step thought process.
Vision
Send images to vision-enabled models using URLs or inline base64 data.
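In the OpenAI-compatible message format, an inline image is sent as a base64 data URL inside a content part. A minimal sketch using stand-in bytes in place of a real image file:

```python
import base64

# Stand-in bytes; in practice, read the contents of a real image file.
png_bytes = b"\x89PNG\r\n\x1a\n" + b"fake-image-data"
b64 = base64.b64encode(png_bytes).decode("ascii")

# A user message combining text with an inline base64 image. For remote
# images, the "url" field can instead hold a plain https:// URL.
message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "What is in this image?"},
        {
            "type": "image_url",
            "image_url": {"url": f"data:image/png;base64,{b64}"},
        },
    ],
}
print(message["content"][1]["image_url"]["url"][:30])
```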
AI Tooling
LLM Gateway is built to work seamlessly with AI agents and development tools.
llms.txt
Machine-readable index of all documentation pages for LLM consumption.
llms-full.txt
Complete documentation content in a single file for full-context LLM ingestion.
MCP Server
Use LLM Gateway as an MCP server for Claude Code, Cursor, and other MCP-compatible clients.
Agent Skills
Packaged instructions and guidelines for AI coding agents to generate higher-quality code.
Templates & Agents
Pre-built templates and agent configurations to get started quickly.
Next Steps
- Quickstart — Get up and running in minutes
- Overview — Learn more about what LLM Gateway offers
- Self-Host — Deploy on your own infrastructure