# Self Host LLMGateway
Simple guide to self-hosting LLMGateway using Docker Compose.
LLMGateway is a self-hostable platform that provides a unified API gateway for multiple LLM providers. This guide will help you get started quickly using Docker Compose.
## What You'll Get
- Gateway Service: Routes LLM API requests to different providers
- Web Interface: Manage projects and API keys
- Database: Stores users, projects, and request logs
- Redis: Handles caching
## Prerequisites
- A recent version of Docker (recent releases include Docker Compose)
- API keys for the LLM providers you want to use (OpenAI, Anthropic, etc.)
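To confirm Docker and the Compose plugin are installed, you can run:

```shell
docker --version
docker compose version
```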
## Quick Start

1. Clone the repository.
2. Set up your environment file.
3. Configure your API keys (see the Configuration section below).
4. Start the services.
5. Initialize the database.
6. Access your LLMGateway:
   - Web Interface: http://localhost:3002
   - Docs: http://localhost:3005
   - API Endpoint: http://localhost:4002
   - Gateway: http://localhost:4001
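The steps above can be run as a single shell session. A sketch is below; the repository URL, service names, and the exact migration command are assumptions, so check the project README and `docker-compose.yml` for the real values:

```shell
# 1. Clone the repository (substitute the official repository URL)
git clone <repository-url>
cd llmgateway

# 2. Copy the example environment file
cp .env.example .env

# 3. Edit .env and add your provider API keys (see Configuration below)

# 4. Start all services in the background
docker compose up -d

# 5. Initialize the database
# (hypothetical command -- the actual migration entry point may differ)
docker compose exec gateway npm run migrate
```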
## Configuration
Edit the `.env` file to configure your LLMGateway instance:
### Required Settings

#### LLM Provider API Keys

Add API keys for the providers you want to use:
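For example, a minimal `.env` fragment might look like the following; the exact variable names are assumptions, so match them against `.env.example`:

```shell
# Add only the providers you plan to route through the gateway
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```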
That's it! The other settings in `.env.example` have sensible defaults.
## Managing Your Instance
### Basic Commands
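A few everyday Docker Compose commands for operating the stack (pass a service name from `docker-compose.yml` to target one service):

```shell
# List running services and their status
docker compose ps

# Tail logs for all services (Ctrl+C to stop following)
docker compose logs -f

# Stop and remove the containers
docker compose down

# Pull newer images and restart in the background
docker compose pull && docker compose up -d
```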
### Database Management
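Common database tasks can be run through the Postgres container. A sketch follows; the service name (`postgres`), user, and database name are assumptions, so check `docker-compose.yml` and your `.env` for the actual values:

```shell
# Open an interactive psql shell inside the database container
docker compose exec postgres psql -U postgres

# Back up the database to a local file
docker compose exec postgres pg_dump -U postgres llmgateway > backup.sql

# Restore from a backup (-T disables the TTY so stdin pipes through)
docker compose exec -T postgres psql -U postgres llmgateway < backup.sql
```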
## Next Steps
Once your LLMGateway is running:
- Open the web interface at http://localhost:3002
- Create your first organization and project
- Generate API keys for your applications
- Test the gateway by making API calls to http://localhost:4001
- Read the docs at http://localhost:3005
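As a quick smoke test, assuming the gateway exposes an OpenAI-compatible chat completions route (the path, header, and model name here are assumptions), a request might look like:

```shell
curl http://localhost:4001/v1/chat/completions \
  -H "Authorization: Bearer YOUR_LLMGATEWAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

Replace `YOUR_LLMGATEWAY_API_KEY` with a key generated in the web interface.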
For more information, see the API Documentation and User Guide.