# Self Host LLMGateway

Simple guide to self-hosting LLMGateway using Docker.

LLMGateway is a self-hostable platform that provides a unified API gateway for multiple LLM providers. This guide offers two simple options to get started.
## Prerequisites

- A recent version of Docker (with the Compose plugin if you use Docker Compose)
- API keys for the LLM providers you want to use (OpenAI, Anthropic, etc.)
## Option 1: Unified Docker Image (Simplest)

This option uses a single Docker container that includes all services (UI, API, Gateway, Database, Redis).

```bash
# Run the container
docker run -d \
  --name llmgateway \
  --restart unless-stopped \
  -p 3002:3002 \
  -p 3005:3005 \
  -p 4001:4001 \
  -p 4002:4002 \
  -v ~/llmgateway_data:/var/lib/postgresql/data \
  -e OPENAI_API_KEY=sk-your_openai_key_here \
  -e AUTH_SECRET=your-secret-key-here \
  ghcr.io/theopenco/llmgateway-unified:latest
```

Note: instead of the `latest` tag, it is recommended to use the most recent version tag from the releases page: https://github.com/theopenco/llmgateway/releases
### Using Docker Compose (Alternative for the unified image)

```bash
# Download the compose file and example environment
curl -O https://raw.githubusercontent.com/theopenco/llmgateway/main/infra/docker-compose.unified.yml
curl -O https://raw.githubusercontent.com/theopenco/llmgateway/main/.env.example

# Configure environment
cp .env.example .env
# Edit .env with your configuration

# Start the service
docker compose -f docker-compose.unified.yml up -d
```

Note: it is recommended to replace the `latest` tag in the image with the most recent version from the releases page: https://github.com/theopenco/llmgateway/releases
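As a sketch of that version-pinning step, the `latest` tag in the downloaded compose file can be swapped for a pinned release with `sed` (the `v1.2.3` tag below is a placeholder; use the newest tag from the releases page):

```bash
# Replace the "latest" tag with a pinned release in the downloaded compose file.
# v1.2.3 is a placeholder -- substitute the newest tag from the releases page.
sed -i.bak 's|llmgateway-unified:latest|llmgateway-unified:v1.2.3|' docker-compose.unified.yml
```

The `-i.bak` form keeps a backup of the original file and works with both GNU and BSD `sed`.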
## Option 2: Separate Services with Docker Compose

This option runs each service in its own container, offering more flexibility.

```bash
# Clone the repository
git clone https://github.com/theopenco/llmgateway.git
cd llmgateway

# Configure environment
cp .env.example .env
# Edit .env with your configuration

# Start the services
docker compose -f infra/docker-compose.split.yml up -d
```

Note: it is recommended to replace the `latest` tag in all images in the compose file with the most recent version from the releases page: https://github.com/theopenco/llmgateway/releases
## Accessing Your LLMGateway

After starting either option, you can access:

- Web Interface: http://localhost:3002
- Documentation: http://localhost:3005
- API Endpoint: http://localhost:4002
- Gateway Endpoint: http://localhost:4001
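A quick way to confirm the services are up is to probe each port with `curl` (a sketch assuming the default port mappings above; adjust if you changed them):

```bash
# Print the HTTP status code for each service port.
# "HTTP 000" means the service is not reachable yet.
for port in 3002 3005 4001 4002; do
  code=$(curl -s -o /dev/null -w '%{http_code}' "http://localhost:$port" || true)
  echo "port $port: HTTP $code"
done
```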
## Required Configuration

At minimum, you need to set these environment variables:

```bash
# Database (change the password!)
POSTGRES_PASSWORD=your_secure_password_here

# Authentication
AUTH_SECRET=your-secret-key-here

# LLM Provider API Keys (add the ones you need)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```
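For `AUTH_SECRET`, a randomly generated value is safer than a hand-typed string. One common way to generate one, assuming `openssl` is installed:

```bash
# Generate a random 32-byte, base64-encoded string suitable for AUTH_SECRET
openssl rand -base64 32
```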
## Basic Management Commands

### For the Unified Docker Image (Option 1)

```bash
# View logs
docker logs llmgateway

# Restart the container
docker restart llmgateway

# Stop the container
docker stop llmgateway
```
### For Docker Compose (Option 2)

```bash
# View logs
docker compose -f infra/docker-compose.split.yml logs -f

# Restart services
docker compose -f infra/docker-compose.split.yml restart

# Stop services
docker compose -f infra/docker-compose.split.yml down
```
## Build Locally

To build the images from source instead of pulling them, use the `*.local.yml` compose file in the `infra` directory.
## All Provider API Keys

You can set any of the following API keys:

```bash
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
VERTEX_API_KEY=
GOOGLE_AI_STUDIO_API_KEY=
INFERENCE_NET_API_KEY=
TOGETHER_AI_API_KEY=
```
## Next Steps

Once your LLMGateway is running:

- Open the web interface at http://localhost:3002
- Create your first organization and project
- Generate API keys for your applications
- Test the gateway by making API calls to http://localhost:4001
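As a sketch of such a test call, assuming the gateway exposes an OpenAI-compatible chat completions route (the `/v1/chat/completions` path and the model name are assumptions; check the API Documentation for the exact routes your version exposes, and replace `YOUR_LLMGATEWAY_KEY` with a key generated in the web interface):

```bash
# Hypothetical example request through the gateway. The route, model name,
# and key placeholder are assumptions -- see the API Documentation.
curl http://localhost:4001/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_LLMGATEWAY_KEY" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```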
For more information, see the API Documentation and User Guide.