Self Host LLMGateway

Simple guide to self-hosting LLMGateway using Docker Compose.

LLMGateway is a self-hostable platform that provides a unified API gateway for multiple LLM providers. This guide will help you get started quickly using Docker Compose.

What You'll Get

  • Gateway Service: Routes LLM API requests to different providers
  • Web Interface: Manage projects and API keys
  • Database: Stores users, projects, and request logs
  • Redis: Handles caching

Prerequisites

  • A recent version of Docker with the Compose plugin (current Docker Desktop and Docker Engine releases include Docker Compose)
  • API keys for the LLM providers you want to use (OpenAI, Anthropic, etc.)
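
You can confirm both prerequisites are in place before continuing (these are standard Docker commands):

# Check that Docker and the Compose plugin are installed
docker --version
docker compose version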

Quick Start

  1. Clone the repository:

    git clone https://github.com/theopenco/llmgateway.git
    cd llmgateway
  2. Set up environment:

    cp .env.example .env
  3. Configure your API keys (see Configuration section below)

  4. Start the services:

    docker compose -f docker-compose.prod.yml up -d
  5. Initialize the database:

    # Wait for services to start, then run:
    export DATABASE_URL=postgres://${POSTGRES_USER:-postgres}:${POSTGRES_PASSWORD:-change_this_secure_password}@localhost:5432/${POSTGRES_DB:-llmgateway}
    pnpm migrate
  6. Access your LLMGateway:

    • Web interface: http://localhost:3002
    • Gateway endpoint: http://localhost:4001
    • API: http://localhost:4002
    • Docs: http://localhost:3005
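
A quick sanity check after startup (plain Docker and curl commands, nothing LLMGateway-specific):

# Confirm all containers are up
docker compose -f docker-compose.prod.yml ps

# The web interface should respond once the UI container is healthy
curl -I http://localhost:3002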

Configuration

Edit the .env file to configure your LLMGateway instance:

Required Settings

# Database (change the password!)
POSTGRES_PASSWORD=your_secure_password_here
DATABASE_URL=postgres://postgres:your_secure_password_here@postgres:5432/llmgateway

# Redis
REDIS_PASSWORD=your_redis_password_here

# Basic URLs
UI_URL=http://localhost:3002
DOCS_URL=http://localhost:3005
API_URL=http://localhost:4002
ORIGIN_URL=http://localhost:3002
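
One way to generate strong values for POSTGRES_PASSWORD and REDIS_PASSWORD is OpenSSL (just a suggestion; any sufficiently random string works). Remember to use the same Postgres password inside DATABASE_URL:

# Generate random secrets for the .env file
openssl rand -base64 32   # use for POSTGRES_PASSWORD
openssl rand -base64 32   # use for REDIS_PASSWORD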

LLM Provider API Keys

Add API keys for the providers you want to use:

# OpenAI
OPENAI_API_KEY=sk-...

# Anthropic
ANTHROPIC_API_KEY=sk-ant-...

# Google AI Studio
GOOGLE_AI_STUDIO_API_KEY=your_google_ai_key

# Add other providers as needed

That's it! The other settings in .env.example have sensible defaults.

Managing Your Instance

Basic Commands

# View logs
docker compose -f docker-compose.prod.yml logs -f

# Restart services
docker compose -f docker-compose.prod.yml restart

# Stop services
docker compose -f docker-compose.prod.yml down

# Update to latest version
docker compose -f docker-compose.prod.yml pull
docker compose -f docker-compose.prod.yml up -d
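
To follow a single service instead of the whole stack, pass its name as defined in docker-compose.prod.yml (the name gateway below is an example; check the compose file for the actual service names):

# Follow logs for one service only
docker compose -f docker-compose.prod.yml logs -f gateway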

Database Management

# Run database migrations (after updates)
export DATABASE_URL=postgres://postgres:your_password@localhost:5432/llmgateway
pnpm migrate
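
Before updating or migrating, it's worth taking a backup. A minimal sketch using pg_dump inside the Postgres container, assuming the compose service is named postgres and you kept the default user and database names (adjust to match your .env):

# Dump the database to a local file
docker compose -f docker-compose.prod.yml exec postgres pg_dump -U postgres llmgateway > backup.sql

# Restore it later if needed
docker compose -f docker-compose.prod.yml exec -T postgres psql -U postgres llmgateway < backup.sql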

Next Steps

Once your LLMGateway is running:

  1. Open the web interface at http://localhost:3002
  2. Create your first organization and project
  3. Generate API keys for your applications
  4. Test the gateway by making API calls to http://localhost:4001 (see the example below)
  5. Read the docs at http://localhost:3005
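
As a sketch of step 4, assuming the gateway exposes an OpenAI-compatible chat completions endpoint, and substituting an API key generated in the web interface plus a model your configured providers support (check the docs at http://localhost:3005 for the exact paths and model names):

# Send a test chat completion through the gateway
curl http://localhost:4001/v1/chat/completions \
  -H "Authorization: Bearer YOUR_LLMGATEWAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello from my self-hosted LLMGateway!"}]
  }'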

For more information, see the API Documentation and User Guide.
