
LLM Gateway CLI

Command-line tool for scaffolding and managing LLM Gateway projects

The LLM Gateway CLI (@llmgateway/cli) is a command-line utility for scaffolding projects, managing AI applications, and discovering models.

Installation

Run commands directly without installation:

npx @llmgateway/cli init

Install globally for faster access:

npm install -g @llmgateway/cli

Then run commands directly:

llmgateway init

Quick Start

Initialize a Project

Create a new project from a template:

npx @llmgateway/cli init

Or specify the template and name directly:

npx @llmgateway/cli init --template image-generation --name my-ai-app

Configure Authentication

Log in to save your API key locally:

npx @llmgateway/cli auth login

This opens a browser window to authenticate with LLM Gateway. Your credentials are stored in ~/.llmgateway/config.json.

Alternatively, set the LLMGATEWAY_API_KEY environment variable, which takes precedence over the config file.
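The lookup order can be sketched as a small shell check. This mirrors the documented precedence (environment variable first, then the config file), not the CLI's actual implementation:

```shell
# Sketch of the documented credential lookup order:
# LLMGATEWAY_API_KEY beats ~/.llmgateway/config.json.
credential_source() {
  if [ -n "${LLMGATEWAY_API_KEY:-}" ]; then
    echo "environment"
  elif [ -f "$HOME/.llmgateway/config.json" ]; then
    echo "config-file"
  else
    echo "none"
  fi
}

credential_source
```

This is handy for debugging why the CLI is using an unexpected key: if the function prints `environment`, a shell profile is likely exporting LLMGATEWAY_API_KEY.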

Start Development

Navigate to your project and start the development server:

cd my-ai-app
npx @llmgateway/cli dev

Or specify a custom port (the default is 3000):

npx @llmgateway/cli dev --port 8080

Commands

init

Initialize a new project from a template.

npx @llmgateway/cli init [options]

Options:

  • --template <name> — Template to use (e.g., image-generation, weather-agent)
  • --name <name> — Project name

Examples:

# Interactive mode
npx @llmgateway/cli init

# With options
npx @llmgateway/cli init --template image-generation --name my-app

list

Display available project templates.

npx @llmgateway/cli list

Options:

  • --json — Output in JSON format

models

Browse and filter available AI models.

npx @llmgateway/cli models [options]

Options:

  • --capability <type> — Filter by capability (e.g., chat, image, embedding)
  • --provider <name> — Filter by provider (e.g., openai, anthropic, google)
  • --search <term> — Search models by name

Examples:

# List all models
npx @llmgateway/cli models

# Filter by provider
npx @llmgateway/cli models --provider openai

# Search models
npx @llmgateway/cli models --search gpt

add

Add tools or API routes to an existing project.

npx @llmgateway/cli add

Tools available:

  • weather — Weather lookup functionality
  • search — Web search capability
  • calculator — Mathematical operations

API routes available:

  • generate — Text generation endpoint
  • chat — Chat completion endpoint

auth

Manage API authentication.

# Login via browser
npx @llmgateway/cli auth login

# Check authentication status
npx @llmgateway/cli auth status

# Logout
npx @llmgateway/cli auth logout

dev

Start the local development server.

npx @llmgateway/cli dev [options]

Options:

  • --port <number> — Port to run on (default: 3000)

upgrade

Update LLM Gateway dependencies in your project.

npx @llmgateway/cli upgrade [options]

Options:

  • --dry-run — Show what would be updated without making changes

docs

Open the documentation in your browser.

npx @llmgateway/cli docs

Available Templates

Image Generation

A full-stack application for AI image generation.

  • Stack: Next.js 16, React 19, TypeScript
  • Features: Multi-provider support (DALL-E, Stable Diffusion), unified API
  • Use case: Image generation apps, creative tools

npx @llmgateway/cli init --template image-generation

QA Agent

An AI-powered QA testing agent that uses browser automation to test your web app.

  • Stack: Next.js 16, React 19, TypeScript, Agent Browser
  • Features: Natural language testing, real-time action timeline, live browser preview
  • Use case: Automated QA testing, regression testing, user flow validation

npx @llmgateway/cli init --template qa-agent

Weather Agent

A CLI agent demonstrating tool calling capabilities.

  • Stack: TypeScript, AI SDK, OpenAI
  • Features: Tool calling, real-time data, natural language
  • Use case: Learning tool usage, building CLI agents

npx @llmgateway/cli init --template weather-agent

Configuration

The CLI stores configuration in ~/.llmgateway/config.json:

{
	"apiKey": "llmgtwy_...",
	"defaultTemplate": "image-generation"
}
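In scripts, the stored key can be pulled out of this file. A minimal sketch using sed, adequate only for the flat, single-level JSON shown above (it is a naive pattern match, not a real JSON parser):

```shell
# Extract the "apiKey" value from a flat config file like the one above.
# Naive pattern match; sufficient for this file's shape only.
read_api_key() {
  sed -n 's/.*"apiKey"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/p' "$1"
}

# Usage: read_api_key "$HOME/.llmgateway/config.json"
```

For anything beyond this simple shape, prefer a proper JSON parser (for example `node -e` or `jq`, if available).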

Environment Variables

The LLMGATEWAY_API_KEY environment variable takes precedence over the config file:

export LLMGATEWAY_API_KEY="llmgtwy_..."

More Resources

Need help or want to request a feature? Open an issue on GitHub.
