
Group Chat

Compare responses from multiple LLM models side by side

The Group Chat page lets you send a single prompt to multiple models simultaneously and compare their responses side by side. This is useful for evaluating model quality, speed, and cost.

How It Works

  1. Select two or more models from the model picker
  2. Type your prompt in the input field
  3. All selected models receive the same prompt at once
  4. Responses stream in parallel, displayed in separate columns
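The fan-out in steps 3–4 can be sketched as follows. This is a minimal illustration, not LLM Gateway's actual implementation: `query_model` is a hypothetical stand-in for a real gateway request (the real page streams tokens as they arrive), and the model names are examples.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def query_model(model: str, prompt: str) -> dict:
    # Hypothetical stand-in for a gateway chat-completion request;
    # sleeps briefly to simulate a network round-trip.
    start = time.perf_counter()
    time.sleep(0.05)
    latency = time.perf_counter() - start
    return {"model": model, "reply": f"[{model}] echo: {prompt}", "latency_s": latency}

def group_chat(models: list[str], prompt: str) -> list[dict]:
    # Send the same prompt to every selected model in parallel (steps 3-4),
    # then collect one response per model for side-by-side display.
    with ThreadPoolExecutor(max_workers=len(models)) as pool:
        futures = [pool.submit(query_model, m, prompt) for m in models]
        return [f.result() for f in futures]

results = group_chat(["model-a", "model-b", "model-c"], "Hello!")
for r in results:
    print(f"{r['model']}: {r['reply']} ({r['latency_s']:.2f}s)")
```

Because all requests run concurrently, total wall-clock time is roughly that of the slowest model rather than the sum of all of them, which is what makes side-by-side latency comparison meaningful.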

Use Cases

  • Model evaluation — Compare output quality across providers
  • Cost optimization — See which models give the best results for the price
  • Speed comparison — Observe latency differences between models
  • Migration testing — Verify that a new model produces equivalent results
