# Running Locally
## Quick Start
- Create a `.env.local` file with your API credentials:

  ```ini
  OPENAI_BASE_URL=/static-proxy?url=https%3A%2F%2Frouter.huggingface.co%2Fv1
  OPENAI_API_KEY=hf_************************
  ```

- Install and run:

  ```bash
  npm install
  npm run dev -- --open
  ```
That’s it! Chat UI will discover available models automatically from your endpoint.
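If you want to check what the endpoint exposes before starting the dev server, you can query the model listing route directly. This is a minimal sketch assuming your provider implements the standard OpenAI-compatible `GET /v1/models` route, which is how most compatible APIs advertise their models:

```bash
# List the models available behind the endpoint configured in .env.local.
# Assumes the provider implements the OpenAI-compatible GET /v1/models route.
curl /static-proxy?url=https%3A%2F%2Frouter.huggingface.co%2Fv1%2Fmodels \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```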
## Configuration
Chat UI connects to any OpenAI-compatible API. Set `OPENAI_BASE_URL` to your provider:
| Provider     | `OPENAI_BASE_URL`                |
|--------------|----------------------------------|
| Hugging Face | `/static-proxy?url=https%3A%2F%2Frouter.huggingface.co%2Fv1` |
See the configuration overview for all available options.
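Any other OpenAI-compatible endpoint works the same way. As an illustration, here is a `.env.local` pointing at a self-hosted server; the URL and key below are placeholders, not values from this documentation, so adjust them to whatever your server actually exposes:

```ini
# Example: a self-hosted OpenAI-compatible server (e.g. vLLM or llama.cpp)
# listening on localhost. URL and key are illustrative placeholders.
OPENAI_BASE_URL=http://localhost:8000/v1
OPENAI_API_KEY=sk-no-key-required
```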
## Database
For development, MongoDB is optional. When `MONGODB_URL` is not set, Chat UI uses an embedded MongoDB server that persists data to the `./db` folder.
For production, you should use a dedicated MongoDB instance:
### Option 1: Local MongoDB (Docker)
```bash
docker run -d -p 27017:27017 -v mongo-chat-ui:/data --name mongo-chat-ui mongo:latest
```
Then set `MONGODB_URL=mongodb://localhost:27017` in `.env.local`.
### Option 2: MongoDB Atlas (Managed)
Use the MongoDB Atlas free tier for a managed database. Copy the connection string into `MONGODB_URL`.
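The resulting entry in `.env.local` looks roughly like the sketch below; the user, password, and cluster host are placeholders that Atlas generates for you, not real values:

```ini
# Connection string copied from the Atlas "Connect your application" dialog.
# Replace <user>, <password>, and <cluster-host> with the values Atlas provides.
MONGODB_URL=mongodb+srv://<user>:<password>@<cluster-host>/?retryWrites=true&w=majority
```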
## Running in Production
For production deployments:
```bash
npm install
npm run build
npm run preview
```
The server listens on `http://localhost:4173` by default.
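If you need the preview server to bind to a different address or port (for example, to reach it from another machine), extra flags can be passed through to the underlying preview command. This sketch assumes the project's `preview` script is the standard Vite one, which accepts `--host` and `--port`:

```bash
# Expose the production preview on all interfaces, port 3000.
# Assumes `npm run preview` maps to `vite preview`.
npm run preview -- --host 0.0.0.0 --port 3000
```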