Integrating AI Controller with LibreChat
This guide explains how to integrate AI Controller with LibreChat, enabling you to leverage AI Controller's robust security, monitoring, and governance features while using LibreChat as a user interface.
Overview
LibreChat is an open-source chat interface with support for various AI models. By integrating LibreChat with AI Controller, you can:
- Provide a user-friendly chat interface for your organization
- Apply AI Controller's security and governance policies to all interactions
- Monitor and log all usage through AI Controller
- Centralize API key management
Prerequisites
- AI Controller installed and operational
- An API key created in AI Controller
- LibreChat installed
- Admin access to configure LibreChat
Installation
Before integrating with AI Controller, you'll need to install LibreChat. The recommended method is using Docker Compose:
- Clone the LibreChat repository: `git clone https://github.com/danny-avila/LibreChat.git`
- Navigate to the LibreChat directory: `cd LibreChat`
- Create a `.env` file from the example: `cp .env.example .env`
- Start LibreChat using Docker Compose: `docker compose up -d`
For more detailed installation instructions, see the official LibreChat documentation.
Integration Steps
Step 1: Configure LibreChat to Use AI Controller as an OpenAI-compatible Endpoint
LibreChat can be configured to use custom endpoints for OpenAI-compatible APIs. We'll configure LibreChat to use AI Controller as the endpoint for all OpenAI requests.
Option 1: Configure via .env file (Basic)
- Locate and open the `.env` file in your LibreChat directory
- Configure the OpenAI endpoint to point to your AI Controller server:

```
# LibreChat .env file
# Enable OpenAI
OPENAI_API_KEY=your-aic-api-key
OPENAI_API_BASE_URL=https://your-aic-server:9090/
```
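Before starting LibreChat, it can help to sanity-check the endpoint with a direct OpenAI-format request. The sketch below is illustrative: the URL and key are placeholders, and it assumes your AI Controller deployment serves the OpenAI-compatible chat completions route at `v1/chat/completions` relative to the base URL — adjust the path to match your installation.

```python
import json
import urllib.request

AIC_BASE_URL = "https://your-aic-server:9090/"  # placeholder; same value as OPENAI_API_BASE_URL
AIC_API_KEY = "your-aic-api-key"                # placeholder; same value as OPENAI_API_KEY

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-format chat completion request against an AI Controller base URL."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        base_url.rstrip("/") + "/v1/chat/completions",  # assumed path; adjust for your deployment
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_chat_request(AIC_BASE_URL, AIC_API_KEY, "gpt-3.5-turbo", "ping")
print(req.full_url)  # https://your-aic-server:9090/v1/chat/completions
# To actually send it (requires a reachable AI Controller):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

If this request succeeds outside LibreChat, any remaining issues are in the LibreChat configuration rather than in AI Controller itself.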
Option 2: Configure via librechat.yaml (Recommended)
For more advanced configuration, LibreChat uses a YAML configuration file. This method is preferred for integration with AI Controller.
- Create a `librechat.yaml` file in the root of the LibreChat project (where the `.env` file is located)
- If using Docker, create a `docker-compose.override.yml` file to ensure the config file is properly mounted:

```yaml
version: '3.4'
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```

- Configure the `librechat.yaml` file to use AI Controller as a custom endpoint. Below is an example configuration:
```yaml
# For more information, see the Configuration Guide:
# https://www.librechat.ai/docs/configuration/librechat_yaml

# Configuration version (required)
version: 1.2.1

# Cache settings: set to true to enable caching
cache: true

# Custom interface configuration
interface:
  # Privacy policy and terms of service settings - customize as needed
  privacyPolicy:
    externalUrl: 'https://your-company.com/privacy-policy'
    openNewTab: true
  termsOfService:
    externalUrl: 'https://your-company.com/terms-of-service'
    openNewTab: true
  # Interface feature toggles
  endpointsMenu: true
  modelSelect: true
  parameters: false
  sidePanel: true
  presets: false
  prompts: false
  bookmarks: true
  multiConvo: true

# Define your custom AI Controller endpoint
endpoints:
  custom:
    - name: 'AI Controller'
      apiKey: user_provided # Prompts the user to enter an API key
      baseURL: 'https://your-aic-server:9090/work/'
      directEndpoint: true
      models:
        default:
          [
            "gpt-3.5-turbo",
            "gpt-4",
            "gpt-4-turbo"
            # Add other models available through your AI Controller
          ]
        fetch: false # Set to true if AI Controller provides a model list endpoint
      titleConvo: true
      titleModel: 'gpt-3.5-turbo' # Model used to generate conversation titles
      modelDisplayLabel: 'AI Controller' # Label shown in the UI
```
- Restart LibreChat to apply the configuration: `docker compose up -d` (recreating the containers ensures the override file and `librechat.yaml` are picked up)
Step 2: Configure Multiple AI Provider Endpoints (Optional)
If you're using AI Controller to manage multiple AI providers, you can configure LibreChat to access them through different endpoints:
```yaml
endpoints:
  custom:
    - name: 'OpenAI via AIC'
      apiKey: user_provided
      baseURL: 'https://your-aic-server:9090/work/'
      directEndpoint: true
      models:
        default: ["gpt-3.5-turbo", "gpt-4", "gpt-4-turbo"]
        fetch: false
      titleConvo: true
      titleModel: 'gpt-3.5-turbo'
      modelDisplayLabel: 'OpenAI'
    - name: 'Anthropic via AIC'
      apiKey: user_provided
      baseURL: 'https://your-aic-server:9090/work/'
      directEndpoint: true
      models:
        default: ["claude-3-opus", "claude-3-sonnet", "claude-3-haiku"]
        fetch: false
      titleConvo: true
      titleModel: 'claude-3-haiku'
      modelDisplayLabel: 'Claude'
```
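Note that both endpoints share the same AI Controller base URL: it is the `model` field of each request that tells AI Controller which upstream provider to call. The selection LibreChat effectively performs can be sketched as follows (the endpoint names and model lists mirror the example configuration above; this is an illustration, not LibreChat's actual code):

```python
# Mirror of the two custom endpoints configured in librechat.yaml above.
ENDPOINTS = [
    {
        "name": "OpenAI via AIC",
        "baseURL": "https://your-aic-server:9090/work/",
        "models": ["gpt-3.5-turbo", "gpt-4", "gpt-4-turbo"],
    },
    {
        "name": "Anthropic via AIC",
        "baseURL": "https://your-aic-server:9090/work/",
        "models": ["claude-3-opus", "claude-3-sonnet", "claude-3-haiku"],
    },
]

def endpoint_for_model(model: str) -> str:
    """Return the name of the first configured endpoint that offers `model`."""
    for ep in ENDPOINTS:
        if model in ep["models"]:
            return ep["name"]
    raise ValueError(f"model {model!r} is not configured on any endpoint")

print(endpoint_for_model("gpt-4"))           # OpenAI via AIC
print(endpoint_for_model("claude-3-haiku"))  # Anthropic via AIC
```

Keeping the model lists disjoint between endpoints avoids ambiguity about which provider label a given model appears under in the UI.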
Using LibreChat with AI Controller
Once integrated, users can:
- Log in to LibreChat
- Select "AI Controller" (or your custom endpoint name) from the model provider dropdown
- Choose from the models you've configured
- Provide the AI Controller API key configured for that user (LibreChat prompts for it because `apiKey` is set to `user_provided`)
- Start chatting
All requests will be routed through AI Controller, which will:
- Apply security policies and access controls
- Track usage
- Log conversations according to your configuration
- Provide details of model usage
Advanced Configuration
User Authentication
For enterprise environments, you may want to integrate LibreChat with your organization's authentication system. LibreChat supports various authentication methods, including:
- Local authentication
- OAuth (Google, GitHub, etc.)
- LDAP integration
- Azure AD
Configure authentication in the `registration` section of your `librechat.yaml` file:
```yaml
registration:
  socialLogins: ['github', 'google', 'openid']
  # Restrict registration to specific email domains if needed
  allowedDomains:
    - "yourcompany.com"
```
Troubleshooting
If you encounter issues with the integration:
- Check the AI Controller logs to see if requests are reaching the server
- Verify that the API key used in LibreChat has the necessary permissions in AI Controller
- Ensure the models requested in LibreChat are configured in AI Controller
- Check LibreChat logs for connection errors or timeouts
- For Docker deployments, verify that your `docker-compose.override.yml` correctly mounts the `librechat.yaml` file
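When chasing connection errors, it helps to separate network reachability from API-level failures. A small diagnostic sketch (the server URL is a placeholder; substitute your AI Controller base URL):

```python
import socket
from urllib.parse import urlsplit

def parse_endpoint(url: str) -> tuple:
    """Extract (host, port) from a base URL, defaulting to 443 for https and 80 for http."""
    parts = urlsplit(url)
    default_port = 443 if parts.scheme == "https" else 80
    return parts.hostname, parts.port or default_port

def is_reachable(url: str, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to the endpoint's host:port succeeds."""
    host, port = parse_endpoint(url)
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers DNS failure, refused connection, and timeout
        return False

print(parse_endpoint("https://your-aic-server:9090/work/"))  # ('your-aic-server', 9090)
if not is_reachable("https://your-aic-server:9090/work/"):
    print("Endpoint not reachable: check DNS, firewall rules, and that AI Controller is running")
```

If the TCP connection succeeds but requests still fail, the problem is most likely the API key, TLS configuration, or the request path rather than the network.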
For more information about LibreChat configuration, refer to the official LibreChat documentation.
Updated: 2025-05-23