Managing LLM Providers
This guide explains how to configure and manage Large Language Model (LLM) providers in the AI Controller administration interface. Providers are the AI services that AI Controller connects to when processing requests. A provider can be either a commercial LLM endpoint or an endpoint for an LLM running locally or on a machine under your control.
Providers Overview
AI Controller serves as a gateway between your applications and various LLM providers. These providers can be external commercial services (like OpenAI or Groq) or internal/local LLM endpoints running within your own infrastructure. The Providers administration page allows you to:
- View all configured LLM providers
- Add new provider connections
- Update existing provider configurations
- Delete providers that are no longer needed
Accessing Provider Management
Provider configuration is managed through the Providers page in the AI Controller web interface. Click Admin -> Providers in the sidebar to open it. This area is accessible only to users with administrative privileges.
Screenshot: Shows the Provider Management interface with filtering options, CREATE button, and a list of configured providers
Providers Interface
The Provider Management page includes:
- A "CREATE" button in the top right corner for adding new providers
- Filtering options:
- A "Filters" button that displays Provider filter selector
- Search field for Provider name
- "HIDE FILTERS" and "APPLY FILTERS" buttons
- A table displaying all configured providers, including an Actions column for editing and deleting entries
- Pagination controls showing items per page and navigation
Adding a New Provider
To add a new LLM provider to AI Controller, follow the steps below:
- Click "CREATE" in the top right corner of the Provider Management page
- In the displayed dialog:
- Type the desired name for the Provider that will be created (3 to 32 characters)
- Use a descriptive name that identifies the service (e.g., "openai")
- Add the Provider URL, e.g. https://api.anthropic.com/v1/messages
- External commercial provider APIs (Groq, OpenAI): Use their public endpoints
- Internal/local LLM endpoints: Use the appropriate URL for your locally hosted or network-accessible LLM (e.g., http://localhost:8000/v1/chat/completions)
- Always refer to the provider's documentation for the most current endpoint URLs or your internal documentation for locally hosted models
- Set the Authentication method used by this Provider (e.g., "Bearer" for OpenAI)
- Set the API Key
- Enter your provider API key obtained from the provider's website or dashboard
- These keys connect AI Controller to external services like OpenAI or Groq
- Note that an API Key must be set; enter 'None' if this Provider does not require a key
- To rotate a key later, edit the provider and update this field; AI Controller will immediately begin using the new key
- Click "CREATE" on the dialog to create the new Provider
Screenshot: The Create/Edit Provider dialog where API keys can be configured
Once your provider is properly configured, you can create an AI Controller API key linked to this provider and use it in your applications. You can verify keys work correctly by using the Prompt an LLM feature.
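If you want to sanity-check an upstream key outside of AI Controller, you can call the provider endpoint directly. The following is a minimal sketch, assuming an OpenAI-style chat completions endpoint, the Python requests package, and a placeholder model name; substitute values for your own provider.

```python
# Minimal check that a provider API key is accepted by the upstream endpoint.
# Assumes an OpenAI-compatible chat completions API and the requests package;
# the model name below is a placeholder.
import os

import requests

PROVIDER_URL = "https://api.openai.com/v1/chat/completions"
API_KEY = os.environ["PROVIDER_API_KEY"]  # the same key you enter in the dialog

response = requests.post(
    PROVIDER_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-4o-mini",  # placeholder; use a model your provider serves
        "messages": [{"role": "user", "content": "Reply with OK."}],
        "max_tokens": 5,
    },
    timeout=30,
)

if response.status_code == 401:
    print("Key rejected: check the API Key value for this provider.")
else:
    response.raise_for_status()
    print("Key accepted:", response.json()["choices"][0]["message"]["content"])
```

A 401 response at this stage means the key itself is the problem rather than the AI Controller configuration.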
Managing Existing Providers
Once providers are created, you can manage them using the action buttons in the table:
- To edit a provider, click the edit (pencil) icon in the Actions column
- To delete a provider, click the delete (trash) icon in the Actions column and confirm when prompted
Warning: Deleting a provider will remove its configuration from the system, which may affect any applications using that provider.
Provider Authentication Types
AI Controller supports the following authentication methods by default, applied as prefixes on the Authorization header:
- Bearer
- AWS4-HMAC-SHA256
- Basic
- token
- Bot
- x-api-key
- api-key
A system administrator can also configure other Authorization header prefixes as detailed in the Server Configuration documentation.
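To make these names concrete, the sketch below shows how they are commonly rendered in HTTP requests. It illustrates general HTTP conventions rather than AI Controller's internal behavior: the first five are typically prefixes inside the Authorization header, while x-api-key and api-key are often sent as dedicated headers by providers such as Anthropic (see the table below).

```python
# Illustration of how common authentication method names map to HTTP headers.
# This mirrors widespread conventions; it is not AI Controller's implementation.

def auth_headers(method: str, key: str) -> dict[str, str]:
    """Build request headers for a given auth method name and secret."""
    prefix_methods = {"Bearer", "AWS4-HMAC-SHA256", "Basic", "token", "Bot"}
    if method in prefix_methods:
        # The method name is placed before the secret in the Authorization header.
        # (AWS4-HMAC-SHA256 normally carries a computed signature, not a raw key.)
        return {"Authorization": f"{method} {key}"}
    if method.lower() in {"x-api-key", "api-key"}:
        # These methods usually send the secret in a dedicated header instead.
        return {method.lower(): key}
    raise ValueError(f"Unsupported auth method: {method}")

print(auth_headers("Bearer", "sk-example"))     # {'Authorization': 'Bearer sk-example'}
print(auth_headers("x-api-key", "sk-example"))  # {'x-api-key': 'sk-example'}
```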
Common Provider Configurations
Based on the Provider Management interface shown in the screenshot at the beginning of this document, several types of providers can be configured:
External commercial providers:
Common Providers are OpenAI, Groq, Gemini, and Anthropic, but there are many others. The following table gives the settings required for each of these 4 common Providers:
Provider Name | URL | Auth Method |
---|---|---|
OpenAI | https://api.openai.com/v1/chat/completions | Bearer |
Groq | https://api.groq.com/openai/v1/chat/completions | Bearer |
Gemini | https://generativelanguage.googleapis.com/v1beta | key |
Anthropic | https://api.anthropic.com/v1/messages | X-API-Key |
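The OpenAI and Groq rows follow the standard Bearer pattern (an Authorization: Bearer <key> header). Anthropic and Gemini authenticate differently, as shown in the sketch below against their public endpoints; the model names are placeholders and the Python requests package is assumed.

```python
# Example requests for the two non-Bearer rows in the table above.
# Model names are placeholders; substitute models available to your account.
import os

import requests

# Anthropic: the key goes in an x-api-key header, plus a required version header.
anthropic = requests.post(
    "https://api.anthropic.com/v1/messages",
    headers={
        "x-api-key": os.environ["ANTHROPIC_API_KEY"],
        "anthropic-version": "2023-06-01",
    },
    json={
        "model": "claude-3-5-haiku-latest",  # placeholder
        "max_tokens": 16,
        "messages": [{"role": "user", "content": "Reply with OK."}],
    },
    timeout=30,
)
anthropic.raise_for_status()

# Gemini: the key is passed as a query parameter on the generateContent endpoint.
gemini = requests.post(
    "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent",
    params={"key": os.environ["GEMINI_API_KEY"]},
    json={"contents": [{"parts": [{"text": "Reply with OK."}]}]},
    timeout=30,
)
gemini.raise_for_status()
```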
Internal/local LLM endpoints
You can also configure providers that point to LLMs running within your own infrastructure:
- Self-hosted open source models
- On-premises enterprise LLM deployments
- Local development instances
For these internal providers, you'll need to specify the appropriate URL and authentication method based on your specific deployment.
For detailed instructions on setting up local LLM endpoints, see the Serving Local Models guide.
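As a concrete illustration, many self-hosted servers (for example vLLM or llama.cpp in OpenAI-compatibility mode) expose an OpenAI-style chat completions route. The sketch below assumes such a server at the localhost URL mentioned earlier, with no authentication enforced; in the provider dialog this corresponds to an API Key of 'None'.

```python
# Quick check against a locally hosted, OpenAI-compatible endpoint.
# Assumes a server such as vLLM listening on localhost:8000 with no auth.
import requests

response = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "local-model",  # placeholder; use the model name your server reports
        "messages": [{"role": "user", "content": "Reply with OK."}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```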
Provider Management Best Practices
For optimal provider management:
- Use descriptive provider names that identify the service or purpose
- Regularly validate API keys to ensure they remain active
- Create provider connections for different environments (dev, test, prod)
- Keep provider endpoints updated as APIs evolve
- Label internal vs. external providers for clear distinction
- Document network requirements for internal LLM endpoints
Troubleshooting Provider Issues
Common provider-related issues and their solutions:
Issue | Possible Causes | Solutions |
---|---|---|
Connection errors | Incorrect URL; network issues; provider outage | Verify the endpoint URL; check network connectivity; confirm provider status |
Authentication failures | Invalid API key; expired token; wrong auth type | Check API key validity; regenerate the provider token; verify the auth method |
Model not found | Mistyped model name; model not available; model discontinued | Check the exact model name; verify model availability; update to a current model version |
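When a request fails, probing the provider URL directly can help separate the first two rows of the table: connection-level exceptions point at the URL or the network, while 401/403 responses point at the key or the auth method. A rough sketch, assuming the requests package and a Bearer-style provider:

```python
# Rough triage of connection vs. authentication problems for a provider URL.
# Assumes a Bearer-style provider; adjust the headers for other auth methods.
import requests


def probe_provider(url: str, api_key: str) -> None:
    try:
        response = requests.post(
            url,
            headers={"Authorization": f"Bearer {api_key}"},
            json={"model": "placeholder", "messages": []},
            timeout=10,
        )
    except requests.exceptions.ConnectionError:
        print("Connection error: check the URL, DNS, proxies, and firewalls.")
        return
    except requests.exceptions.Timeout:
        print("Timeout: the endpoint is reachable but slow or unresponsive.")
        return

    if response.status_code in (401, 403):
        print("Authentication failure: check the API key and auth method.")
    elif response.status_code == 404:
        print("Not found: the URL path or the model name is likely wrong.")
    else:
        print(f"Endpoint responded with HTTP {response.status_code}.")


probe_provider("https://api.openai.com/v1/chat/completions", "sk-example")
```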
Related Documentation
- Server Configuration
- Serving Local Models
Updated: 2025-05-27