Integrating with AI Controller
AI Controller is designed to integrate seamlessly with your existing applications, frameworks, and workflows. This section covers the various integration options and provides guidance for developers.
Integration Overview
AI Controller provides several ways to integrate with your applications and tools:
- REST API: Direct HTTP access to AI Controller endpoints
- API Keys: Secure authentication for application access
- Client Integration: Integration with popular AI clients and interfaces
- Framework Integration: Support for AI development frameworks
- Local LLM Integration: Connection to locally hosted language models
REST API Integration
The most direct integration method is AI Controller's REST API. Because AI Controller exposes OpenAI-compatible endpoints, existing code written against the OpenAI API can typically be routed through AI Controller with minimal changes.
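For example, if your application already uses the OpenAI Python SDK, redirecting it is often just a matter of overriding the client's base URL. A minimal sketch, assuming that SDK is in use; the server address, endpoint path, and key are placeholders, and the exact path depends on your AI Controller deployment:

from openai import OpenAI

# Point the standard OpenAI client at AI Controller instead of api.openai.com.
# Base URL, path, and API key below are deployment-specific placeholders.
client = OpenAI(
    base_url="https://your-aic-server:9090/work/",
    api_key="your-aic-api-key",  # an AI Controller API key, not an OpenAI key
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)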
API Keys for Integration
AI Controller provides a robust API key system for application authentication. As the examples below show, each request carries its key in the Authorization header as a Bearer token; keep keys out of source code and load them from configuration or the environment instead.
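A minimal sketch of supplying the key at request time, assuming it is stored in an environment variable (the name AIC_API_KEY is illustrative, not required) and passed as a Bearer token as in the examples below:

import os
import requests

# Hypothetical environment variable holding the AI Controller API key.
api_key = os.environ["AIC_API_KEY"]

# Attach the key as a Bearer token; URL and payload are placeholders.
response = requests.post(
    "https://your-aic-server:9090/work/",
    json={"model": "gpt-4", "messages": [{"role": "user", "content": "Hello"}]},
    headers={"Authorization": f"Bearer {api_key}"},
)
print(response.status_code)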
Local LLM Integration
AI Controller can connect to locally hosted language models running on your infrastructure, providing the same governance and monitoring capabilities as with cloud-based models while keeping data within your control.
Learn how to integrate local LLMs
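From the client's side, a locally hosted model is called the same way as a cloud-based one; only the model identifier changes. A minimal sketch, assuming your AI Controller deployment maps the hypothetical model name local-llama to a model running on your own infrastructure:

import requests

# "local-llama" is a hypothetical name; use whatever identifier your
# AI Controller deployment assigns to the locally hosted model.
payload = {
    "model": "local-llama",
    "messages": [{"role": "user", "content": "Hello from a local model"}],
}

response = requests.post(
    "https://your-aic-server:9090/work/",  # placeholder endpoint
    json=payload,
    headers={"Authorization": "Bearer your-aic-api-key"},
)
print(response.json())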
Client Integrations
AI Controller seamlessly integrates with popular AI clients and interfaces to provide user-friendly access to language models while maintaining security and governance.
Some popular client integrations include:
- LibreChat - An open-source chat interface supporting multiple AI models
- Continue-Dev - A code assistance tool for developers
Framework and Library Integrations
AI Controller works with popular AI frameworks and libraries, allowing developers to build applications using familiar tools while leveraging AI Controller's security and monitoring features.
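As one illustration, not AI Controller-specific guidance, a framework client that speaks the OpenAI API, such as LangChain's ChatOpenAI, can usually be pointed at AI Controller by overriding its base URL; the address, path, and key below are deployment placeholders:

from langchain_openai import ChatOpenAI

# Point the framework's OpenAI-compatible client at AI Controller instead of
# the default OpenAI endpoint. Values below are deployment-specific placeholders.
llm = ChatOpenAI(
    model="gpt-4",
    base_url="https://your-aic-server:9090/work/",
    api_key="your-aic-api-key",
)

reply = llm.invoke("Hello, world!")
print(reply.content)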
Integration Examples
The examples below send the same simple chat request through AI Controller, first in Python and then in JavaScript. Replace the server address, endpoint path, and API key with the values for your deployment.

Python (using the requests library):
import requests

# AI Controller endpoint
aic_url = "https://your-aic-server:9090/work/"

# Your AIC API key
api_key = "your-aic-api-key"

# Request payload
payload = {
    "model": "gpt-4",
    "messages": [
        {"role": "user", "content": "Hello, world!"}
    ]
}

# Send request to AIC
response = requests.post(
    aic_url,
    json=payload,
    headers={"Authorization": f"Bearer {api_key}"}
)

# Process response
if response.status_code == 200:
    print(response.json())
else:
    print(f"Error: {response.status_code}")
    print(response.text)
The same request in JavaScript (using fetch):

async function queryAIC() {
  // Send the chat request through AI Controller
  const response = await fetch('https://your-aic-server:9090/work/', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer your-aic-api-key'
    },
    body: JSON.stringify({
      model: 'gpt-4',
      messages: [
        {role: 'user', content: 'Hello, world!'}
      ]
    })
  });

  // Process response
  if (response.ok) {
    const data = await response.json();
    console.log(data);
  } else {
    console.error('Error:', response.status);
  }
}

queryAIC();
Enterprise Integration Patterns
AI Controller supports common enterprise integration patterns:
API Gateway Pattern
Use AI Controller as a specialized AI gateway within your existing API gateway infrastructure to route, monitor, and secure all AI model interactions.
Sidecar Pattern
Deploy AI Controller as a sidecar to applications that need AI capabilities, allowing localized routing and caching while maintaining centralized governance.
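From the application's point of view, the main difference between these patterns is the address it calls: with a sidecar, AI Controller is reached on the local host rather than at a shared gateway. A minimal sketch, assuming the sidecar listens on port 9090 of the same host or pod:

import requests

# In a sidecar deployment, AI Controller runs alongside the application,
# so requests go to the loopback interface. Port and path are placeholders.
SIDECAR_URL = "http://localhost:9090/work/"

response = requests.post(
    SIDECAR_URL,
    json={"model": "gpt-4", "messages": [{"role": "user", "content": "Hello"}]},
    headers={"Authorization": "Bearer your-aic-api-key"},
)
print(response.status_code)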
Next Steps
After reviewing the integration options, consider:
- Exploring the detailed API reference
- Learning about caching strategies to optimize performance
- Understanding access control for secure integration
- Setting up monitoring and logging for your integrations
Updated: 2025-05-27