Integrating AI Controller with Continue.dev
This guide explains how to integrate AI Controller with Continue.dev, enabling developers to use AI assistance in their code editors while maintaining the security, cost control, and monitoring benefits of AI Controller.
Overview
Continue.dev is an open-source AI coding assistant that integrates with popular code editors like VS Code and JetBrains IDEs. By integrating Continue.dev with AI Controller, you can:
- Provide AI-powered coding assistance to developers
- Apply consistent security policies and governance to AI interactions
- Monitor and track AI usage for coding assistance
- Centralize API key management across your development teams
Prerequisites
- AI Controller installed and operational
- At least one AI Controller API key
- Continue.dev extension installed in your preferred code editor
- Administrator access to configure Continue.dev
- Providers created within AI Controller that the owner of the AI Controller API Key is allowed to use
- See What are Providers for details about AI Controller Providers.
Integration Steps
This document should not be used as a substitute for the official Continue.dev documentation.
Configure Continue.dev to use AI Controller
To route Continue.dev requests through AI Controller, the 'models' entries need to be directed to the AI Controller 'work' endpoint. This is done by modifying the config.yaml for Continue.dev.
- Open the configuration file (usually config.yaml) as described in the Continue.dev Configuration guide.
- Add a new entry in the 'models' list; give it a descriptive name and set the provider and model you wish to use.
- Set the API key to the AI Controller API Key that should be used with these requests.
- Redirect where Continue.dev sends the request by setting apiBase to point to the AI Controller 'work' endpoint.
For example, the configuration below uses the 'groq' provider and the 'llama-3.3-70b-versatile' model, and expects AI Controller to be running on port 9090 on 'localhost':

```yaml
models:
  - name: AIC-llama-3.3-70b-versatile
    provider: groq
    model: llama-3.3-70b-versatile
    apiKey: <Your AIC API Key>
    apiBase: https://localhost:9090/work/
```
In the above configuration:
- name: the display name of the 'model' as it will be seen in Continue.dev
- provider: the provider that Continue.dev should construct the LLM requests for. This does not need to match the name of the Provider in AI Controller.
- model: the model on the external provider that will be used to fulfill the request. This model does not need to exist within AI Controller when the AI Controller API Key specifies which Provider to use. If the AI Controller API Key does not specify the Provider, a Model will need to exist within AI Controller that matches this model name exactly.
- apiKey: the AI Controller API Key that should be used for requests
- apiBase: the URL of the AI Controller 'work' endpoint. This is where Continue.dev will send the requests.
SSL Communication
If AI Controller is using a self-signed certificate, it is likely to raise errors with Continue.dev. To resolve this, the following additional setting is needed:
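A minimal sketch of the change, assuming your Continue.dev version supports the 'requestOptions' block with a 'verifySsl' setting (check the Continue.dev configuration reference for your version):

```yaml
models:
  - name: AIC-llama-3.3-70b-versatile
    provider: groq
    model: llama-3.3-70b-versatile
    apiKey: <Your AIC API Key>
    apiBase: https://localhost:9090/work/
    requestOptions:
      # Skip TLS certificate verification so the self-signed
      # AI Controller certificate is accepted. Only use this in
      # trusted development environments.
      verifySsl: false
```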
Use an AI Controller Enabled model
Once an AI Controller enabled model has been configured in Continue.dev, it will automatically be used for the Chat, Edit, and Apply roles if it is the only configured model.
If there are multiple Continue.dev models configured, the newly created AI Controller enabled model will need to be selected for use, as shown in the screenshot below:
Once the model is selected, the normal Continue.dev chat interface can be used. All communication will then be tracked and logged with AI Controller.
Test the Integration
Once an AI Controller enabled model is selected for use in Continue.dev, the integration is ready to be used. For example:
- Select some code
- Press CTRL + L to add the code to the Continue.dev chat window
- In the chat window ask something about the code, e.g. 'What does this code do?'
- Execute the query.
- The response should be seen in the Continue.dev view as normal.
Check that AI Controller has successfully tracked the request:
- Visit the AI Controller web frontend and log in
- Click the 'Logs' link on the left sidebar to view your log entries
- There should be some log entries created for the request from Continue.dev
- Click the 'Cache' tab on the Logs Page
- There should be a single new entry that corresponds to the new request sent to the external LLM.
Features and Capabilities
With this integration, developers can use Continue.dev's features while AI Controller manages the backend:
Code Completion
Continue.dev provides contextual code completion suggestions. With AI Controller integration, the communication to the LLMs for these suggestions is logged.
To enable use with code completion, the 'model' needs to be assigned the 'role' of 'autocomplete':
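As a sketch, assuming the 'roles' list from the Continue.dev YAML model schema, an autocomplete-only entry would look like:

```yaml
models:
  - name: AIC-llama-3.3-70b-versatile
    provider: groq
    model: llama-3.3-70b-versatile
    apiKey: <Your AIC API Key>
    apiBase: https://localhost:9090/work/
    roles:
      # Restrict this model to inline code completion only
      - autocomplete
```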
This would disable it for use with chat, edit and apply functionality, so in reality it would be more likely to look like this:
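A sketch of such an entry, listing all the roles explicitly (again assuming the 'roles' list from the Continue.dev YAML model schema):

```yaml
models:
  - name: AIC-llama-3.3-70b-versatile
    provider: groq
    model: llama-3.3-70b-versatile
    apiKey: <Your AIC API Key>
    apiBase: https://localhost:9090/work/
    roles:
      # Keep the model available for the standard assistant roles
      # as well as inline code completion
      - chat
      - edit
      - apply
      - autocomplete
```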
Code Explanation
Developers can select code sections and ask Continue.dev to explain them. AI Controller routes these requests to the appropriate model and applies governance policies (see Governance for details) and Rules (see Rules Engine for details).
Code Generation
When developers need to generate new code based on comments or requirements, Continue.dev can help. AI Controller ensures these requests follow your organization's policies (see Governance and Rules Engine for details).
Contextual Conversations
Continue.dev allows developers to have ongoing conversations about their code. AI Controller tracks and logs these conversations.
Troubleshooting
If you encounter issues with the integration:
- Check the AI Controller logs to verify that requests from Continue.dev are reaching AI Controller
- Ensure the API key has the necessary permissions in AI Controller
- Check the API Key has been assigned the correct Provider
- Check Continue.dev logs for connection errors
- Test the AI Controller endpoint using a simple curl command to ensure it's responding correctly
For example:
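The sketch below assumes the deployment from the earlier example (AI Controller on 'localhost' port 9090 with a self-signed certificate) and an OpenAI-style chat completions request; the exact path under 'work' and the request body depend on your AI Controller deployment:

```shell
# -k accepts the self-signed certificate; the path and payload
# below are assumptions to adapt to your deployment.
curl -k https://localhost:9090/work/chat/completions \
  -H "Authorization: Bearer <Your AIC API Key>" \
  -H "Content-Type: application/json" \
  -d '{"model": "llama-3.3-70b-versatile", "messages": [{"role": "user", "content": "Hello"}]}'
```

A successful response confirms the endpoint is reachable, and the request should also appear in the AI Controller logs.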
Updated: 2025-05-15