v1.0.2 Release
This release delivers major enhancements to the Prompt an LLM (PaLLM) interface, including multi-turn conversation support, optimized response handling for all supported providers, and smarter provider configuration that reduces setup errors.
Improvements
Improved PaLLM Interface
The Prompt an LLM page has been completely redesigned to provide a chat-like experience. Conversations now maintain context across multiple prompts, responses display as clean, readable text, and the interface supports the OpenAI, Anthropic, and Google formats seamlessly. A sketch of how context is carried across turns follows the list below.
Key Benefits:
- Natural conversation flow with maintained context
- Clean, formatted responses without technical artifacts
- Multi-provider support with optimized streaming
- Session-based conversation persistence
- Enhanced Gemini streaming compatibility
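As a rough illustration of the conversation flow, the sketch below keeps a session-scoped message history in the OpenAI-style chat format and sends the full history with each new prompt. The `ConversationSession` class and its field names are illustrative assumptions, not the actual implementation.

```python
# Illustrative sketch only: a session-scoped message history, assuming an
# OpenAI-style "messages" payload. Class and field names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class ConversationSession:
    """Holds the running message history for one PaLLM session."""
    messages: list[dict] = field(default_factory=list)

    def add_prompt(self, text: str) -> list[dict]:
        """Append the user's prompt and return the full context to send."""
        self.messages.append({"role": "user", "content": text})
        return self.messages

    def add_response(self, text: str) -> None:
        """Record the assistant's reply so the next prompt keeps context."""
        self.messages.append({"role": "assistant", "content": text})


# Usage: each new prompt is sent together with the prior turns.
session = ConversationSession()
payload = session.add_prompt("Summarize the release notes.")
# ... send payload to the provider, then:
session.add_response("The release adds conversation support ...")
```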
Optimized Response Handling
Streamed responses from language models are now consolidated into single, readable cache entries. This improves both storage efficiency and response readability across all supported providers; a sketch of the consolidation step follows the list below.
Performance Gains:
- Cleaner cache entries without streaming artifacts
- Improved storage efficiency for long responses
- Better search and retrieval performance
- Support for all major streaming formats
- Unified handling for OpenAI, Anthropic, and Google responses
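The sketch below shows one way streamed chunks can be collapsed into a single cache entry. The chunk layouts follow the providers' publicly documented streaming formats (OpenAI `choices[].delta.content`, Anthropic `content_block_delta` events, Gemini `candidates[].content.parts[].text`); the function itself is an illustrative assumption, not the actual backend code.

```python
# Illustrative sketch only: collapse streamed chunks into one cache entry.
# Chunk layouts follow the providers' documented streaming formats; the
# function name and structure are hypothetical.
def consolidate_stream(chunks: list[dict], provider_format: str) -> str:
    """Join per-chunk text deltas into a single readable response."""
    parts: list[str] = []
    for chunk in chunks:
        if provider_format == "openai":
            # OpenAI: {"choices": [{"delta": {"content": "..."}}]}
            delta = chunk.get("choices", [{}])[0].get("delta", {})
            parts.append(delta.get("content") or "")
        elif provider_format == "anthropic":
            # Anthropic: {"type": "content_block_delta", "delta": {"text": "..."}}
            if chunk.get("type") == "content_block_delta":
                parts.append(chunk.get("delta", {}).get("text", ""))
        elif provider_format == "google":
            # Gemini: {"candidates": [{"content": {"parts": [{"text": "..."}]}}]}
            for candidate in chunk.get("candidates", []):
                for part in candidate.get("content", {}).get("parts", []):
                    parts.append(part.get("text", ""))
    return "".join(parts)
```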
Smarter Provider Configuration
The Auth Method field in provider configuration now pre-populates automatically based on the selected Provider Format, reducing configuration errors while keeping manual overrides available for custom setups (see the sketch after the list below).
Automatic Mappings:
- OpenAI format → Bearer authentication
- Google format → API key authentication
- Anthropic format → x-api-key header
- Manual override always available when needed
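A minimal sketch of the format-to-auth-method mapping described above, with the manual override taking precedence. The function, dictionary, and value names are illustrative assumptions rather than the product's actual configuration code.

```python
# Illustrative sketch only: pre-populate Auth Method from Provider Format,
# while still allowing a manual override. Names are hypothetical.
DEFAULT_AUTH_BY_FORMAT = {
    "openai": "bearer",        # Authorization: Bearer <token>
    "google": "api_key",       # key passed as an API key parameter
    "anthropic": "x-api-key",  # x-api-key request header
}


def resolve_auth_method(provider_format: str, override: str | None = None) -> str:
    """Return the manual override if given, otherwise the format's default."""
    if override:
        return override
    return DEFAULT_AUTH_BY_FORMAT.get(provider_format, "bearer")


# Usage: selecting the Anthropic format pre-fills x-api-key header auth.
assert resolve_auth_method("anthropic") == "x-api-key"
assert resolve_auth_method("google", override="bearer") == "bearer"
```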
Bug Fixes
Administrator Account Protection
Additional safeguards now protect the default system administrator account against accidental role changes, ensuring uninterrupted access to system administration.
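A minimal sketch of the kind of guard such a safeguard implies, assuming the default administrator account is identified by the username `admin`; the names, roles, and error type here are assumptions, not the actual implementation.

```python
# Illustrative sketch only: block role changes on the default admin account.
# The account name, role value, and error type are assumptions.
DEFAULT_ADMIN_USERNAME = "admin"  # assumed identifier for the protected account


def change_role(username: str, new_role: str) -> str:
    """Apply a role change unless it would demote the default admin account."""
    if username == DEFAULT_ADMIN_USERNAME and new_role != "administrator":
        raise PermissionError(
            "The default administrator account cannot be demoted."
        )
    # ... persist the role change for other accounts ...
    return new_role
```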
Response Streaming Compatibility
Streaming support for Gemini providers has been improved, and compatibility has been enhanced across language models, ensuring smooth real-time response delivery for all supported providers.
Technical Notes
This release includes updates to the following components:
- Complete frontend redesign of the PaLLM interface
- Backend response consolidation system for streamed content
- Provider configuration intelligence
- Security improvements for administrator accounts
This release contains no breaking changes for administrators planning to upgrade. All existing configurations will continue to work as expected.
Important: This release adds new database columns. Please run the MigrationsRunner to update your database schema after upgrading.
Updated: 2025-08-12