Is your feature request related to a problem? Please describe.
Chatbot UI users need provider redundancy and cost flexibility. Relying on a limited set of providers causes downtime during rate limits or outages. Adding CometAPI (OpenAI-compatible) would improve reliability without changing existing workflows.
Describe the solution you'd like
- Add CometAPI as a first-class LLM provider, or support a custom `baseURL` in the OpenAI integration
- Support chat features: text generation, model selection, API key management, streaming
- Use environment variables like existing providers (e.g., `COMETAPI_API_KEY`)
- Filter non-chat models from CometAPI's model list
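The provider and environment-variable points above could look roughly like the sketch below. The `ProviderConfig` shape and field names are illustrative assumptions, not Chatbot UI's actual types; only the base URL and `COMETAPI_API_KEY` come from this request.

```typescript
// Illustrative provider entry alongside existing ones (field names are assumptions).
interface ProviderConfig {
  name: string;
  baseURL: string;
  apiKeyEnvVar: string;
  supportsStreaming: boolean;
}

const cometApiProvider: ProviderConfig = {
  name: "CometAPI",
  baseURL: "https://api.cometapi.com/v1",
  apiKeyEnvVar: "COMETAPI_API_KEY",
  supportsStreaming: true,
};

// Resolve the key from the environment, the same way existing providers do.
function resolveApiKey(
  p: ProviderConfig,
  env: Record<string, string | undefined>,
): string {
  const key = env[p.apiKeyEnvVar];
  if (!key) throw new Error(`${p.apiKeyEnvVar} is not set`);
  return key;
}
```

In practice `env` would be `process.env`; it is a parameter here only to keep the sketch self-contained.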
Describe alternatives you've considered
- Using the OpenAI integration with a custom `baseURL`: works, but not obvious to users
- Third-party proxies: add extra dependencies
- Current providers only: limits resilience
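The first alternative above (custom `baseURL`) can be sketched with a plain `fetch` call against the OpenAI-compatible endpoint. The base URL is the one listed in the resources below; the helper names and the idea of passing the key resolved from `COMETAPI_API_KEY` are illustrative assumptions, not Chatbot UI code.

```typescript
// Sketch: pointing an OpenAI-compatible chat request at CometAPI.
const COMETAPI_BASE_URL = "https://api.cometapi.com/v1";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Builds the JSON body for POST {baseURL}/chat/completions.
function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  stream = true,
) {
  return { model, messages, stream };
}

// Assumed usage; apiKey would come from the COMETAPI_API_KEY env var.
async function chat(apiKey: string, model: string, messages: ChatMessage[]) {
  const res = await fetch(`${COMETAPI_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  if (!res.ok) throw new Error(`CometAPI error: ${res.status}`);
  return res; // stream: true returns server-sent events for real-time chat
}
```

Because the endpoint shape matches OpenAI's, the existing OpenAI client path should also work unchanged once it accepts this `baseURL`.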
Additional context
CometAPI Resources:
- Website: https://www.cometapi.com/
- API Documentation: https://api.cometapi.com/doc
- Base URL: https://api.cometapi.com/v1/
- Model List: https://api.cometapi.com/v1/models
- Get API Key: https://api.cometapi.com/console/token
- Pricing: https://api.cometapi.com/pricing
Compatibility:
- OpenAI-compatible endpoints, enabling seamless integration
- Supports streaming responses for real-time chat
- Filter needed for non-chat models (image/audio/video generation)
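The filtering point above could be handled with a simple keyword check on the `GET /v1/models` response. The keyword list here is an assumption for illustration; CometAPI's actual model ids and metadata may differ, so a maintained allow-list may be preferable.

```typescript
// Illustrative filter for the GET /v1/models response.
interface ModelEntry {
  id: string;
}

// Assumed heuristics for non-chat (image/audio/video/embedding) models.
const NON_CHAT_HINTS = ["dall-e", "whisper", "tts", "embedding", "image", "audio", "video"];

function filterChatModels(models: ModelEntry[]): ModelEntry[] {
  return models.filter(
    (m) => !NON_CHAT_HINTS.some((hint) => m.id.toLowerCase().includes(hint)),
  );
}
```

For example, filtering `[{ id: "gpt-4o" }, { id: "dall-e-3" }, { id: "whisper-1" }]` would keep only the chat model.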
Implementation Offer
If maintainers are open to it, we can implement this and submit a PR containing documentation, examples, and test API keys for verification.