
Feature request: Add CometAPI as an LLM provider #1975

@tensornull

Description

Is your feature request related to a problem? Please describe.
Chatbot UI users need provider redundancy and cost flexibility. Relying on limited providers causes downtime during rate limits or outages. CometAPI (OpenAI-compatible) would improve reliability without changing existing workflows.

Describe the solution you'd like

  • Add CometAPI as a first-class LLM provider, or support a custom baseURL in the existing OpenAI integration
  • Support chat features: text generation, model selection, API key management, streaming
  • Environment variables like existing providers (e.g., COMETAPI_API_KEY)
  • Filter non-chat models from CometAPI's model list
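As a sketch of the environment-variable point above: assuming a `COMETAPI_API_KEY` variable and a hypothetical `https://api.cometapi.com/v1` base URL (the real endpoint should come from CometAPI's documentation), the provider config could be read the same way existing providers do:

```typescript
// Sketch only: COMETAPI_API_KEY and the base URL below are assumptions,
// not confirmed names from CometAPI or Chatbot UI.
interface ProviderConfig {
  apiKey: string;
  baseURL: string;
}

function cometApiConfig(env: Record<string, string | undefined>): ProviderConfig {
  const apiKey = env["COMETAPI_API_KEY"];
  if (!apiKey) {
    // Fail early, mirroring how missing keys for other providers are handled.
    throw new Error("COMETAPI_API_KEY is not set");
  }
  // Hypothetical endpoint; substitute the documented CometAPI base URL.
  return { apiKey, baseURL: "https://api.cometapi.com/v1" };
}
```

Because the endpoints are OpenAI-compatible, this config could be passed straight to the existing OpenAI client path (e.g. the `baseURL` option of the official `openai` package) without new request/streaming code.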

Describe alternatives you've considered

  • Using the OpenAI integration with a custom baseURL: works, but is not discoverable for most users
  • Third-party proxies: add external dependencies
  • Sticking with current providers only: limits resilience

Additional context
Compatibility:

  • OpenAI-compatible endpoints, enabling seamless integration
  • Supports streaming responses for real-time chat
  • Filter needed for non-chat models (image/audio/video generation)
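The non-chat-model filter mentioned above could be sketched as below. The keyword heuristics are assumptions for illustration; a real implementation would prefer model metadata from CometAPI's model list, if it exposes any:

```typescript
// Sketch: keep only chat-capable models from a /v1/models-style response.
interface ModelInfo {
  id: string;
}

// Assumed id substrings that indicate image/audio/video/embedding models;
// the actual list would be tuned against CometAPI's catalog.
const NON_CHAT_HINTS = ["dall-e", "whisper", "tts", "embedding", "image", "audio", "video"];

function filterChatModels(models: ModelInfo[]): ModelInfo[] {
  return models.filter(
    (m) => !NON_CHAT_HINTS.some((hint) => m.id.toLowerCase().includes(hint))
  );
}
```

For example, filtering `[{ id: "gpt-4o" }, { id: "whisper-1" }, { id: "dall-e-3" }]` would keep only `gpt-4o`.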

Implementation Offer
If the maintainers are open to it, we can implement this and submit a PR with documentation, examples, and test API keys for verification.
