
Vllm provider #124

Merged
lukehinds merged 4 commits into main from vllm-provider on Nov 28, 2024

Conversation

lukehinds

Allows the injection of a custom URL:

codegate serve --vllm-url https://inference.codegate.ai

The same option exists for the other providers (they default to their current URLs). I also tested this directly against Anthropic with no issues.
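To make the behavior concrete, here is a minimal sketch of how a per-provider base URL override like `--vllm-url` could be wired into a Click-based `serve` command. Only the `--vllm-url` flag and the `https://inference.codegate.ai` example come from this PR; the other flag names, the default URLs, and the `DEFAULT_PROVIDER_URLS` mapping are illustrative assumptions, not codegate's actual implementation.

```python
import click

# Hypothetical default endpoints; the real defaults live in codegate's config.
DEFAULT_PROVIDER_URLS = {
    "vllm": "http://localhost:8000",
    "anthropic": "https://api.anthropic.com",
    "openai": "https://api.openai.com",
}


@click.command()
@click.option(
    "--vllm-url",
    default=DEFAULT_PROVIDER_URLS["vllm"],
    help="Base URL of the vLLM server to route requests to.",
)
@click.option(
    "--anthropic-url",
    default=DEFAULT_PROVIDER_URLS["anthropic"],
    help="Base URL of the Anthropic API.",
)
@click.option(
    "--openai-url",
    default=DEFAULT_PROVIDER_URLS["openai"],
    help="Base URL of the OpenAI API.",
)
def serve(vllm_url: str, anthropic_url: str, openai_url: str) -> None:
    """Start the proxy with per-provider base URLs, each overridable from the CLI."""
    provider_urls = {
        "vllm": vllm_url,
        "anthropic": anthropic_url,
        "openai": openai_url,
    }
    for name, url in provider_urls.items():
        click.echo(f"provider {name} -> {url}")
    # A real command would pass provider_urls on to the server/provider setup here.


if __name__ == "__main__":
    serve()
```

With this shape, a call such as `serve --vllm-url https://inference.codegate.ai` overrides only the vLLM endpoint while the other providers keep their defaults, which matches the behavior described above.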

@jhrozek (Contributor) left a comment


Let's merge this and I will rebase #121 atop this one.

@lukehinds lukehinds merged commit d18e34a into main Nov 28, 2024
2 checks passed
@lukehinds lukehinds deleted the vllm-provider branch November 28, 2024 21:54
Development

Successfully merging this pull request may close these issues.

[Task]: Add a vllm provider for the stacklok hosted instance
3 participants