
Use Ollama Python client for completion #241

Merged 3 commits into main on Dec 10, 2024

Conversation

aponcedeleonch
Contributor

Until now we had been using self-made calls to Ollama. The problem is that the output of those calls didn't follow the standard format of the other calls. Initially I tried to implement Ollama completion calls with LiteLLM, but their interface also seems quite broken: I was getting errors from their `acompletion` method every time I set `stream=False`. Hence I resorted to using Python's official Ollama client and implemented normalizers for it.
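For context, here is a minimal sketch of what a streamed completion through the official Ollama Python client, followed by normalization, might look like. The `normalize_chunk` helper and the OpenAI-style target shape are illustrative assumptions, not code from this PR:

```python
# A minimal sketch, not the PR's actual code: stream a completion from
# the official Ollama Python client and normalize each chunk into an
# OpenAI-style shape. `normalize_chunk` and the target format are
# illustrative assumptions.
import asyncio

from ollama import AsyncClient


def normalize_chunk(chunk) -> dict:
    # Ollama generate() chunks behave like mappings with "model",
    # "response", and "done" fields.
    return {
        "model": chunk["model"],
        "choices": [
            {
                "delta": {"content": chunk["response"]},
                "finish_reason": "stop" if chunk["done"] else None,
            }
        ],
    }


async def main() -> None:
    client = AsyncClient()  # defaults to http://localhost:11434
    stream = await client.generate(
        model="llama3",  # assumes this model is pulled locally
        prompt="Write a haiku about code review.",
        stream=True,
    )
    async for chunk in stream:
        print(normalize_chunk(chunk))


asyncio.run(main())
```

Because the official client always returns chunks in one documented shape, writing normalizers against it is straightforward compared with hand-rolled HTTP calls.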

Contributor

@ptelang left a comment


I tested this change and it works as expected. Great job!

Contributor

@ptelang left a comment


Looks good!

@aponcedeleonch merged commit a7bebae into main on Dec 10, 2024
3 checks passed