This repository was archived by the owner on Jun 5, 2025. It is now read-only.
While LiteLLM is very useful (huge coverage of providers), we invariably hit issues with certain idiomatic approaches LiteLLM takes, and feel we need to take a different approach. It therefore makes sense to build our own native implementation.
If possible, we should initially seek to support both LiteLLM and the CodeGate implementation (if viable and not too heavy a lift).
What we gain from LiteLLM is an easy way to talk to multiple providers. But since we receive provider-native data, while LiteLLM expects to receive OpenAI-formatted data and replies with OpenAI-formatted data, we still need to implement our own normalizers.
We also get behaviour like LiteLLM throwing away some data fields. This has not been a problem so far, but might be in the future.
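To make the normalizer point concrete, here is a minimal sketch of the kind of translation layer described above. The field names follow the publicly documented Anthropic Messages and OpenAI Chat Completions response shapes, but the function itself is a hypothetical illustration, not CodeGate's actual implementation:

```python
# Hypothetical normalizer sketch: map a provider-native (Anthropic-style)
# response dict into an OpenAI-style chat completion dict. Illustrative
# only -- not CodeGate's real code.

def normalize_anthropic_to_openai(native: dict) -> dict:
    """Normalize an Anthropic-style response into an OpenAI-style one.

    Only a few common fields are mapped here; a real normalizer would
    need to handle every field to avoid the data-loss issue noted above.
    """
    # Anthropic responses carry a list of content blocks; join the text ones.
    text = "".join(
        block.get("text", "")
        for block in native.get("content", [])
        if block.get("type") == "text"
    )
    return {
        "id": native.get("id"),
        "object": "chat.completion",
        "model": native.get("model"),
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": text},
                "finish_reason": native.get("stop_reason"),
            }
        ],
        # Keep fields with no OpenAI equivalent so nothing is silently
        # thrown away -- the exact failure mode described above.
        "provider_extra": {
            k: v for k, v in native.items()
            if k not in {"id", "model", "content", "stop_reason"}
        },
    }
```

The `provider_extra` bag is the design choice that distinguishes this from LiteLLM's behaviour: instead of dropping provider-specific fields that have no OpenAI equivalent, they are carried along untouched.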
Not knocking your efforts, LiteLLM is an awesome project and you have a lot of support coverage to maintain! We just need something a bit more specific to our needs.