Support options on completion calls #130
Conversation
Force-pushed from 7a73a65 to d3b8091
Needed this too 🙏
Does that make sense @crmne? If it does, I can write some tests and move the PR to Ready.
@adenta I saw your comment on the other PR, so do you think this approach is good? I'm just waiting for confirmation from folks more familiar with the gem before I put more work into it (tests, etc.).
@crmne
@crmne What do you think of this? Would love to see it merged (after conflict resolution, of course), since it would allow passing any option supported by the underlying API.
Thanks for the PR! I can see the real need here - providers keep adding unique features and we can't wrap every single one.
I like the direction, but let's fix the merge order: `render_payload(...).merge(options)` is dangerous because user options could override critical RubyLLM parameters like `model`, `messages`, `stream`, etc.
Simple fix: change it to `options.merge(render_payload(...))` instead. This way user options go in first, then RubyLLM's required parameters take precedence.
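The merge-order difference can be sketched in plain Ruby. Here `render_payload` is a stand-in for RubyLLM's internal payload builder, not its real signature:

```ruby
# Stand-in for RubyLLM's internal payload builder (hypothetical signature).
def render_payload(model:, messages:)
  { model: model, messages: messages, stream: false }
end

user_options = { temperature: 0.2, model: "user-override" }

# Dangerous order: user options clobber required parameters.
unsafe = render_payload(model: "gpt-4.1", messages: []).merge(user_options)
unsafe[:model] # => "user-override"

# Safe order: user options go in first, RubyLLM's parameters win.
safe = user_options.merge(render_payload(model: "gpt-4.1", messages: []))
safe[:model]       # => "gpt-4.1"
safe[:temperature] # => 0.2
```

`Hash#merge` gives precedence to the receiver's argument on key conflicts, so putting `render_payload` last guarantees the required keys survive.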
For tests: I want to see this tested for each provider - verify that custom options work without breaking normal chat functionality across OpenAI, Anthropic, Gemini, etc.
Make that change and add provider tests, then this is good to merge! 🚀
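The per-provider check could be sketched like this. All names here are illustrative stand-ins, not RubyLLM's actual internals; the real tests would go through each provider's payload builder:

```ruby
# Hypothetical per-provider payload builders, each applying the safe
# merge order (user options first, required parameters last).
PROVIDER_PAYLOADS = {
  openai:    ->(opts) { opts.merge(model: "gpt-4.1", messages: [], stream: false) },
  anthropic: ->(opts) { opts.merge(model: "claude-sonnet", messages: [], stream: false) },
  gemini:    ->(opts) { opts.merge(model: "gemini-pro", messages: [], stream: false) }
}

PROVIDER_PAYLOADS.each do |provider, build|
  payload = build.call(top_k: 40, model: "attempted-override")
  # Custom option must pass through untouched...
  raise "#{provider}: custom option lost" unless payload[:top_k] == 40
  # ...and required parameters must not be overridden by user options.
  raise "#{provider}: model overridden" if payload[:model] == "attempted-override"
end
```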
This is a draft PR that allows arbitrary API parameters to be passed in chat completion calls.