WIP - Support thinking mode for Anthropic models #170


Draft: wants to merge 8 commits into main

Conversation

@rhys117 (Contributor) commented May 14, 2025

Resolves #154

Description

Add thinking support for Anthropic models.

Implementation Details

  • Set the thinking capability on supported models in models.json
  • Expose the thinking response on Message via .thinking_content
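
The second bullet could be sketched as below. This is a hypothetical illustration only: the attribute name (`thinking_content`) comes from the PR description, while the constructor shape and the `thinking?` predicate are invented here and may not match RubyLLM's actual `Message` internals.

```ruby
# Hypothetical sketch of exposing the thinking block on Message.
# Only thinking_content is named in the PR; the rest is assumed.
module RubyLLMSketch
  class Message
    attr_reader :content, :thinking_content

    def initialize(content:, thinking_content: nil)
      @content = content
      @thinking_content = thinking_content
    end

    # Convenience check: was a thinking block returned at all?
    def thinking?
      !thinking_content.nil? && !thinking_content.empty?
    end
  end
end
```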

Testing

So far, this has been tested manually.

bin/console
3.2.2 :001 > chat = RubyLLM::Chat.new(model: 'claude-3-7-sonnet-20250219').with_reasoning.with_temperature(1.0)
 =>
#<RubyLLM::Chat:0x0000000105cc97e0
...
3.2.2 :002 > chat.ask("Short story about dog")
 =>
#<RubyLLM::Message:0x0000000102851198
 @content=
  #<RubyLLM::Content:0x0000000105ba2290
   @attachments=[],
   @text=
    "# The Old Trail\n\nMax's paws had worn this forest path for twelve years. His graying muzzle dipped low, sniffing familiar scents as he limped slightly beside Sarah.\n\n\"Easy, old boy,\" Sarah said, slowing her pace. Once, Max had pulled her eagerly along this trail, but age had mellowed his golden retriever enthusiasm.\n\nThey reached the overlook where the lake sparkled below. Sarah sat on their usual boulder, and Max settled at her feet with a contented sigh. This was their ritual since Sarah's difficult divorce—the loyal companion who'd never left her side.\n\nA rustling in the bushes caught Max's attention. His ears perked up as a small child emerged, tears streaming down her face.\n\n\"I can't find my mommy,\" she sobbed.\n\nDespite his arthritis, Max rose and gently nudged the girl's hand. Sarah knelt beside her, \"We'll help you. Max has the best nose in the county.\"\n\nMax led them steadily through the trees, following a scent only he could detect. When they reached a frantic woman calling out \"Emma!\", the little girl ran forward with a cry of relief.\n\nLater that evening, Max received an extra treat. As he curled up on his bed, Sarah whispered, \"Some things age, but a good heart only grows stronger.\"\n\nMax thumped his tail in agreement, dreams of tomorrow's walk already dancing in his head.">,
 @input_tokens=39,
 @model_id="claude-3-7-sonnet-20250219",
 @output_tokens=405,
 @reasoning_content=
  "I'm being asked to write a short story about a dog. I'll craft a brief, heartwarming story with a canine protagonist that has some character development and a positive message. I'll make sure it has a clear beginning, middle, and end, and create an emotional connection to the dog character. I'll keep it family-friendly and accessible to a general audience.",
 @role=:assistant,
 @tool_call_id=nil,
 @tool_calls=nil>

TODO:

  • Make it more configurable
  • Ensure temperature is configurable
  • Ensure support for acts_as persistent models

@@ -39,36 +39,51 @@ def build_base_payload(chat_messages, temperature, model, stream)
{
  model: model,
  messages: chat_messages.map { |msg| format_message(msg) },
  temperature: temperature,

Review comment from @rhys117 (Contributor, Author):

TODO:
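
The hunk above shows only the unchanged prefix of the payload. A minimal sketch of how a thinking block might be merged in: the `thinking: { type: "enabled", budget_tokens: N }` shape is Anthropic's documented request parameter, but the surrounding class, its constructor, and the default budget are assumptions for illustration.

```ruby
# Sketch: merging Anthropic's extended-thinking parameter into the
# request payload. PayloadBuilder and its options are invented names;
# the :thinking hash shape follows Anthropic's Messages API.
class PayloadBuilder
  def initialize(thinking: false, budget: 1024)
    @thinking = thinking
    @budget = budget
  end

  def build_base_payload(chat_messages, temperature, model, stream)
    payload = {
      model: model,
      messages: chat_messages,
      temperature: temperature,
      stream: stream
    }
    # Anthropic expects thinking: { type: "enabled", budget_tokens: N }
    payload[:thinking] = { type: 'enabled', budget_tokens: @budget } if @thinking
    payload
  end
end
```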

@rhys117 commented Jun 6, 2025

Hey @crmne, can you please let me know if this looks to be heading in a direction that you're happy with architecturally?

Cheers!

@crmne (Owner) left a comment

Hey @rhys117!

Thanks for tackling this feature!

Issue: Inconsistent Naming

We've got with_reasoning(), reasoning_content, default_reasoning_budget, but Anthropic's API calls it "thinking" - and we're mixing both terms in capabilities. This breaks our clean API pattern.

Solution: Standardize on "thinking"

Let's match Anthropic's terminology throughout:

# Method
def with_thinking(budget: nil)
  # budget parameter instead of separate config call
end

# Response  
response.thinking     # not reasoning_content
response.thinking?    # boolean check

# Config
default_thinking_budget  # not reasoning

# Error
UnsupportedThinkingError

Usage:

chat = RubyLLM.chat(model: 'claude-3-7-sonnet')
  .with_thinking(budget: 2048)
  .with_temperature(1.0)

response = chat.ask("Short story about a dog")
puts response.thinking
puts response.content  

This keeps our API consistent and beautiful. Thoughts?
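
The proposed API could be sketched as a minimal chainable implementation. Everything here beyond the public shape the review specifies (`with_thinking(budget:)`, a `thinking?` check, chainability with `with_temperature`) is invented for illustration and is not RubyLLM's actual code.

```ruby
# Sketch of the chainable with_thinking API proposed above.
# Internal state and the default budget are assumptions.
class ChatSketch
  attr_reader :thinking_budget, :temperature

  def initialize(model:)
    @model = model
    @thinking_budget = nil
    @temperature = nil
  end

  def with_thinking(budget: 1024) # default budget is an assumption
    @thinking_budget = budget
    self # return self so calls chain
  end

  def with_temperature(value)
    @temperature = value
    self
  end

  def thinking?
    !@thinking_budget.nil?
  end
end
```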

@rhys117 commented Jun 12, 2025

That all sounds good to me; I'll take a look at this again in the coming days.

Successfully merging this pull request may close these issues.

Claude 3.7 Thinking support and budget