AiToolkit provides a simple Ruby DSL for interacting with Anthropic's Claude models. It can talk directly to the Claude HTTP API or via AWS Bedrock.
Add this line to your application's Gemfile:
```ruby
gem 'ai_toolkit'
```

And then execute:
```
bundle install
```

Or install it yourself as:
```
gem install ai_toolkit
```

Create a client with a provider and build requests with the block DSL:

```ruby
client = AiToolkit::Client.new(provider)
response = client.request do |c|
  c.system_prompt 'My Prompt'
  c.message :user, 'Hello'
  c.tool :example_tool, {}
end
```

For Claude's built-in server-side tools, just provide the tool name and any configuration options:
```ruby
client.request do |c|
  c.tool :web_search, max_uses: 3, allowed_domains: ['example.com']
end
```

The response object exposes the stop reason and a chronologically ordered list of results via `#results`. Each element of this array is one of three objects: `AiToolkit::MessageResult` for LLM messages, `AiToolkit::ToolRequest` for tool calls requested by the LLM, and `AiToolkit::ToolResponse` for the data returned back to the model from executed tools. Each response also provides `#execution_time`, the number of seconds spent performing the LLM call.
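A common pattern is to branch on each result's class when walking the transcript. The sketch below illustrates this with locally defined stand-in classes (their attribute names are illustrative; the real objects come from the gem):

```ruby
# Stand-ins for illustration only; the real classes are
# AiToolkit::MessageResult, AiToolkit::ToolRequest and AiToolkit::ToolResponse,
# and their attributes may differ.
MessageResult = Struct.new(:text)
ToolRequest   = Struct.new(:name, :input)
ToolResponse  = Struct.new(:name, :data)

# Walk a chronologically ordered results array, as returned by #results,
# and render each entry as a one-line summary.
def summarize(results)
  results.map do |r|
    case r
    when MessageResult then "assistant: #{r.text}"
    when ToolRequest   then "tool call: #{r.name}(#{r.input})"
    when ToolResponse  then "tool result: #{r.name} -> #{r.data}"
    end
  end
end

summarize([
  ToolRequest.new(:example_tool, {}),
  ToolResponse.new(:example_tool, 'ok'),
  MessageResult.new('Done!')
])
# => ["tool call: example_tool({})", "tool result: example_tool -> ok", "assistant: Done!"]
```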
Requests automatically loop through tool calls. A basic request looks like:

```ruby
response = client.request do |c|
  c.system_prompt 'My Prompt'
  c.message :user, 'First message'
  c.tool MyToolObject
end
```

You can override the maximum number of tokens sent to the provider and the iteration limit for tool looping. A specific tool can be forced by passing a `tool_choice` hash:
```ruby
client.request(max_tokens: 2048, max_iterations: 10,
               tool_choice: { type: 'tool', name: 'example_tool' }) do |c|
  c.message :user, 'Hello'
end
```

Additional generation options such as `temperature`, `top_k`, and `top_p` can also be specified; they are passed directly through to the provider:
```ruby
client.request(temperature: 0.2, top_k: 5, top_p: 0.9) do |c|
  c.message :user, 'Hello'
end
```

A tool may terminate further LLM calls by raising `AiToolkit::StopToolLoop` from `#perform`.
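As a sketch, a tool that halts the loop might look like the following. The `#perform` signature shown here is an assumption; only the "raise `AiToolkit::StopToolLoop` to end the loop" behaviour is documented above, and the `StopToolLoop` stand-in is defined locally so the snippet runs without the gem:

```ruby
# Stand-in for AiToolkit::StopToolLoop so this sketch is self-contained;
# in real code the gem provides this class.
module AiToolkit
  class StopToolLoop < StandardError; end
end

# Hypothetical tool: once it has a final answer there is no need for
# further LLM calls, so it raises StopToolLoop.
class AnswerTool
  def perform(input)
    raise AiToolkit::StopToolLoop if input[:final]
    'keep going'
  end
end

tool = AnswerTool.new
tool.perform(final: false)  # => "keep going"
begin
  tool.perform(final: true)
rescue AiToolkit::StopToolLoop
  # the client catches this and stops iterating
end
```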
See `lib/ai_toolkit/providers/claude.rb` and `lib/ai_toolkit/providers/bedrock.rb` for the provider implementations. A simple fake provider for testing is available in `lib/ai_toolkit/providers/fake.rb`.
To use the Bedrock provider you need the `aws-sdk-bedrockruntime` gem. This gem is not included in ai_toolkit's runtime dependencies, so install it separately when required:

```
gem install aws-sdk-bedrockruntime
```

When building the Docker image for testing with docker-compose, the gem is installed automatically via Bundler's `docker` group:

```
docker-compose build
docker-compose up
```

Run the test suite with:

```
rake test
```
`AiToolkit::Client` allows registering callbacks before and after each provider call:

```ruby
client.before_request do |req, model:, provider:|
  # inspect or modify `req`
end

client.after_request do |req, res, model:, provider:|
  # inspect request and response
end
```

The before hook may modify the request hash. Errors raised by the before hook propagate and abort the LLM request. Errors raised in the after hook are swallowed but will stop any automatic tool loop.
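These hook semantics can be sketched with a minimal stand-in client (an illustration of the documented behaviour, not the gem's implementation; the `MiniClient` class and its fake provider call are invented for this sketch):

```ruby
# Minimal stand-in illustrating the documented hook semantics:
# before-hook errors propagate, after-hook errors are swallowed.
class MiniClient
  def initialize
    @before = []
    @after  = []
  end

  def before_request(&blk) = @before << blk
  def after_request(&blk)  = @after << blk

  def request(req, model: 'model', provider: :fake)
    # Before hooks may mutate req; any error here aborts the request.
    @before.each { |h| h.call(req, model: model, provider: provider) }
    res = { echo: req }  # pretend provider call
    @after.each do |h|
      h.call(req, res, model: model, provider: provider)
    rescue StandardError
      # swallowed; in the real client this also stops the tool loop
    end
    res
  end
end

client = MiniClient.new
client.before_request { |req, model:, provider:| req[:tagged] = true }
client.after_request  { |_req, _res, model:, provider:| raise 'ignored' }
client.request({ message: 'Hello' })
# => { echo: { message: 'Hello', tagged: true } }
```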