# RubyLLM
**One beautiful Ruby API for GPT, Claude, Gemini, and more.**

Battle tested at [Chat with Work](https://chatwithwork.com) — Your AI coworker

> [!NOTE]
> Using RubyLLM? Share your story! Takes 5 minutes.
Build chatbots, AI agents, RAG applications. Works with OpenAI, xAI, Anthropic, Google, AWS, local models, and any OpenAI-compatible API.
https://github.com/user-attachments/assets/65422091-9338-47da-a303-92b918bd1345
Every AI provider ships their own bloated client. Different APIs. Different response formats. Different conventions. It's exhausting.
RubyLLM gives you one beautiful API for all of them. Same interface whether you're using GPT, Claude, or your local Ollama. Just three dependencies: Faraday, Zeitwerk, and Marcel. That's it.
```ruby
# Just ask questions
chat = RubyLLM.chat
chat.ask "What's the best way to learn Ruby?"

# Analyze any file type
chat.ask "What's in this image?", with: "ruby_conf.jpg"
chat.ask "What's happening in this video?", with: "video.mp4"
chat.ask "Describe this meeting", with: "meeting.wav"
chat.ask "Summarize this document", with: "contract.pdf"
chat.ask "Explain this code", with: "app.rb"

# Multiple files at once
chat.ask "Analyze these files", with: ["diagram.png", "report.pdf", "notes.txt"]

# Stream responses
chat.ask "Tell me a story about Ruby" do |chunk|
  print chunk.content
end

# Generate images
RubyLLM.paint "a sunset over mountains in watercolor style"

# Create embeddings
RubyLLM.embed "Ruby is elegant and expressive"

# Transcribe audio to text
RubyLLM.transcribe "meeting.wav"

# Moderate content for safety
RubyLLM.moderate "Check if this text is safe"

# Let AI use your code
class Weather < RubyLLM::Tool
  description "Get current weather"
  param :latitude
  param :longitude

  def execute(latitude:, longitude:)
    url = "https://api.open-meteo.com/v1/forecast?latitude=#{latitude}&longitude=#{longitude}&current=temperature_2m,wind_speed_10m"
    JSON.parse(Faraday.get(url).body)
  end
end

chat.with_tool(Weather).ask "What's the weather in Berlin?"

# Define an agent with instructions + tools
class WeatherAssistant < RubyLLM::Agent
  model "gpt-5-nano"
  instructions "Be concise and always use tools for weather."
  tools Weather
end

WeatherAssistant.new.ask "What's the weather in Berlin?"

# Get structured output
class ProductSchema < RubyLLM::Schema
  string :name
  number :price
  array :features do
    string
  end
end

response = chat.with_schema(ProductSchema).ask "Analyze this product", with: "product.txt"
```
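With a schema attached, the reply arrives as structured data rather than free-form prose. A minimal sketch of consuming such a payload — the JSON string below is a hypothetical example shaped like `ProductSchema`, not a real API response:

```ruby
require "json"

# Hypothetical payload matching ProductSchema (illustrative only)
raw = '{"name":"Widget","price":9.99,"features":["compact","durable"]}'
product = JSON.parse(raw)

puts product["name"]                 # prints "Widget"
puts product["features"].join(", ")  # prints "compact, durable"
```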
Core interfaces: `RubyLLM.chat`, `RubyLLM.transcribe`, `RubyLLM.paint`, `RubyLLM.embed`, `RubyLLM.moderate`, `RubyLLM::Agent`, and `acts_as_chat` for Rails.

Add to your Gemfile:
```ruby
gem 'ruby_llm'
```
Then run `bundle install`.
Configure your API keys:
```ruby
# config/initializers/ruby_llm.rb
RubyLLM.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
end
```
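Other providers follow the same pattern. A sketch of a multi-provider initializer, assuming option names of the form `<provider>_api_key` — check the configuration docs for the exact names your version supports:

```ruby
# config/initializers/ruby_llm.rb — multi-provider sketch (option names assumed)
RubyLLM.configure do |config|
  config.openai_api_key    = ENV['OPENAI_API_KEY']
  config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
  config.gemini_api_key    = ENV['GEMINI_API_KEY']
end
```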
```bash
# Install the Rails integration
bin/rails generate ruby_llm:install
bin/rails db:migrate
bin/rails ruby_llm:load_models # v1.13+

# Add a chat UI (optional)
bin/rails generate ruby_llm:chat_ui
```
```ruby
class Chat < ApplicationRecord
  acts_as_chat
end

chat = Chat.create! model: "claude-sonnet-4"
chat.ask "What's in this file?", with: "report.pdf"
```
Visit http://localhost:3000/chats for a ready-to-use chat interface!
See CONTRIBUTING.md.
Released under the MIT License.