# Upgrading
{: .no_toc }

{{ page.description }}
{: .fs-6 .fw-300 }

## Table of contents
{: .no_toc .text-delta }

1. TOC
{:toc}
## Upgrading to v1.14

```bash
# Run the upgrade generator
bin/rails generate ruby_llm:upgrade_to_v1_14

# Run migrations
bin/rails db:migrate
```

That's it! Among other features, the generator:

- Changes `thought_signature` on tool calls from string to text
## Upgrading to v1.10

```bash
# Run the upgrade generator
bin/rails generate ruby_llm:upgrade_to_v1_10

# Run migrations
bin/rails db:migrate
```

That's it! Among other features, the generator:

- Adds `thinking_text` and `thinking_signature` for storing extended thinking output
- Adds `thinking_tokens` for tracking thinking token usage
- Adds `thought_signature` to tool calls for Gemini 3 Pro function calling
## Upgrading to v1.9

```bash
# Run the upgrade generator
bin/rails generate ruby_llm:upgrade_to_v1_9

# Run migrations
bin/rails db:migrate
```

That's it! Among other features, the generator:

- Adds `cached_tokens` and `cache_creation_tokens` columns for tracking accessed cached tokens and created cache tokens, respectively
- Adds a `content_raw` column for the new [Raw Content Blocks]({% link _core_features/chat.md %}#raw-content-blocks) feature
## Upgrading to v1.7

Upgrade to the DB-backed model registry for better data integrity and rich model metadata.

```bash
# Run the upgrade generator
bin/rails generate ruby_llm:upgrade_to_v1_7

# Run migrations
bin/rails db:migrate
```

That's it! Among other features, the generator:

- Adds `config.use_new_acts_as = true` to your initializer
- Updates your `acts_as` declarations to the new version
- Converts `model_id` string columns to model associations (the old string values are preserved as `model_id_string`)

If you're using custom model names:

```bash
bin/rails generate ruby_llm:upgrade_to_v1_7 chat:Conversation message:ChatMessage tool_call:MyToolCall model:MyModel
bin/rails db:migrate
```
Your existing 1.6 app continues working without any changes. You'll see a deprecation warning on Rails boot:

```
!!! RubyLLM's legacy acts_as API is deprecated and will be removed in RubyLLM 2.0.0.
```
## What You Get
{: .d-inline-block }

v1.7.0+
{: .label .label-green }

Among other features, the DB-backed model registry replaces simple string fields with proper ActiveRecord associations. Additionally, the `acts_as` helpers have been redesigned with a more Rails-like API.

**New Rails-like `acts_as` API**

```ruby
# New API uses association names as primary parameters
acts_as_chat messages: :messages, model: :model
acts_as_message chat: :chat, tool_calls: :tool_calls, model: :model

# vs legacy API, which required explicit class names
acts_as_chat message_class: 'Message', tool_call_class: 'ToolCall'
acts_as_message chat_class: 'Chat', chat_foreign_key: 'chat_id'
```
**Rich model metadata**

```ruby
chat.model.name             # => "GPT-4"
chat.model.context_window   # => 128000
chat.model.supports_vision  # => true
chat.model.input_token_cost # => 2.50
```
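Metadata like this makes quick cost estimates straightforward. A minimal sketch, assuming `input_token_cost` is quoted per million input tokens (the unit is an assumption here; verify it against your provider's pricing):

```ruby
# Hypothetical helper: estimate spend from per-million-token pricing,
# e.g. the 2.50 reported by chat.model.input_token_cost above.
# The per-million unit is an assumption; check your provider's docs.
def estimate_input_cost(tokens, cost_per_million)
  tokens / 1_000_000.0 * cost_per_million
end

puts estimate_input_cost(128_000, 2.50) # => 0.32
```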
**Provider routing**

```ruby
Chat.create!(model: "{{ site.models.anthropic_current }}", provider: "bedrock")
```
**Model associations and queries**

```ruby
Chat.joins(:model).where(models: { provider: 'anthropic' })
Model.select { |m| m.supports_functions? } # Use delegated methods
```
**Model alias resolution**

```ruby
Chat.create!(model: "{{ site.models.default_chat }}", provider: "openrouter") # Resolves to openai/{{ site.models.default_chat }} automatically
```
**Usage tracking**

```ruby
Model.joins(:chats).group(:id).order('COUNT(chats.id) DESC')
```
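The query above ranks models by how many chats reference them, entirely in SQL. For intuition, the same "group by model, order by count descending" logic expressed over plain Ruby data (illustrative only, not how the query executes):

```ruby
# Plain-Ruby restatement of the SQL grouping above.
chats = [
  { model_id: 1 }, { model_id: 2 }, { model_id: 1 }
]
usage = chats.map { |c| c[:model_id] }.tally.sort_by { |_, count| -count }
p usage # => [[1, 2], [2, 1]]
```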
## What Still Works
{: .d-inline-block }

Legacy mode
{: .label .label-yellow }

**Legacy `acts_as` API** - still uses the old parameter style

```ruby
acts_as_chat message_class: 'Message', tool_call_class: 'ToolCall'
acts_as_message chat_class: 'Chat', tool_call_class: 'ToolCall'
```

**Basic functionality** - all core RubyLLM features work

```ruby
chat.ask("Hello!") # Works fine
chat.model_id # => "{{ site.models.openai_standard }}" (string only, no metadata)
```
Legacy mode is limited to string-based model IDs: core chat functionality keeps working, but you don't get the rich metadata or model associations described above.
## Migrating Custom Model Names

If you're using custom model names (e.g., `Conversation` instead of `Chat`), you may need to update your `acts_as` declarations to the new API:
Before (1.6):

```ruby
class Conversation < ApplicationRecord
  acts_as_chat message_class: 'ChatMessage', tool_call_class: 'AIToolCall'
end

class ChatMessage < ApplicationRecord
  acts_as_message chat_class: 'Conversation', chat_foreign_key: 'conversation_id'
end
```
After (1.7):

```ruby
class Conversation < ApplicationRecord
  acts_as_chat messages: :chat_messages,    # Association name
               message_class: 'ChatMessage' # Class name if not inferrable
end

class ChatMessage < ApplicationRecord
  acts_as_message chat: :conversation,      # Association name
                  chat_class: 'Conversation' # Class name if not inferrable
end
```
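The new API can infer the class from the association name, which is why `message_class:` is only needed when the name isn't inferrable, as above. A simplified sketch of that inference; the real implementation relies on Rails-style inflection, which also handles irregular plurals:

```ruby
# Naive Rails-style class-name inference from an association name.
# Simplification: real inflectors handle irregular plurals ("people", etc.).
def infer_class_name(association)
  association.to_s
             .sub(/s\z/, "")  # naive singularization
             .split("_")
             .map(&:capitalize)
             .join
end

puts infer_class_name(:chat_messages) # => "ChatMessage"
puts infer_class_name(:tool_calls)    # => "ToolCall"
```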
## Chat UI Generator
{: .d-inline-block }

v1.7.0+
{: .label .label-green }

Add a fully-functional chat UI to your Rails app with Turbo streaming:
```bash
# Default model names
bin/rails generate ruby_llm:chat_ui

# Or with custom model names (same as install generator)
bin/rails generate ruby_llm:chat_ui chat:Conversation message:ChatMessage model:LLMModel
```

The generated chat UI works with your existing Chat and Message models.
## Troubleshooting: Setting `use_new_acts_as` in an Initializer

If you're setting `use_new_acts_as = true` in an initializer (like `config/initializers/ruby_llm.rb`), it won't work. Rails loads models before initializers run, causing various issues.

Symptoms:

- The legacy `acts_as` module gets included even though you set `use_new_acts_as = true`
- `undefined local variable or method 'acts_as_model'` errors during migration
- `lib/ruby_llm/active_record/acts_as_legacy.rb` appears in backtraces

Solution:
Add the configuration to `config/application.rb` before your `Application` class:

```ruby
# config/application.rb
require_relative "boot"
require "rails/all"

# Configure RubyLLM before Rails::Application is inherited
RubyLLM.configure do |config|
  config.use_new_acts_as = true
end

module YourApp
  class Application < Rails::Application
    # ...
  end
end
```

This ensures RubyLLM is configured before ActiveRecord loads your models. Other configuration options (API keys, timeouts, etc.) can still go in your initializer.
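For completeness, a sketch of what can stay in the initializer once `use_new_acts_as` lives in `config/application.rb`. The option names shown are illustrative; check the Configuration guide for the ones your providers need:

```ruby
# config/initializers/ruby_llm.rb
# Provider credentials and timeouts are safe here; only use_new_acts_as
# must be set earlier, in config/application.rb.
RubyLLM.configure do |config|
  config.openai_api_key = ENV["OPENAI_API_KEY"]
  config.request_timeout = 120 # seconds (illustrative value)
end
```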
This limitation exists because both the legacy and new `acts_as` APIs need to coexist during the 1.x series. It will be resolved in RubyLLM 2.0 when the legacy API is removed.
{: .note }
See the [Configuration guide]({% link _getting_started/configuration.md %}#initializer-load-timing-issue-with-use_new_acts_as) for more details.
## Fresh Installs

Fresh installs get the model registry automatically:

```bash
bin/rails generate ruby_llm:install
bin/rails db:migrate

# Optional: Add chat UI
bin/rails generate ruby_llm:chat_ui
```