Your Rails app is growing. Users are signing up. Data is accumulating.
And now things are getting slow.
The culprit is almost always the database. Not Rails. Not your server. The database.
Here's how to find slow queries, fix them, and keep your app fast as you scale.
The Problem: Why Databases Get Slow
Databases are incredibly fast at small scale. With 100 users and 10,000 records, everything runs instantly.
But databases don't scale linearly. At some point, the amount of data and complexity of queries exceeds what the database can handle efficiently.
What Actually Happens
| Stage | Symptoms | Cause |
|---|---|---|
| Early | Everything fast | Small data, simple queries |
| Growth | Some pages slow | Missing indexes, N+1 queries |
| Scaling | Many pages slow | Complex joins, large tables |
| Crisis | Timeouts everywhere | Full table scans, lock contention |
The good news: Most performance problems can be fixed with a few targeted changes.
Part 1: Finding Slow Queries
You can't fix what you can't measure.
Enable Query Logging in Development
# config/environments/development.rb
Rails.application.configure do
# Log all SQL queries
config.active_record.logger = Logger.new(STDOUT)
# Show which line of application code triggered each query
config.active_record.verbose_query_logs = true
end
Now watch your logs. Look for lines that take a long time.
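If you'd rather not eyeball every line, a quick filter over the log works. This is a rough sketch that assumes the default ActiveRecord log format, where each query line carries its duration, like `Order Load (612.3ms) SELECT ...`:

```shell
# Pull out any query that took 100ms or more (three or more digits
# before the decimal point in the duration)
grep -E '\([0-9]{3,}\.[0-9]+ms\)' log/development.log
```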
Using the Rails Console
rails console
# ActiveRecord prints each query's execution time in the console log
User.joins(:orders).where(orders: { status: "completed" }).count
# => (35.2ms) SELECT COUNT(*) ...
The Bullet Gem for N+1 Detection
# Gemfile
group :development, :test do
gem "bullet"
end
bundle install
# config/environments/development.rb
config.after_initialize do
Bullet.enable = true
Bullet.alert = true
Bullet.rails_logger = true
Bullet.add_footer = true
end
Bullet will warn you when you have:
- N+1 queries
- Unused eager loading
- Missing counter caches
Using the Database Itself
For PostgreSQL, run:
EXPLAIN ANALYZE
SELECT users.*, orders.*
FROM users
LEFT JOIN orders ON orders.user_id = users.id
WHERE users.created_at > '2024-01-01';
This shows you exactly where the database is spending time.
Part 2: Fixing N+1 Queries
N+1 is the most common Rails performance problem.
What N+1 Looks Like
# Bad - makes 1 + 100 queries
users = User.limit(100)
users.each do |user|
puts user.orders.count # Hits database for each user
end
# Good - 2 queries total
users = User.includes(:orders).limit(100)
users.each do |user|
puts user.orders.count # Already loaded
end
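To make the query math concrete, here is a toy store (no Rails, no database) that simply counts how many lookups each access pattern issues; the `Store` class and its methods are illustrative only:

```ruby
# In-memory stand-in for a database that counts "queries"
class Store
  attr_reader :query_count

  def initialize
    @orders = { 1 => %w[a b], 2 => %w[c], 3 => [] }
    @query_count = 0
  end

  def user_ids              # one query for the user list
    @query_count += 1
    @orders.keys
  end

  def orders_for(id)        # one query per user: the N in N+1
    @query_count += 1
    @orders[id]
  end

  def orders_for_all(ids)   # one batched query, like includes
    @query_count += 1
    @orders.slice(*ids)
  end
end

naive = Store.new
naive.user_ids.each { |id| naive.orders_for(id) }
puts naive.query_count # => 4 queries: 1 + N, with N = 3 users

eager = Store.new
eager.orders_for_all(eager.user_ids)
puts eager.query_count # => 2 queries, no matter how many users
```

The naive pattern grows linearly with the number of users; the eager pattern stays at two queries.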
The includes Method
# Single association
User.includes(:profile).all
# Multiple associations
User.includes(:orders, :profile).all
# Nested associations
User.includes(orders: :line_items).all
The references Method with Conditions
# When you need to filter on included associations
User.includes(:orders).where(orders: { status: "completed" }).references(:orders)
Eager Loading vs Lazy Loading
| Method | Loads Association | When |
|---|---|---|
| `includes` | Optional: separate queries or a single LEFT JOIN | Best for most cases |
| `preload` | Always separate queries | When you always need the data |
| `eager_load` | Single LEFT JOIN | When you need to filter on the association |
| `joins` | No loading | When you only need to filter, not access |
Real-World Fix Example
# Before: 1 query for projects, plus 1 per project for tasks, plus 1 per task for assignees
class DashboardController < ApplicationController
def index
@projects = current_user.projects
@projects.each do |project|
project.tasks.each do |task|
task.assignee.name # N+1 for each task's assignee
end
end
end
end
# After: 3 queries
def index
@projects = current_user.projects.includes(tasks: :assignee)
# All data is already loaded, no more queries
end
Part 3: Adding Indexes
Indexes are the single biggest performance win for databases.
What Indexes Do
Without an index, the database scans every row. With an index, it jumps directly to the relevant rows.
-- Without index: scans all 1,000,000 rows
SELECT * FROM users WHERE email = 'alice@example.com';
-- With index: finds the row instantly
CREATE INDEX index_users_on_email ON users(email);
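The difference is easy to feel even in plain Ruby: a sequential scan touches elements one by one, while a lookup against sorted data (the shape of a b-tree index) halves the search space at each step. A toy contrast, illustrative only:

```ruby
emails = (1..100_000).map { |i| "user#{i}@example.com" }
target = "user99999@example.com"

# Sequential scan: checks elements one by one, like a full table scan
scan_result = emails.find { |e| e == target }

# Index-style lookup: binary search over sorted data, ~17 comparisons here
sorted = emails.sort
index_result = sorted.bsearch { |e| target <=> e }

puts scan_result == index_result # => true
```

Both find the same row; the index-style lookup just does it in logarithmic rather than linear time.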
When to Add Indexes
Add indexes on columns that appear in:
| Clause | Example |
|---|---|
| `WHERE` | `User.where(email: "...")` |
| `ORDER BY` | `User.order(created_at: :desc)` |
| `JOIN` | `User.joins(:orders)` |
| `GROUP BY` | `User.group(:status)` |
How to Create Indexes
# db/migrate/xxxxxx_add_indexes_to_users.rb
class AddIndexesToUsers < ActiveRecord::Migration[7.0]
def change
# Simple index
add_index :users, :email
# Unique index
add_index :users, :email, unique: true
# Compound index (multiple columns)
add_index :orders, [:user_id, :created_at]
# Index with order
add_index :orders, :created_at, order: { created_at: :desc }
# Partial index (only index rows that match condition)
add_index :orders, :user_id, where: "status = 'pending'"
end
end
Compound Index Order Matters
# Good for queries: WHERE user_id = ? AND created_at > ?
add_index :orders, [:user_id, :created_at]
# Bad for same query (PostgreSQL won't use it efficiently)
add_index :orders, [:created_at, :user_id]
Rule: Put equality columns first, range columns last.
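A miniature model of why the rule holds: a compound index keeps rows sorted by its column tuple, so equality on the leading column pins down one contiguous run, and the range condition only scans inside it. The data below is illustrative:

```ruby
# Rows sorted by (user_id, created_at), like the b-tree for
# add_index :orders, [:user_id, :created_at]
index = [[1, 10], [1, 20], [2, 5], [2, 15], [2, 30], [3, 1]]

# WHERE user_id = 2 AND created_at > 6:
run     = index.select { |uid, _| uid == 2 } # one contiguous slice
matches = run.select { |_, ts| ts > 6 }
p matches # => [[2, 15], [2, 30]]

# Sorted by (created_at, user_id) instead, the user_id = 2 rows are
# scattered through the whole ordering, so far more entries get visited.
```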
Finding Missing Indexes
-- PostgreSQL: inspect column statistics to judge which columns are
-- selective enough to be worth indexing
SELECT
schemaname,
tablename,
attname,
n_distinct,
correlation
FROM pg_stats
WHERE tablename = 'orders'
ORDER BY n_distinct DESC;
Part 4: Query Optimization
Sometimes indexes aren't enough. You need to rewrite the query.
Select Only What You Need
# Bad: loads all columns
User.find(1).orders
# Good: loads only what you need
User.find(1).orders.select(:id, :total, :created_at)
Count vs Size vs Length
| Method | Behavior | When to Use |
|---|---|---|
| `count` | Always queries database | When you need an accurate count |
| `size` | Uses cached count if available | In most cases |
| `length` | Loads all records, then counts | When records are already loaded |
# Bad
user.orders.count # Always queries even if orders loaded
# Better
user.orders.size # Uses cached count if available
Using pluck Instead of Map
# Bad: loads entire User objects
User.all.map(&:email) # Loads all columns
# Good: loads only email column
User.pluck(:email) # Loads only emails
Using exists? Instead of present?
# Bad: loads all records
User.where(active: true).present?
# Good: stops at first match
User.where(active: true).exists?
Batch Processing Large Datasets
# Bad: loads all 100,000 users into memory
User.all.each do |user|
user.send_welcome_email
end
# Good: processes in batches of 1000
User.find_each(batch_size: 1000) do |user|
user.send_welcome_email
end
# With order
User.order(created_at: :asc).find_each do |user|
# ...
end
Part 5: Counter Caches
Need to show count of associated records? Use counter caches.
Without Counter Cache
# Each time you call this, it queries the database
user.orders.count # SELECT COUNT(*) FROM orders WHERE user_id = 1
With Counter Cache
# Add column to users table
class AddOrdersCountToUsers < ActiveRecord::Migration[7.0]
def change
add_column :users, :orders_count, :integer, default: 0
# Backfill existing data (requires the counter_cache declaration below)
User.find_each do |user|
User.reset_counters(user.id, :orders)
end
end
end
class Order < ApplicationRecord
belongs_to :user, counter_cache: true
end
Now user.orders_count returns instantly without querying the database.
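Under the hood the mechanism is simple; here it is sketched without Rails, with plain hashes standing in for the two tables (the names are illustrative):

```ruby
users  = { 1 => { orders_count: 0 } }
orders = []

# What counter_cache: true effectively does on create: insert the
# child row, then bump the stored integer on the parent
create_order = lambda do |user_id|
  orders << { user_id: user_id }
  users[user_id][:orders_count] += 1
end

create_order.call(1)
create_order.call(1)

# Reading the count is now a column read, not a COUNT(*) query
puts users[1][:orders_count] # => 2
```

The trade-off: writes do a little extra work so that reads do almost none.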
Part 6: Database Views and Materialized Views
For complex queries that run frequently, use database views.
Regular View (Always Fresh, Always Slow)
class CreateActiveUsersView < ActiveRecord::Migration[7.0]
def up
execute <<-SQL
CREATE VIEW active_users AS
SELECT users.*, COUNT(orders.id) as order_count
FROM users
LEFT JOIN orders ON orders.user_id = users.id
WHERE users.last_active_at > NOW() - INTERVAL '30 days'
GROUP BY users.id
SQL
end
def down
execute "DROP VIEW active_users"
end
end
# app/models/active_user.rb
class ActiveUser < ApplicationRecord
self.table_name = "active_users"
def readonly?
true
end
end
# Use it like a regular model
ActiveUser.where(order_count: 0)
Materialized View (Cached, Fast, Needs Refresh)
class CreateDailySalesReport < ActiveRecord::Migration[7.0]
def up
execute <<-SQL
CREATE MATERIALIZED VIEW daily_sales AS
SELECT
DATE(created_at) as sale_date,
COUNT(*) as order_count,
SUM(total_cents) as total_cents
FROM orders
GROUP BY DATE(created_at)
SQL
add_index :daily_sales, :sale_date, unique: true
end
def down
execute "DROP MATERIALIZED VIEW daily_sales"
end
end
Refresh it:
# Refresh manually
ActiveRecord::Base.connection.execute("REFRESH MATERIALIZED VIEW daily_sales")
# Or use a gem like scenic for easier management
Part 7: Partitioning Large Tables
When tables grow to millions of rows, consider partitioning.
Why Partition
| Without Partition | With Partition |
|---|---|
| All rows in one table | Rows split by date or key |
| Deleting old data is slow | Dropping a partition is instant |
| Indexes are huge | Each partition has smaller indexes |
| Query scans all data | Query scans relevant partitions only |
PostgreSQL Range Partitioning
class CreatePartitionedOrders < ActiveRecord::Migration[7.0]
  def up
    # Rails has no built-in DSL for declarative partitioning, so use raw SQL.
    # Note: any primary key on a partitioned table must include the partition key.
    execute <<-SQL
      CREATE TABLE orders (
        id bigserial,
        user_id bigint,
        total_cents integer,
        created_at timestamp NOT NULL
      ) PARTITION BY RANGE (created_at);
    SQL

    # Create monthly partitions
    execute <<-SQL
      CREATE TABLE orders_2024_01 PARTITION OF orders
        FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');
    SQL

    execute <<-SQL
      CREATE TABLE orders_2024_02 PARTITION OF orders
        FOR VALUES FROM ('2024-02-01') TO ('2024-03-01');
    SQL
  end

  def down
    execute "DROP TABLE orders"
  end
end
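What the database does with each incoming row can be sketched in a few lines: find the partition whose range covers the partition key and store the row there. Purely illustrative, no PostgreSQL involved:

```ruby
require "date"

# Each partition owns a half-open date range, as in the DDL above
PARTITIONS = {
  "orders_2024_01" => (Date.new(2024, 1, 1)...Date.new(2024, 2, 1)),
  "orders_2024_02" => (Date.new(2024, 2, 1)...Date.new(2024, 3, 1))
}.freeze

def partition_for(created_at)
  PARTITIONS.find { |_, range| range.cover?(created_at) }&.first
end

puts partition_for(Date.new(2024, 1, 15)) # => orders_2024_01
puts partition_for(Date.new(2024, 2, 1))  # => orders_2024_02

# Dropping a month of data is just DROP TABLE orders_2024_01,
# no row-by-row DELETE required
```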
Gem for Easier Partitioning
# Gemfile
gem "pg_partition_manager"
# config/initializers/partition_manager.rb
# The exact configuration API varies by gem and version; check the
# gem's README. The shape is typically along these lines:
PgPartitionManager.configure do |config|
  config.partition_tables = {
    orders: {
      type: :range,
      column: :created_at,
      interval: :monthly,
      retain_for: 12.months
    }
  }
end
Part 8: Read Replicas
When your database can't keep up with read traffic, add a replica.
Configuration
# config/database.yml
production:
primary:
database: myapp_production
username: myapp
password: <%= ENV["DATABASE_PASSWORD"] %>
host: primary-db.example.com
replica:
database: myapp_production
username: myapp
password: <%= ENV["DATABASE_PASSWORD"] %>
host: replica-db.example.com
replica: true
Using Read Replicas
# Route reads explicitly (fully automatic role switching needs additional config)
ActiveRecord::Base.connected_to(role: :reading) do
# All queries here use the replica
User.where(active: true).count
Order.completed.last_30_days
end
# Or in controllers
class ReportsController < ApplicationController
def index
ActiveRecord::Base.connected_to(role: :reading) do
@report = ReportGenerator.generate
end
end
end
Which Queries Should Use Replicas
| Use Replica | Use Primary |
|---|---|
| Analytics queries | User writes (updates) |
| Reports | Critical reads (must be exact) |
| Search | After successful writes |
| Dashboard summaries | Anything real-time |
| Admin panels | Anything with immediate feedback |
Part 9: Real-World Optimization Examples
Example 1: Slow Dashboard
Problem: Dashboard loads in 8 seconds
The query:
@stats = {
total_users: User.count,
active_orders: Order.where(status: "active").count,
revenue: Order.sum(:total_cents),
recent_users: User.where("created_at > ?", 7.days.ago).order(created_at: :desc).limit(10)
}
Analysis: Four separate queries; the status count and the revenue sum each scan the entire orders table
Solution:
# Add indexes
add_index :orders, :status
add_index :users, :created_at
# Use a counter cache or cached value for the total user count
# Cache the revenue figure; an index on total_cents can't speed up a SUM over every row
Result: 8 seconds → 200ms
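Caching dashboard stats boils down to: compute once, reuse until stale. A sketch with a plain hash; in a real app you would reach for `Rails.cache.fetch` with `expires_in` instead of this hypothetical helper:

```ruby
CACHE = {}

# Return the cached value if it is younger than ttl seconds,
# otherwise recompute via the block and store the result
def cached(key, ttl: 300)
  entry = CACHE[key]
  return entry[:value] if entry && Time.now - entry[:at] < ttl

  value = yield
  CACHE[key] = { value: value, at: Time.now }
  value
end

calls = 0
2.times { cached(:total_users) { calls += 1; 42 } }
puts calls # => 1, the second read came from the cache
```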
Example 2: Slow Search
Problem: Searching products takes 5 seconds
The query:
Product.where("name LIKE ? OR description LIKE ?", "%#{query}%", "%#{query}%")
Analysis: Full table scan on both columns
Solution:
# Use PostgreSQL full-text search (the 'english' config must match the index below)
Product.where("to_tsvector('english', name || ' ' || description) @@ plainto_tsquery('english', ?)", query)
# Add a GIN index so the match uses the index instead of a scan
execute "CREATE INDEX products_search_idx ON products USING gin(to_tsvector('english', name || ' ' || description))"
Result: 5 seconds → 50ms
Example 3: Slow Export
Problem: Exporting 100,000 orders times out
The code:
Order.all.each do |order|
CSV << [order.id, order.user.email, order.total]
end
Analysis: Loads all orders, N+1 for user email
Solution:
Order.includes(:user).find_each(batch_size: 1000) do |order|
CSV << [order.id, order.user.email, order.total]
end
Result: Times out → completes in 30 seconds
Part 10: Monitoring in Production
You can't optimize what you don't measure.
Key Metrics to Track
| Metric | What It Tells You | Alert Threshold |
|---|---|---|
| Query time | How long queries take | > 200ms |
| Query count per page | How many queries | > 20 |
| Active connections | DB load | > 80% of pool |
| Slow query log | Long-running queries | > 500ms |
| Lock contention | Concurrent write issues | > 10 locks waiting |
Adding Query Comments
# Rails 7's query log tags append a SQL comment to every query
# identifying the controller and action that issued it
# config/application.rb
config.active_record.query_log_tags_enabled = true
config.active_record.query_log_tags = [:controller, :action]
Using Scout or New Relic
# Gemfile
gem "scout_apm"
These tools show you:
- Slowest queries
- N+1 detection
- Database call traces
- Performance over time
Simple SQL Logging
# config/initializers/sql_logging.rb
ActiveSupport::Notifications.subscribe("sql.active_record") do |*args|
event = ActiveSupport::Notifications::Event.new(*args)
if event.duration > 500
Rails.logger.warn("SLOW SQL: #{event.duration}ms - #{event.payload[:sql]}")
end
end
The Optimization Checklist
Use this checklist as your app grows.
At 1,000 Users
- [ ] Run Bullet to find N+1 queries
- [ ] Add indexes for all WHERE columns
- [ ] Use `includes` for associations

At 10,000 Users
- [ ] Add counter caches for counts
- [ ] Use `pluck` instead of `map`
- [ ] Batch process with `find_each`
At 100,000 Users
- [ ] Add read replicas for reporting
- [ ] Implement query caching
- [ ] Partition largest tables
At 1,000,000 Users
- [ ] Use materialized views for complex reports
- [ ] Implement database sharding
- [ ] Consider NoSQL for specific use cases
Summary
Database optimization is not magic. It's systematic.
| Problem | Fix |
|---|---|
| N+1 queries | includes, preload, eager_load |
| Missing indexes | Add indexes on WHERE, ORDER BY, JOIN columns |
| Too many columns | Use select, pluck |
| Large datasets | Batch with find_each |
| Frequent counts | Counter caches |
| Complex queries | Database views |
| Huge tables | Partitioning |
| Read traffic | Read replicas |
Quick Wins (Do These First)
- Install Bullet gem and fix every N+1
- Add indexes for all foreign keys
- Add indexes for columns you `WHERE` or `ORDER BY`
- Use `includes` in every controller
- Replace `count` with `size` where possible
Your database can handle a lot. But you have to help it.
Find the slow queries. Add the right index. Rewrite the bad query.
Your users will thank you.