ActiveRecord to Raw SQL—When ORM Costs 80% Performance in Production
Real migration story: complex joins that forced us back to find_by_sql after Rails queries hit 2.8s on 50K records
I spent three weeks converting raw SQL queries to “proper” ActiveRecord code. The test suite went green. Code review approved. We shipped to production. Within 48 hours, our analytics dashboard was timing out. What used to take 340ms was now hitting 2.8 seconds on the same 50K record dataset. Here’s when to use raw SQL instead of ActiveRecord—and how I learned that sometimes the Rails way isn’t the right way.
The Performance Cliff I Drove Off
Our dashboard query joined four tables with conditional aggregations. The raw SQL version ran clean:
SQL version:

```sql
SELECT accounts.id, accounts.name,
       COUNT(DISTINCT subscriptions.id) as subscription_count,
       SUM(CASE WHEN invoices.status = 'paid' THEN invoices.amount_cents ELSE 0 END) as revenue,
       MAX(orders.created_at) as last_order_date
FROM accounts
LEFT JOIN subscriptions ON subscriptions.account_id = accounts.id
LEFT JOIN invoices ON invoices.subscription_id = subscriptions.id
LEFT JOIN orders ON orders.account_id = accounts.id
WHERE accounts.created_at > '2024-01-01'
GROUP BY accounts.id, accounts.name
HAVING COUNT(DISTINCT subscriptions.id) > 0
ORDER BY revenue DESC
LIMIT 100
```
ActiveRecord version (my mistake):

```ruby
Account
  .joins(:subscriptions)
  .left_joins(:invoices, :orders)
  .where('accounts.created_at > ?', Date.new(2024, 1, 1))
  .select('accounts.*, COUNT(DISTINCT subscriptions.id) as subscription_count')
  .group('accounts.id')
  .having('COUNT(DISTINCT subscriptions.id) > 0')
  .order('SUM(CASE WHEN invoices.status = ? THEN invoices.amount_cents ELSE 0 END) DESC', 'paid')
  .limit(100)
```
What broke:

- Query time jumped from 340ms to 2.8s on 50K accounts
- ActiveRecord generated a Cartesian product with duplicate rows
- The `order` clause didn’t work with `select` aggregations (Rails couldn’t alias properly)
- Memory usage spiked from 180MB to 920MB per request
Why it broke: ActiveRecord isn’t designed for complex analytics queries with multiple conditional aggregations. It tried to be clever with the joins, but the query planner made terrible decisions. The ORM abstraction was fighting the database optimizer.
When Raw SQL Beats ActiveRecord Every Time
After profiling with rack-mini-profiler, I found four scenarios where raw SQL is the only answer:
1. Multi-table aggregations with conditional logic
If you’re doing SUM(CASE WHEN...) or COUNT(DISTINCT) across 3+ tables, raw SQL wins. ActiveRecord can’t optimize these paths.
2. Complex subqueries in JOIN conditions
```sql
-- This pattern breaks ActiveRecord's query builder
SELECT accounts.*
FROM accounts
INNER JOIN (
  SELECT account_id, MAX(created_at) as latest
  FROM orders
  GROUP BY account_id
) latest_orders ON latest_orders.account_id = accounts.id
WHERE latest_orders.latest > NOW() - INTERVAL '30 days'
```
ActiveRecord has no clean way to express this. You’d need three separate queries and merge in Ruby—obliterating performance.
3. Window functions for ranking or running totals
```sql
SELECT invoices.*,
       SUM(amount_cents) OVER (PARTITION BY account_id ORDER BY created_at) as running_total,
       ROW_NUMBER() OVER (PARTITION BY account_id ORDER BY created_at DESC) as invoice_rank
FROM invoices
```
Rails 7.1 doesn’t support window functions in the query interface. You need find_by_sql or Arel (which is just raw SQL with extra steps).
4. Database-specific performance features
PostgreSQL’s DISTINCT ON, MySQL’s GROUP_CONCAT, or database-specific index hints. ActiveRecord strips these out or generates incorrect syntax.
Production impact: On our 50K account base, switching these four query patterns to raw SQL cut dashboard load time from 4.2s to 680ms. Memory usage dropped 73%. That’s not premature optimization—that’s the difference between users waiting and users leaving.
The find_by_sql Pattern That Actually Works
Here’s the pragmatic approach I landed on after the production fire:
```ruby
# app/models/account.rb
class Account < ApplicationRecord
  def self.dashboard_stats(since_date)
    sql = <<~SQL
      SELECT accounts.id, accounts.name,
             COUNT(DISTINCT subscriptions.id) as subscription_count,
             SUM(CASE WHEN invoices.status = 'paid'
                 THEN invoices.amount_cents ELSE 0 END) as revenue_cents,
             MAX(orders.created_at) as last_order_date
      FROM accounts
      LEFT JOIN subscriptions ON subscriptions.account_id = accounts.id
      LEFT JOIN invoices ON invoices.subscription_id = subscriptions.id
      LEFT JOIN orders ON orders.account_id = accounts.id
      WHERE accounts.created_at > ?
      GROUP BY accounts.id, accounts.name
      HAVING COUNT(DISTINCT subscriptions.id) > 0
      ORDER BY revenue_cents DESC
      LIMIT 100
    SQL

    # Array form sanitizes the bind parameter, which is critical for raw SQL
    find_by_sql([sql, since_date])
  end
end
```
Why this works:

- `find_by_sql` returns ActiveRecord objects, not hashes
- The array syntax sanitizes parameters (prevents SQL injection)
- The query is explicit—no hidden N+1 traps
- The database optimizer sees the full query plan, not ActiveRecord’s generated mess
Watch out:
The result objects behave like normal records, so `account.update(...)` works, but associations like `account.subscriptions` are not eager loaded: touching them triggers new queries. Explicitly preload any associations you need.
When NOT to use find_by_sql:

- Simple WHERE clauses (`Account.where(status: 'active')` is fine)
- Single-table queries with basic joins
- Anything that fits in 2-3 chained ActiveRecord methods
- Queries you’ll need to dynamically modify (adding optional filters)
The rule: if you can’t write it cleanly in ActiveRecord in under 5 lines, raw SQL is probably clearer and faster.
The Hybrid Approach for Complex Filters
Sometimes you need raw SQL performance with dynamic filters. Here’s the pattern:
```ruby
def self.filtered_stats(filters = {})
  values = [filters[:since] || 30.days.ago]

  # Baseline HAVING condition, with optional filters appended dynamically
  having = ["COUNT(DISTINCT subscriptions.id) > 0"]
  if filters[:min_revenue]
    having << "SUM(CASE WHEN invoices.status = 'paid' THEN invoices.amount_cents ELSE 0 END) >= ?"
    values << filters[:min_revenue]
  end

  sql = <<~SQL
    SELECT accounts.id, accounts.name,
           COUNT(DISTINCT subscriptions.id) as subscription_count,
           SUM(CASE WHEN invoices.status = 'paid'
               THEN invoices.amount_cents ELSE 0 END) as revenue_cents
    FROM accounts
    LEFT JOIN subscriptions ON subscriptions.account_id = accounts.id
    LEFT JOIN invoices ON invoices.subscription_id = subscriptions.id
    WHERE accounts.created_at > ?
    GROUP BY accounts.id, accounts.name
    HAVING #{having.join(' AND ')}
    ORDER BY revenue_cents DESC
    LIMIT 100
  SQL

  # Bind values in placeholder order: WHERE first, then HAVING
  find_by_sql([sql] + values)
end
```
Trade-offs:
- ✅ Pros: Full SQL control with dynamic filters, sanitized inputs, stays DRY
- ❌ Cons: String concatenation is error-prone, harder to test than scopes, loses ActiveRecord’s chainability
I use this for admin dashboards where performance matters more than code elegance. For user-facing features, I stick with scopes and accept the 20% performance hit.
Measuring Before You Migrate
Before rewriting anything, profile it. I learned this the expensive way.
Use rack-mini-profiler in development:

```ruby
# Gemfile
gem 'rack-mini-profiler', group: :development
```

```ruby
# config/environments/development.rb
config.after_initialize do
  Rack::MiniProfiler.config.position = 'bottom-right'
end
```
Load your page. You’ll see query times in the browser corner. Anything over 100ms deserves inspection.
Use Bullet gem to catch N+1 queries:
```ruby
# Gemfile
gem 'bullet', group: :development
```

```ruby
# config/environments/development.rb
config.after_initialize do
  Bullet.enable = true
  Bullet.alert = true
  Bullet.rails_logger = true
end
```
Bullet yells at you when eager loading would help. It saved me from shipping 847 queries on a dashboard that should’ve been 3.
Check PostgreSQL’s query planner:
```sql
EXPLAIN ANALYZE
SELECT accounts.*...
```
If you see “Seq Scan” on a large table or “Hash Join” estimates way off, your indexes are wrong or ActiveRecord generated bad SQL.
Production numbers that made me switch:
- ActiveRecord query: 2.8s average, 4.1s P95
- Raw SQL version: 340ms average, 520ms P95
- Same 50K records, same indexes, same server
That 8x improvement paid for the technical debt of maintaining raw SQL.
When ActiveRecord Is Still the Right Choice
Don’t cargo cult raw SQL everywhere. ActiveRecord wins when:
1. Simple CRUD operations
```ruby
# This is perfect—don't replace it
Account.where(status: 'active').order(created_at: :desc).limit(20)
```
2. Queries you’ll need to chain dynamically
```ruby
# Scopes compose beautifully
scope :active, -> { where(status: 'active') }
scope :trial,  -> { where('trial_ends_at > ?', Time.current) }
scope :recent, -> { where('created_at > ?', 30.days.ago) }

# Chain based on user filters
accounts = Account.active
accounts = accounts.trial if params[:trial_only]
accounts = accounts.recent if params[:recent]
```
Raw SQL can’t do this without string concatenation madness.
3. When you need writable objects with associations
```ruby
accounts = Account.includes(:subscriptions, :invoices).where(...)

accounts.each do |account|
  account.subscriptions.create!(plan: 'pro') # subscriptions already loaded by includes, no N+1 reads
end
```
With find_by_sql, associations aren’t loaded. You’d trigger N+1 queries or need separate fetches.
4. Database-agnostic code

If you’re supporting PostgreSQL and MySQL (why?), ActiveRecord abstracts the differences. Raw SQL means maintaining two versions.
My rule:
Start with ActiveRecord. Profile. If a query hits 200ms or causes N+1 problems that includes can’t fix, consider raw SQL. Don’t optimize queries that run once per hour.
The Lessons I Learned the Hard Way
Lesson 1: Code elegance doesn’t matter if users wait 3 seconds
I was proud of that pure-ActiveRecord analytics query. It read like English. It also timed out on production data. Pragmatism beats purity when performance matters.
Lesson 2: find_by_sql isn’t giving up—it’s using the right tool
For 18 months I avoided raw SQL because “it wasn’t the Rails way.” The Rails way is solving problems, not avoiding database features that make your app fast.
Lesson 3: Profile before you refactor
I converted 14 queries to ActiveRecord. Only 2 actually needed it. The rest got slower for no reason. Measure first, optimize second.
Lesson 4: Raw SQL needs more tests
ActiveRecord scopes break loudly when you change associations. Raw SQL breaks silently when you rename columns. I now require integration tests for any find_by_sql method—unit tests don’t catch schema drift.
Tools that caught my mistakes:
- PgHero (showed slow queries in production)
- rack-mini-profiler (exposed the 2.8s queries)
- Bullet (found N+1s I introduced while “fixing” things)
Final Thoughts
Reach for raw SQL when profiling makes the need obvious: complex aggregations, subqueries in joins, window functions, or database-specific features that ActiveRecord can’t express. Stick with ActiveRecord for simple queries, dynamic scopes, and anything that needs to chain or load associations.
The 80% performance drop I caused taught me this: there’s no shame in dropping to SQL when the ORM fights the database. Profile first, optimize what hurts, and don’t refactor for elegance when users are waiting.
Drop your SQL-vs-ActiveRecord war stories in the comments. What’s the gnarliest query you had to convert—either direction?
How We Verify Conversions
Every conversion shown on this site follows a strict verification process to ensure correctness:
- Compare results on same dataset — We run both SQL and ActiveRecord against identical test data and verify results match
- Check generated SQL with to_sql — We inspect the actual SQL Rails generates to catch semantic differences (INNER vs LEFT JOIN, WHERE vs ON, etc.)
- Add regression tests for tricky cases — Edge cases like NOT EXISTS, anti-joins, and predicate placement are tested with multiple scenarios
- Tested on Rails 8.1.1 — All conversions verified on current Rails version to ensure compatibility
Last updated: March 07, 2026
Raza Hussain
Full-stack developer specializing in Ruby on Rails, React, and modern JavaScript. 15+ years upgrading and maintaining production Rails apps. Led Rails 4/5 → 7 upgrades with 40% performance gains, migrated apps from Heroku to Render cutting costs by 35%, and built systems for StatusGator, CryptoZombies, and others. Available for Rails upgrades, performance work, and cloud migrations.