Eloquent is the productivity engine that makes Laravel pleasant to write. It's also the perf bottleneck once you scale past a few thousand records per request. Here's the optimization toolkit I use, in order of effectiveness.
1. Select Only What You Need
```php
// Bad
$posts = Post::all(); // 30 columns × 10,000 rows

// Good
$posts = Post::select(['id', 'title', 'slug', 'published_at'])->get();
```
For an admin list page that shows only title and date, also pulling body (a TEXT column holding full article content) wastes network transfer and memory — easily ~500ms per request at scale.
2. Use chunk() or cursor() for Large Iterations
```php
// Bad — loads 100k rows into memory
Post::all()->each(fn ($post) => $post->reindex());

// Good — pages through 200 at a time
Post::chunk(200, function ($posts) {
    $posts->each->reindex();
});

// Better for read-only — uses generators, low memory
Post::cursor()->each->reindex();
```
chunk materializes one page of models at a time. cursor hydrates a single model per iteration via a PHP generator, so model memory stays flat regardless of total row count (though with a buffered driver, such as PDO MySQL's default, the raw result set still sits in memory).
The catch with cursor: it holds an open database connection for the duration of the iteration. Don't cursor() over millions of rows in a request that needs to return quickly.
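If you're on Laravel 8 or newer, lazy() is worth knowing as a middle ground: it gives you cursor()'s generator-style API but runs chunked queries behind the scenes, so no single statement stays open for the whole iteration. A sketch:

```php
// Generator ergonomics, chunked queries under the hood (Laravel 8+).
// Each chunk of 200 completes before the next is fetched, so the
// connection isn't pinned to one long-running cursor.
Post::lazy(200)->each(fn ($post) => $post->reindex());
```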
3. chunkById for Mutations
```php
// Bad — chunk() paginates by offset, so rows get skipped
// when your updates remove them from the filtered set
Post::where('status', 'pending')->chunk(200, function ($posts) {
    $posts->each->update(['status' => 'processed']); // skips rows!
});

// Good
Post::where('status', 'pending')->chunkById(200, function ($posts) {
    $posts->each->update(['status' => 'processed']);
});
```
chunkById paginates by primary key (WHERE id > last seen id) instead of by offset, so rows your callback removes from the filtered set can't shift subsequent pages.
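On Laravel 8+, lazyById() pairs the same keyed pagination with a generator-style API, which is convenient when each row needs individual handling rather than a per-chunk callback. A sketch:

```php
// Pages by primary key like chunkById, iterates like cursor.
Post::where('status', 'pending')
    ->lazyById(200)
    ->each(function (Post $post) {
        $post->update(['status' => 'processed']);
    });
```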
4. Compound Where vs Multiple Calls
```php
// Same SQL, two spellings:
Post::where('status', 'published')->where('author_id', $userId)->get();
Post::where(['status' => 'published', 'author_id' => $userId])->get();
```
Both compile to identical SQL. Pick what reads better.
5. Avoid whereHas When Possible
```php
// Slow — correlated subquery, evaluated per candidate row
Post::whereHas('comments', fn ($q) => $q->where('approved', true))->get();

// Fast on most schemas — direct join
Post::join('comments', 'comments.post_id', '=', 'posts.id')
    ->where('comments.approved', true)
    ->select('posts.*')
    ->distinct()
    ->get();
```
For simple existence checks, the join is 5-10× faster. For complex multi-condition checks, whereHas is more readable and the perf difference disappears with proper indexes.
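A middle ground worth benchmarking on your own schema: a hand-written EXISTS via the query builder's whereExists. It keeps existence semantics (no distinct needed) while avoiding whereHas's relation-resolution overhead. A sketch, assuming the same posts/comments tables:

```php
use Illuminate\Support\Facades\DB;

$posts = Post::whereExists(function ($query) {
    // Correlated EXISTS: the database stops at the first match.
    $query->select(DB::raw(1))
        ->from('comments')
        ->whereColumn('comments.post_id', 'posts.id')
        ->where('comments.approved', true);
})->get();
```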
6. Index Your where Columns
```php
// migration
$table->index(['status', 'published_at']);
```
A query like WHERE status = 'published' ORDER BY published_at DESC LIMIT 10 does a full table scan without this index. With it, ~1ms regardless of table size.
Rule of thumb: any column you where on, orderBy on, or join on should have an index. The cost is increased write latency (each insert updates the index); for read-heavy tables it's almost always worth it.
7. pluck() for Single-Column Needs
```php
// Bad — instantiates 10k Post models
$ids = Post::all()->pluck('id');

// Good — one query, no model instantiation
$ids = Post::pluck('id');
```
pluck on a query (not on a collection) compiles to SELECT id FROM posts and returns a flat collection of values, with no models hydrated.
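pluck also takes a second argument that keys the result, which often replaces a keyBy() call or a second query:

```php
// SELECT slug, id FROM posts — a collection keyed by id,
// e.g. [1 => 'first-post', 2 => 'second-post', ...]
$slugsById = Post::pluck('slug', 'id');
```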
8. Subquery Selects for Aggregates
```php
// Slow — loads and hydrates every comment just to count them
Post::with('comments')->get()->each(fn ($post) => $post->comments->count());

// Fast — aggregates in one query
Post::withCount('comments')->get();

// Even faster — no relation, custom subquery
Post::select([
    'id',
    'title',
    DB::raw('(SELECT COUNT(*) FROM comments WHERE comments.post_id = posts.id) AS comments_count'),
])->get();
```
For complex aggregates (max, sum, conditional counts), subquery selects are sometimes faster than withCount because the database can optimize them as a single statement.
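Since Laravel 6 you can express that subquery without DB::raw by passing a builder keyed by alias to select or addSelect. The generated SQL is equivalent, but it stays composable and grammar-aware. A sketch, assuming a Comment model exists:

```php
use App\Models\Comment;

$posts = Post::select(['id', 'title'])
    ->addSelect([
        // Compiles to a correlated subquery aliased as latest_comment_at.
        'latest_comment_at' => Comment::select('created_at')
            ->whereColumn('post_id', 'posts.id')
            ->latest()
            ->limit(1),
    ])
    ->get();
```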
9. Skip Eloquent for Bulk Operations
```php
// Slow — 1,000 INSERT statements, model events fire for each
foreach ($rows as $row) {
    Post::create($row);
}

// Fast — one query, no events
Post::insert($rows); // takes an array of arrays
```
Trade-off: insert() doesn't set created_at/updated_at, doesn't fire model events, and skips attribute casting and mass-assignment protection. Use it for trusted data; for user input, create() is safer.
For mass updates with conditions:
```php
Post::where('status', 'pending')->update(['status' => 'processed']);
```
This is one UPDATE statement. Bypassing the model layer here is the right call.
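For insert-or-update workloads, Laravel 8+ also has upsert(), which compiles to a single statement (INSERT ... ON DUPLICATE KEY UPDATE on MySQL, INSERT ... ON CONFLICT on Postgres). Like insert() it fires no model events, though it does maintain timestamps. A sketch, assuming a unique index on slug:

```php
Post::upsert(
    [
        ['slug' => 'intro', 'title' => 'Intro', 'body' => '...'],
        ['slug' => 'faq',   'title' => 'FAQ',   'body' => '...'],
    ],
    ['slug'],          // column(s) that uniquely identify a row
    ['title', 'body']  // columns to update when the row already exists
);
```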
10. Database Profiler — Trust the Numbers
```php
DB::enableQueryLog();

// ... your code

$queries = DB::getQueryLog();
collect($queries)->sortByDesc('time')->take(10);
```
The slow ones are usually obvious. The medium-slow ones (50-100ms each, called many times) are where the real wins hide. Postgres EXPLAIN ANALYZE on the medium-slow queries reveals seq scans hiding behind your orderBy('created_at') without an index.
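You can get the plan without leaving the app by feeding the builder's compiled SQL to EXPLAIN on the same connection. A sketch for Postgres (on MySQL, use plain EXPLAIN, or EXPLAIN ANALYZE from 8.0.18):

```php
use Illuminate\Support\Facades\DB;

$query = Post::where('status', 'published')
    ->orderBy('created_at', 'desc')
    ->limit(10);

// Note: ANALYZE actually executes the query, so use it on SELECTs only.
$plan = DB::select(
    'EXPLAIN ANALYZE ' . $query->toSql(),
    $query->getBindings()
);
// A "Seq Scan" line in the output means the orderBy column needs an index.
```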
What I Don't Optimize
Premature optimization is real. I don't optimize:
- Admin pages with <100 rows on the page
- One-shot reports that run weekly via a queue
- Endpoints called <10 times per day
I optimize:
- The homepage
- The search endpoint
- Anything in the request path of >1000 daily users
- The slow query log's top 10
That's roughly a 10× concentration of effort in exchange for ~80% of the user-visible performance gain.
Al Amin Ahamed
Senior software engineer & AI practitioner. Laravel, PHP, WordPress plugins, WooCommerce extensions.