How to Process 1 Million Rows in Laravel Without Crashing Your Server

Source: DEV Community
The Out of Memory Exception

Every SaaS application eventually needs a massive data export or import feature. A client asks to download a CSV of their entire transaction history, or they upload a massive Excel file to update their inventory. The junior developer writes a simple query: Transaction::all(). On a local machine with 50 rows, it works instantly. In production with 1,000,000 rows, it attempts to load all one million Eloquent models into the server's RAM at the same time. The server instantly runs out of memory, throws a fatal 500 error, and crashes your application for everyone.

Stop Using all() and get() for Heavy Jobs

When dealing with massive datasets, you must architect your code to respect your server's hardware limits. You cannot hold the entire ocean in a bucket; you have to process the data in streams. Laravel provides two architectural tools to handle this: chunking and cursors.

The Architectural Fix: Cursors and Lazy Collections

If you need to iterate
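As a sketch of the two approaches, here is roughly how chunking, cursors, and lazy collections look on a hypothetical Transaction model (the model name and column names are assumptions for illustration; chunkById(), cursor(), and lazy() are real Eloquent methods):

```php
<?php

use App\Models\Transaction; // hypothetical model from the example above

// BAD: hydrates all 1,000,000 Eloquent models into RAM at once.
// $transactions = Transaction::all();

// Chunking: fetches 1,000 rows per query; memory usage stays flat
// because each chunk is released before the next one is loaded.
Transaction::chunkById(1000, function ($transactions) {
    foreach ($transactions as $transaction) {
        // process one row, e.g. write it to a streamed CSV
    }
});

// Cursor: a single query, but rows are hydrated one model at a
// time through a PHP generator, so only one model is in memory.
foreach (Transaction::cursor() as $transaction) {
    // process one row
}

// lazy(): chunked queries under the hood, exposed as a
// LazyCollection so normal collection methods still work.
Transaction::lazy(1000)->each(function ($transaction) {
    // process one row
});
```

Note the trade-off: cursor() issues one query and is lightest on the database, while chunkById() and lazy() issue many small queries but play more nicely with operations that modify the rows being iterated.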