On his Medium.com site, Muhammad Zamroni has posted a quick tutorial showing how to [stream CSV data](https://medium.com/@matriphe/streaming-csv-using-php-46c717e33d87) from a Laravel application back to the waiting client.
In one of our applications, there's a need to get a list of data from a service. This data is used to generate a report by a background process, resulting in an XLSX file that can be downloaded or attached to an email. This service (an API) is written in Laravel. The background process is also written in Laravel. Both services are hosted on different servers.
We pull data from MySQL and simply send the response as JSON to be easily parsed. The problem we encountered was the amount of data.
The main issue was the memory required to pull in all of the data and work with it. Based on suggestions from another article, he decided to switch the output from JSON to CSV and use Laravel's chunk handling to process pieces of data at a time. He includes the code for the controller, which shows the use of chunk and a manual file pointer to push the data into the response 1,000 records at a time.
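To illustrate the general shape of that approach, here is a minimal sketch of a streaming CSV controller. The `User` model, its columns, and the controller name are hypothetical stand-ins, not taken from the original tutorial; the sketch combines Laravel's `response()->stream()` helper, a file pointer opened on `php://output`, and Eloquent's `chunk()` to emit 1,000 records per query.

```php
<?php

namespace App\Http\Controllers;

use App\Models\User; // hypothetical model used for illustration
use Symfony\Component\HttpFoundation\StreamedResponse;

class UserCsvController extends Controller
{
    public function export(): StreamedResponse
    {
        $headers = [
            'Content-Type'        => 'text/csv',
            'Content-Disposition' => 'attachment; filename="users.csv"',
        ];

        return response()->stream(function () {
            // Write straight to the output stream so each row is sent
            // to the client instead of being buffered in memory.
            $handle = fopen('php://output', 'w');

            // Header row.
            fputcsv($handle, ['id', 'name', 'email']);

            // chunk() fetches 1,000 rows per query, keeping memory
            // usage flat regardless of the table's total size.
            User::query()->chunk(1000, function ($users) use ($handle) {
                foreach ($users as $user) {
                    fputcsv($handle, [$user->id, $user->name, $user->email]);
                }
            });

            fclose($handle);
        }, 200, $headers);
    }
}
```

Because the rows are written out as they are read, peak memory stays roughly the size of one chunk rather than the whole result set, which is what makes this workable for large exports.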