Muhammad Zamroni:
Streaming CSV Using PHP
Feb 16, 2018 @ 15:19:47

On his Medium.com site, Muhammad Zamroni has posted a quick tutorial showing how to stream CSV data back to the waiting client in a Laravel application.

In one of our applications, there’s a need to get a list of data from a service. This data is used to generate a report by a background process, resulting in an XLSX file that can be downloaded or attached to an email. This service (an API) is written in Laravel. The background process is also written in Laravel. Both services are hosted on different servers.

We pull data from MySQL and simply send the response in JSON to be easily parsed. The problem we encountered was the amount of data.

The main issue was the memory required to pull in all of the data and work with it. Based on suggestions from another article, they decided to switch the output from JSON to CSV and use Laravel's chunk handling to process the data in pieces. He includes the code for the controller, showing the use of chunk and a manual file pointer to push the data into the response 1,000 records at a time.
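To make the approach concrete, here is a minimal sketch of that pattern (the model, columns, and controller name are illustrative, not Zamroni's exact code): a Laravel controller action that streams the CSV response, writing rows through a file pointer in 1,000-record chunks.

```php
<?php
// Sketch: stream CSV from Laravel with chunked queries.
// Model, columns and controller name are assumptions for illustration only.

namespace App\Http\Controllers;

use App\Order; // hypothetical Eloquent model
use Symfony\Component\HttpFoundation\StreamedResponse;

class OrderExportController extends Controller
{
    public function export(): StreamedResponse
    {
        return response()->stream(function () {
            $out = fopen('php://output', 'w');
            fputcsv($out, ['id', 'customer', 'total']); // header row

            // chunk() pulls 1,000 rows at a time, so memory stays flat
            // no matter how large the result set is.
            Order::orderBy('id')->chunk(1000, function ($orders) use ($out) {
                foreach ($orders as $order) {
                    fputcsv($out, [$order->id, $order->customer, $order->total]);
                }
            });

            fclose($out);
        }, 200, [
            'Content-Type'        => 'text/csv',
            'Content-Disposition' => 'attachment; filename="orders.csv"',
        ]);
    }
}
```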

tagged: stream csv content response laravel chunk tutorial

Link: https://medium.com/@matriphe/streaming-csv-using-php-46c717e33d87

Grok Interactive:
Importing Large CSV Files with PHP Part 1: Import Using One Query
Sep 23, 2015 @ 17:19:33

The Grok Interactive blog has posted a tutorial, the first part in a series, showing you how to work with large CSV files in PHP.

Importing CSV files into your application can become a problem when the file is really big, > 65,000 rows big. Each row of the file needs to be parsed, converted into an object, and then saved to a database. All of this needs to happen within a 30 second timeout window. It may sound like an impossible task, but there are actually a couple of solutions that can solve this problem. While working on a project at Grok, I was tasked with doing exactly that.

He talks about the method he tried initially for parsing the large files: splitting them up into separate files and processing them as chunks. He points out that this relies on the file system, though, which made it difficult to debug. He finally came up with a different, simpler solution: importing the files directly into MySQL via a LOAD DATA LOCAL INFILE command. He shows how to set this up in a controller and "importer" class that handles the upload and import via the importFileContents method (complete code included). He walks through what the code is doing and includes a few notes about configuring the database connection with additional PDO options to allow the local file load.
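As a rough sketch of that idea (the table, columns, and connection details here are illustrative, not Grok's actual importer class), the whole import boils down to enabling local infile on the PDO connection and issuing a single query:

```php
<?php
// Sketch: import a CSV straight into MySQL with LOAD DATA LOCAL INFILE.
// Table name, column list, and connection details are assumptions.

$pdo = new PDO(
    'mysql:host=localhost;dbname=app;charset=utf8mb4',
    'user',
    'secret',
    [
        // Without this option MySQL refuses to read a client-side file.
        PDO::MYSQL_ATTR_LOCAL_INFILE => true,
    ]
);

$csvPath = '/tmp/uploaded.csv';

$sql = "LOAD DATA LOCAL INFILE " . $pdo->quote($csvPath) . "
        INTO TABLE contacts
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
        (name, email, phone)";

// One statement loads every row; the heavy lifting happens inside MySQL
// rather than in a PHP parsing loop.
$rowsLoaded = $pdo->exec($sql);
echo "Imported {$rowsLoaded} rows\n";
```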

tagged: tutorial csv file import large processing chunk mysql load file query

Link: http://www.grok-interactive.com/blog/import-large-csv-into-mysql-with-php-part-1/

Nikita Popov:
Fast request routing using regular expressions
Feb 19, 2014 @ 15:03:07

In his latest post Nikita Popov talks about routing and regular expressions. He also shares some work he's done to create a fast request router using them in "userland" code instead of a C extension.

Some time ago I stumbled on the Pux routing library, which claims to implement a request router that is many orders of magnitude faster than the existing solutions. In order to accomplish this, the library makes use of a PHP extension written in C. However, after a cursory look at the code I had the strong suspicion that the library was optimizing the wrong parts of the routing process. [...] To investigate the issue further I wrote a small routing library: FastRoute. This library implements the dispatch process that I will describe below.

He includes some benchmarks against the results from a C-based routing engine showing his solution performing slightly better. What he's really talking about, though, is the dispatch process in general, not just his implementation. He talks about "the routing problem" many engines face - having to loop through a potentially large set of routes to find a match. He offers an alternative using regular expressions and compiling all of the routes down into one large expression. He includes a simple implementation of the method and reruns the same benchmarks with some different results. He offers one potential solution for speeding it up using "chunked expressions" to break it down into more manageable matching. He includes benchmarks for this last solution as well, showing a slight improvement.
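As a rough illustration of the combined-expression idea (a sketch, not FastRoute's actual implementation; the route patterns and handler names are made up), the individual routes can be merged into one regex, with a (*MARK) on each alternative so a single preg_match reveals which route was hit:

```php
<?php
// Sketch: combine all routes into one regular expression and dispatch on the
// (*MARK) of the alternative that matched.

$routes = [
    '/user/(\d+)'         => 'showUser',
    '/article/(\d+)/edit' => 'editArticle',
    '/about'              => 'showAbout',
];

$alternatives = [];
foreach (array_keys($routes) as $i => $pattern) {
    $alternatives[] = $pattern . '(*MARK:' . $i . ')';
}
$handlers = array_values($routes);

// (?| resets group numbers in each alternative, so placeholder captures
// always start at $matches[1].
$combined = '~^(?|' . implode('|', $alternatives) . ')$~';

function dispatch(string $uri, string $combined, array $handlers): void
{
    if (!preg_match($combined, $uri, $matches)) {
        echo "404 for {$uri}\n";
        return;
    }

    $handler = $handlers[(int) $matches['MARK']];

    // Collect the captured placeholders (numeric keys > 0, non-empty).
    $args = [];
    foreach ($matches as $key => $value) {
        if (is_int($key) && $key > 0 && $value !== '') {
            $args[] = $value;
        }
    }

    echo $handler . '(' . implode(', ', $args) . ")\n";
}

dispatch('/user/42', $combined, $handlers);         // showUser(42)
dispatch('/article/7/edit', $combined, $handlers);  // editArticle(7)
dispatch('/missing', $combined, $handlers);         // 404 for /missing
```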

tagged: regularexpression routing dispatch engine chunk compile

Link: http://nikic.github.io/2014/02/18/Fast-request-routing-using-regular-expressions.html

Rob Young's Blog:
Chunking Large Queries with Iterators in PHP
Oct 07, 2009 @ 15:42:02

Since sometimes you just don't want all of the results of a query back at once, Rob Young has posted a solution of his own using the Iterators included with PHP as a part of the SPL. His solution is to wrap it in a ChunkedQueryIterator that handles the work behind the scenes.

When executing large queries it's usually best not to load the whole result set in one go. Memory isn't infinite and PHP isn't renowned for handling it very well. So the obvious answer is to chunk the large query into lots of smaller queries. [...] We want something to which we can just provide a PDO object, an SQL query and the chunk size. We should then be able to iterate over the resulting object as though it were a single result set.

He includes two code snippets of it in action, but asks the question of his readers - "How do you handle large database queries?" - to get some feedback on other alternatives.
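For reference, here is a minimal, modern sketch of the idea (not Rob's original code, and it assumes the query has no LIMIT/OFFSET of its own): wrap the PDO handle, the SQL, and a chunk size so callers can foreach over the whole result set while only one chunk is held in memory at a time.

```php
<?php
// Sketch of a chunked query iterator. The LIMIT/OFFSET strategy and class
// shape are assumptions for illustration, not the original implementation.

class ChunkedQueryIterator implements IteratorAggregate
{
    public function __construct(
        private PDO $pdo,
        private string $sql,
        private int $chunkSize
    ) {}

    public function getIterator(): Generator
    {
        $offset = 0;
        do {
            $stmt = $this->pdo->prepare(
                $this->sql . ' LIMIT ' . $this->chunkSize . ' OFFSET ' . $offset
            );
            $stmt->execute();
            $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

            foreach ($rows as $row) {
                yield $row; // hand out one row at a time
            }

            $offset += $this->chunkSize;
        } while (count($rows) === $this->chunkSize); // stop on a short chunk
    }
}

// Usage: iterate as though it were a single result set.
// $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
// foreach (new ChunkedQueryIterator($pdo, 'SELECT * FROM orders', 1000) as $row) {
//     // ... process $row ...
// }
```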

tagged: chunk large query iterator pdo

Link:

Michael Caplan's Blog:
Don't Forget to Flush
Jan 08, 2009 @ 18:09:15

In this recent post to his blog Michael Caplan looks at a feature of PHP that's sometimes forgotten when pushing out larger chunks of data - flushing.

As a recluse who prefers hiding behind servers rather than dancing around your web browser’s canvas, I was intrigued with their server side recommendations - however sparse they may be. In particular, flushing generated head content early to speed up overall page delivery and rendering time was a technique new to me.

Michael looks at what "flushing generated head content" means and includes a scenario - pulling the top palettes from the COLOURlovers site - and some performance stats on page load time and response time directly from the server (complete with graphs).
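The technique itself is simple enough to show in a few lines; this is a bare-bones sketch of the idea (the markup and the slow step are stand-ins, not Caplan's code):

```php
<?php
// Sketch: flush the generated <head> to the browser before the slow,
// data-dependent part of the page is built, so CSS/JS downloads can start early.

header('Content-Type: text/html; charset=utf-8');

echo '<!DOCTYPE html><html><head>';
echo '<link rel="stylesheet" href="/css/site.css">';
echo '<script src="/js/site.js" defer></script>';
echo '</head><body>';

// Push what we have so far down the wire; flush any output buffer first
// (output_buffering may be enabled in php.ini).
if (ob_get_level() > 0) {
    ob_flush();
}
flush();

// Slow server-side work happens after the flush: a stand-in for pulling
// the top palettes from the COLOURlovers API in the post's scenario.
sleep(2);
echo '<p>Page body generated after the slow work.</p>';

echo '</body></html>';
```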

tagged: flush chunk compress head content load time statistics response

Link:

ThinkPHP Blog:
Handling large files with(out) PHP
Aug 02, 2006 @ 10:47:06

On the ThinkPHP blog today, there's a quick hint about dealing with larger files both with and without PHP.

As one man was quoted, "640K of memory should be enough for anybody," no one will need to access more than 2 GB of data. What happens if you - just for scientific reasons, of course - try to access larger files using your 32-bit hardware and your favorite programming language, PHP?

They give the example of opening a large 2 GB file with PHP and the resulting error that pops up. They try a few different approaches before settling on more of a non-PHP PHP solution (yes, you read that right): a script that works with the file in chunks, using an exec() call to the unix split command to break it up.
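A bare-bones sketch of that workaround might look like the following (the file paths and chunk size are illustrative):

```php
<?php
// Sketch: instead of fopen()ing a file too big for 32-bit PHP, shell out to
// the unix `split` command and process the resulting chunks one at a time.

$bigFile     = '/data/huge.log';      // larger than the 2 GB fopen() limit
$chunkPrefix = '/tmp/huge_chunk_';

// Split into 512 MB pieces: huge_chunk_aa, huge_chunk_ab, ...
exec('split -b 512m ' . escapeshellarg($bigFile) . ' ' . escapeshellarg($chunkPrefix));

foreach (glob($chunkPrefix . '*') as $chunk) {
    $handle = fopen($chunk, 'rb');
    while (!feof($handle)) {
        $line = fgets($handle);
        // ... process each line ...
    }
    fclose($handle);
    unlink($chunk); // clean up as we go
}
```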

tagged: file handling large fopen error split chunk exec

Link:
