News Feed

Nikita Popov:
Fast request routing using regular expressions
February 19, 2014 @ 09:03:07

In his latest post, Nikita Popov talks about routing and regular expressions. He also shares some work he's done to create a fast request router using them in "userland" code instead of a C extension.

Some time ago I stumbled on the Pux routing library, which claims to implement a request router that is many orders of magnitude faster than the existing solutions. In order to accomplish this, the library makes use of a PHP extension written in C. However, after a cursory look at the code I had the strong suspicion that the library was optimizing the wrong parts of the routing process. [...] To investigate the issue further I wrote a small routing library: FastRoute. This library implements the dispatch process that I will describe below.

He includes benchmarks against the results from a C-based routing engine, showing his userland solution performing slightly better. What he's really talking about, though, is the dispatch process in general, not just his implementation. He describes "the routing problem" many engines face: having to loop through a potentially large set of routes, testing each one until a match is found. As an alternative, he suggests compiling all of the routes down into one large regular expression so that a single match attempt replaces the loop. He includes a simple implementation of the method and reruns the same benchmarks with some different results. He then offers one potential way of speeding it up further, "chunked expressions", which break the single expression down into smaller groups for more manageable matching; his benchmarks for this last solution show a slight improvement as well.
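
To make the idea concrete, here's a minimal sketch of the combined-expression approach. It is not FastRoute's actual code: the route table and handler names are invented, and PCRE's (*MARK) verb (available since PHP 5.5) is used here as one possible way of telling which alternative matched.

<?php
// Sketch of the "compile all routes into one regex" idea; each alternative
// ends in a (*MARK:name) verb so a single preg_match() call can report
// which route matched. Route patterns and handler names are made up.
$routes = [
    'showUser' => '/user/(\d+)',
    'showPost' => '/post/(\d+)/comments',
    'home'     => '/',
];

$alternatives = [];
foreach ($routes as $handler => $pattern) {
    $alternatives[] = $pattern . '(*MARK:' . $handler . ')';
}

// One combined expression instead of one preg_match() per route.
$combined = '~^(?:' . implode('|', $alternatives) . ')$~';

function dispatch($combined, $uri)
{
    if (!preg_match($combined, $uri, $matches)) {
        return ['notFound', []];
    }
    $args = [];
    foreach ($matches as $key => $value) {
        // Skip the full match (key 0) and the 'MARK' entry; capture groups
        // belonging to non-matching branches come back as empty strings,
        // which is fine to filter out in a sketch like this.
        if (is_int($key) && $key > 0 && $value !== '') {
            $args[] = $value;
        }
    }
    return [$matches['MARK'], $args];
}

var_dump(dispatch($combined, '/user/42')); // ['showUser', ['42']]
var_dump(dispatch($combined, '/nope'));    // ['notFound', []]

The "chunked" variant applies the same idea to smaller groups of routes at a time, so each compiled expression stays a manageable size.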

Link: http://nikic.github.io/2014/02/18/Fast-request-routing-using-regular-expressions.html

Rob Young's Blog:
Chunking Large Queries with Iterators in PHP
October 07, 2009 @ 10:42:02

Since sometimes you just don't want all of a query's results back at once, Rob Young has posted a solution of his own using the Iterators included with PHP as part of the SPL. His solution wraps the query in a ChunkedQueryIterator that handles the chunking behind the scenes.

When executing large queries it's usually best not to load the whole result set in one go. Memory isn't infinite and PHP isn't renowned for handling it very well. So the obvious answer is to chunk the large query into lots of smaller queries. [...] We want something to which we can just provide a PDO object, an SQL query and the chunk size. We should then be able to iterate over the resulting object as though it were a single result set.

He includes two code snippets of it in action, but also asks his readers - "How do you handle large database queries?" - to get some feedback on other alternatives.
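
Rob's own code isn't reproduced above, but going by his description, a wrapper along these lines would fit the bill. The class name comes from his post; the LIMIT/OFFSET pagination below is an assumption about how it might work, not his actual implementation.

<?php
// A guess at the general shape of the wrapper from the post's description;
// the LIMIT/OFFSET pagination is an assumption, not Rob's actual code.
class ChunkedQueryIterator implements Iterator
{
    private array $chunk = [];
    private int $offset = 0;
    private int $position = 0;

    public function __construct(
        private PDO $pdo,
        private string $sql,
        private int $chunkSize
    ) {
    }

    private function fetchChunk(): void
    {
        // Only $chunkSize rows are ever held in memory at a time.
        $stmt = $this->pdo->query(sprintf(
            '%s LIMIT %d OFFSET %d',
            $this->sql, $this->chunkSize, $this->offset
        ));
        $this->chunk = $stmt->fetchAll(PDO::FETCH_ASSOC);
        $this->offset += $this->chunkSize;
    }

    public function rewind(): void
    {
        $this->offset = 0;
        $this->position = 0;
        $this->fetchChunk();
    }

    public function valid(): bool
    {
        return $this->chunk !== [];
    }

    public function current(): mixed
    {
        return current($this->chunk);
    }

    public function key(): mixed
    {
        return $this->position;
    }

    public function next(): void
    {
        $this->position++;
        // When the current chunk runs out, quietly fetch the next one.
        if (next($this->chunk) === false) {
            $this->fetchChunk();
        }
    }
}

// Usage: iterate a large table as though it were one result set
// ($pdo is an existing PDO connection; table and chunk size are invented).
$rows = new ChunkedQueryIterator($pdo, 'SELECT * FROM posts ORDER BY id', 500);
foreach ($rows as $row) {
    // ... process $row ...
}

One trade-off with OFFSET-based paging is that many databases re-scan the skipped rows on every chunk, so keyset pagination (a WHERE id > :last condition) tends to scale better on very large tables.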


Michael Caplan's Blog:
Don't Forget to Flush
January 08, 2009 @ 12:09:15

In this recent post to his blog, Michael Caplan looks at a feature of PHP that's sometimes forgotten when pushing out larger chunks of data: flushing.

As a recluse who prefers hiding behind servers rather than dancing around your web browser's canvas, I was intrigued with their server side recommendations - however sparse they may be. In particular, flushing generated head content early to speed up overall page delivery and rendering time was a technique new to me.

Michael looks at what "flushing generated head content" means and includes a scenario - pulling the top palettes from the COLOURlovers site - and some performance stats on page load time and response time directly from the server (complete with graphs).
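
If the technique is new to you as well, the pattern itself is small: emit the head, flush it, then do the slow work. Everything in this sketch - the markup and the render_body() stand-in - is hypothetical.

<?php
// Hypothetical page skeleton showing the early-flush pattern: send the
// <head> right away so the browser can start downloading CSS/JS while
// the server is still doing the slow work of building the body.

// Stand-in for the slow part of page generation (database queries,
// remote API calls, and so on).
function render_body(): string
{
    usleep(200000); // pretend to do expensive work
    return '<p>...the rest of the page...</p>';
}

echo '<html><head>';
echo '<link rel="stylesheet" href="/css/site.css">';
echo '<script src="/js/site.js"></script>';
echo '</head><body>';

if (ob_get_level() > 0) {
    ob_flush(); // push PHP's own output buffer out to the server, if active
}
flush();        // ask the server/SAPI to send what it has to the client

echo render_body();
echo '</body></html>';

Whether those bytes actually reach the browser early also depends on what sits between PHP and the client; output compression and proxy buffering can hold the response back until it's complete.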


ThinkPHP Blog:
Handling large files with(out) PHP
August 02, 2006 @ 05:47:06

On the ThinkPHP blog today, there's a quick hint about dealing with larger files, both with and without PHP.

As one man was quoted, "640K of memory should be enough for anybody," no one will need to access more than 2 GB of data. What happens if you - just for scientific reasons, of course - try to access larger files using your 32-bit hardware and your favorite programming language, PHP?

They give the example of opening a large 2 GB file with PHP and the resulting error that pops up. They try a few different approaches before settling on more of a non-PHP PHP solution (yes, you read that right): rather than read the file directly, their script works with it in chunks, using an exec() call to the Unix split command to break it up.
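
Their exact script isn't shown here, but the approach looks roughly like this; the file name, piece size, and output prefix are invented for the example.

<?php
// Sketch of the approach described: hand the too-big file to the Unix
// split(1) command via exec(), then work on the pieces, each of which
// is small enough for fopen() on a 32-bit build. Paths and the 500MB
// piece size are made up for the example.
$source = '/data/huge.log';

$output = [];
$exitCode = 1;
// -b 500m cuts the file into 500MB pieces named chunk_aa, chunk_ab, ...
exec('split -b 500m ' . escapeshellarg($source) . ' chunk_', $output, $exitCode);
if ($exitCode !== 0) {
    exit("split failed\n");
}

foreach (glob('chunk_*') as $piece) {
    $fh = fopen($piece, 'rb'); // no 2GB limit problem any more
    while (($line = fgets($fh)) !== false) {
        // ... process $line ...
    }
    fclose($fh);
}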

