John Lim has two new posts covering parallel processing in PHP and how to use this "divide and conquer" idea not only to speed up your code but also to make it more maintainable down the road.
In the first post:
One problem we were having is that some of our batch processing jobs were taking too long to run. In order to speed the processing, we tried to split the processing file into half, and let a separate PHP process run each job. [...] Here is our technique for running multiple parallel jobs in PHP. In this example, we have two job files: j1.php and j2.php we want to run.
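The quoted approach boils down to a small controller that starts each job file as a separate PHP process and then waits for all of them to finish. Here is a minimal sketch of that idea, assuming PHP 7.4+ (for the array form of `proc_open`) and a `php` binary on the PATH; `run_parallel()` and the demo commands are illustrative, not John Lim's exact code.

```php
<?php
// Launch every command as a background process, then wait for each one
// to exit. Returns the exit status of each command, keyed by index.
function run_parallel(array $commands): array
{
    $procs = [];
    foreach ($commands as $i => $cmd) {
        // proc_open() returns immediately, so all the jobs end up
        // running concurrently; the empty descriptor spec lets each
        // child inherit the controller's stdout/stderr.
        $procs[$i] = proc_open($cmd, [], $pipes);
    }

    $statuses = [];
    foreach ($procs as $i => $proc) {
        // proc_close() blocks until that particular process exits.
        $statuses[$i] = proc_close($proc);
    }
    return $statuses;
}

// With the article's files the commands would be ['php', 'j1.php'] and
// ['php', 'j2.php']; two short sleeps stand in for the job files here.
$statuses = run_parallel([
    ['php', '-r', 'usleep(100000);'],
    ['php', '-r', 'usleep(100000);'],
]);
```

Because both processes are started before either is waited on, the total wall time is roughly that of the slowest job rather than the sum of both.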
The code is included for both the job files and the "controller" that manages them. In the second article, he builds on this with a more practical example: finding the median value of a set of records pulled from a database.