John Lim has a new blog post today sharing his method for simulating parallel processing inside a PHP application.
One problem we were having is that some of our batch processing jobs were taking too long to run. To speed up the processing, we tried splitting the processing file in half and letting a separate PHP process run each job. Since we were using a dual-core server, each process could run at close to full speed (subject to I/O constraints).
He shows two "job" files that simply echo the job name and the number of seconds each has been running, along with a "control.php" script that uses stream connections (pointed at localhost) to launch the jobs independently of the main script. A helper function then polls each stream resource until it hits EOF and returns that job's output.
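The launch-then-poll pattern is easy to sketch. Lim's version points stream connections at job scripts served from localhost; the sketch below substitutes popen() to spawn PHP CLI workers (an assumption made so the example is self-contained, since it needs no web server), but the central idea is the same: start every job first, then read each stream until feof() signals the job has finished.

```php
<?php
// Minimal sketch of the launch-then-poll pattern. The worker command
// and job names here are hypothetical stand-ins for Lim's job scripts.

// Start all workers before reading anything, so they run in parallel.
$jobs = [];
foreach (['job1', 'job2'] as $name) {
    // Each worker simulates a second of work, then reports its name.
    $cmd = 'php -r ' . escapeshellarg("sleep(1); echo '$name done';");
    $jobs[$name] = popen($cmd, 'r');
}

// Drain each stream; EOF means that worker process has exited.
$results = [];
foreach ($jobs as $name => $stream) {
    $output = '';
    while (!feof($stream)) {
        $output .= fread($stream, 4096);
    }
    pclose($stream);
    $results[$name] = $output;
}

print_r($results);
```

Because both workers are launched before either is waited on, the total wall time is roughly one sleep, not two; that is the speedup Lim is after on a dual-core box.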