Speeding up the Zend Framework

Like many frameworks, the Zend Framework is fairly heavy by itself. Today, I decided to profile some running code to see how all that load was distributed, and whether I could pinpoint the problem to one particular location. I found a major one. Just to get things booted up and route the query to the appropriate method, over 40 files get included via require_once(). IO is definitely a problem here. I didn't keep the original cachegrind trace (from xdebug), but over half of the processing time was lost in those calls. Even with APC or some other opcode cache, the calls still have to be made and the code has to be retrieved.
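For reference, getting a cachegrind trace out of xdebug only takes a couple of ini settings (the output directory here is just an example, and these are xdebug 2.x option names):

```ini
; php.ini — turn on xdebug's profiler for every request
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp
```

The resulting cachegrind.out.* files can then be opened in KCachegrind or a similar viewer.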

The basic idea was to merge all those includes into one big file that could be cached properly, avoiding all those require_once() calls. The easiest way to do it was a quick-and-dirty PHP script that grabs all those files and aggregates them, stripping the PHP tags and the require_once() calls inside the files along the way. Here it is:

<?php
// Read the list of framework files to merge, one path per line.
$files = file( 'files.txt' );

$fp = fopen( 'zend_core.php', 'w' );
fwrite( $fp, "<?php\n" );

foreach( $files as $file ) {
    // Strip the opening "<?php" tag (5 characters).
    $content = substr( file_get_contents( trim( $file ) ), 5 );
    // Strip any closing tag and the require_once() calls.
    $content = str_replace( '?>', '', $content );
    $content = preg_replace( '/require_once\s*\(?\s*[\'"][^;]+;/', '', $content );
    fwrite( $fp, $content );
}

fwrite( $fp, "?>" );
fclose( $fp );


Prior to running the script, I got the list of files that get loaded with a simple call to get_required_files(). After some path clean-up, these are the files that got loaded in my setup. The list will vary depending on which components of the framework you use. I generated it from a simple call to a very basic controller; I expect some files to still be loaded on the fly for the rarely used components.
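To capture that list yourself, a small snippet at the end of a request will do. This is only a sketch — the dispatch call and paths will depend on your own bootstrap, and the path clean-up mentioned above is left out:

```php
<?php
// index.php — a minimal bootstrap, then dump what got included.
require_once 'Zend/Controller/Front.php';
Zend_Controller_Front::run( './application/controllers' );

// get_required_files() returns every file PHP included or required
// during this request, as absolute paths, one entry per file.
file_put_contents( 'files.txt', implode( "\n", get_required_files() ) . "\n" );
```

Hit the page once in a browser, and files.txt contains the exact include list for that request, ready to feed to the aggregation script.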


I stored the list in files.txt and the aggregation script built a 300k file. Including that file instead of the individual ones gave around a 30% speed improvement (benchmarked with ab -c10 -n1000) on a page that doesn't really represent normal usage. Still quite an interesting gain. Of course, this approach is completely inconvenient for the framework developers, but it's nice for those using it. The interesting aspect is that after these changes, the traces provided by xdebug were a lot more meaningful, although they are not very precise, as the slow parts would change from run to run.

So, is the Zend Framework slow and bloated? Well, it certainly is more than a straight PHP script, but it does have benefits. At the moment, it's only a preview release, and I expect quite a few changes before the final release to improve the performance of some components. I have been using it quite extensively on a project for a little over a week now, and the way the controller component handles request dispatching is just too convenient not to use. Plus, there are plenty of hooks in the controller actions to handle all sorts of special cases, like page caching or authorization. I didn't use that many components. I had a few bad experiences as well (would you really expect something as simple as a logging class not to work?), but overall, I think there is a lot of potential ahead.

6 thoughts on “Speeding up the Zend Framework”

  1. I had exactly the same issue and a very high performance gain doing exactly the same thing on our own framework. We had roughly 2 meg of code being included once we merged them together. We did question whether it was realistic to have that being required on pages dynamically or whether to start writing C++ and embedding functions directly into PHP.

  2. Interesting! I like the idea a lot but can’t get it to work. How do you avoid “Cannot redeclare class Zend_XXXX in …” messages? Sooner or later some other Zend class (one that you didn’t compact) might have a require_once “Zend/XXXX.php” call..

    Thanks for your help!

  3. True. I really only tested this locally, but that’s a problem that could occur. I just never had enough performance problems in production to really need to move forward with this solution. I guess one thing that could be done is to scan all files and remove includes of those files that were compacted. A fairly simple script could do it.
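    The script the reply describes could look something like this — a rough sketch, assuming files.txt holds the compacted list and `Zend/` is the library checkout being rewritten (both paths are hypothetical). It strips require_once() calls that point at files already merged into zend_core.php, which avoids the redeclare errors:

    ```php
    <?php
    // List of files already merged into zend_core.php, one path per line.
    $compacted = array_map( 'trim', file( 'files.txt' ) );

    $iterator = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator( 'Zend/' ) );

    foreach( $iterator as $path => $info ) {
        if ( substr( $path, -4 ) !== '.php' ) {
            continue;
        }
        $code = file_get_contents( $path );
        foreach ( $compacted as $merged ) {
            // Drop only the require_once calls that name a merged file.
            $pattern = '/require_once\s*\(?\s*[\'"]'
                     . preg_quote( $merged, '/' )
                     . '[\'"]\s*\)?\s*;/';
            $code = preg_replace( $pattern, '', $code );
        }
        file_put_contents( $path, $code );
    }
    ```

    The per-file preg_quote() keeps the match exact, so require_once calls for files outside the compacted set are left alone.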
