Thoughts on Systems

Emil Sit

Apr 17, 2006 - 2 minute read - Hacking dokuwiki tools

Saving bandwidth with DokuWiki

I recently installed DokuWiki on NearlyFreeSpeech; while I love DokuWiki’s features, I quickly noticed that I was being charged for more bandwidth than seemed necessary for the few pages I was viewing and editing.

A quick check of the access logs revealed two things. First, DokuWiki does not compress its output using gzip. Second, it does not send appropriate cache-control headers to allow essentially static data (e.g., style sheets) to be cached.

A quick Google search reveals that it’s easy to compress output from PHP. For example, Jan-Piet Mens added one line to doku.php to turn on gzip output compression. I borrowed a snippet from WordPress’s gzip_compression function and added it to inc/init.php (after the session initialization code):

// Hack: enable gzip output compression -ES
// ob_gzhandler only compresses when the client advertises gzip
// support via its Accept-Encoding header, so other clients still
// receive plain output.
if ( extension_loaded('zlib') ) {
  ob_start('ob_gzhandler');
}
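Note that ob_start() must run before any output is sent, which makes init.php a natural home for this snippet: every DokuWiki script that goes through the normal initialization picks up compression automatically.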

This has the benefit of affecting any file that generates output, including CSS and JS files. (DokuWiki recently introduced its own bizarre CSS/JS compression scheme that breaks Monobook for DokuWiki; gzip compression seems simpler and less error-prone.)

I also observed that my browser was repeatedly requesting lib/exe/css.php and lib/exe/js.php; it turns out that others have raised this issue in just the past few weeks. On 10 April 2006, a set of patches was committed that properly generates ETags and Last-modified headers and allows the resulting output to be cached without checking for at most one hour. I manually applied these patches (with this helper patch); where I used to transfer 11k worth of CSS and 70k of JS for each page view, now I send about 2k of CSS and 17k of JS once an hour. My pages load quicker too!