Creating a WordPress farm on a Bitnami server



This is a work in progress and may be updated in future. Please feel free to comment.

The Bitnami stack for deployment on AWS is beautifully crafted.

  • Ubuntu
    • Apache2
      • WordPress (single instance)
      • Other PHP applications

Apache2 configuration

Apache is configured so that there is a default website.


with a configuration file




Bitnami optimise the number of threads, workers, and timeouts for the Apache server by including a softlink whose target is set when the server is built or resized:

Include "/opt/bitnami/apache2/conf/bitnami/httpd.conf"


To host the actual applications it is necessary to override the configuration that gives us the default website. Bitnami do this with a line near the bottom of the file:

Include "/opt/bitnami/apache2/conf/bitnami/bitnami.conf"


This overrides the default settings of the Apache webserver and loads the appropriate SSL modules. (I am not exactly sure what is happening in this file yet.)

At the bottom of this file we have the key line that actually loads the application websites into Apache.

Include "/opt/bitnami/apache2/conf/bitnami/bitnami-apps-vhosts.conf"

You will see that this file is aptly named “bitnami-apps-vhosts.conf”. In the standard Bitnami distribution it includes a single WordPress site.


Because we want to host a WordPress farm we will replace this line with

Include "/opt/bitnami/apps/wordpress-farm.conf"

which should have owner and group “bitnami” and permissions 0644.


This file contains a single entry for each site. The first site listed is the default that comes with the Bitnami distribution; the others adopt my own naming convention.

# james bayley 
# license MIT
Include "/opt/bitnami/apps/wordpress/conf/httpd-vhosts.conf"
Include "/opt/bitnami/apps/"
Include "/opt/bitnami/apps/"
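For illustration, a complete farm file might look like the sketch below. The site names alpha.example.com and beta.example.com are hypothetical, as is the per-site directory layout; the point is simply that each line pulls in one site's own httpd-vhosts.conf.

```apache
# /opt/bitnami/apps/wordpress-farm.conf -- hypothetical complete example.
# The first entry is the stock Bitnami WordPress site; the others follow
# a per-site directory convention (site names invented for illustration).
Include "/opt/bitnami/apps/wordpress/conf/httpd-vhosts.conf"
Include "/opt/bitnami/apps/alpha.example.com/conf/httpd-vhosts.conf"
Include "/opt/bitnami/apps/beta.example.com/conf/httpd-vhosts.conf"
```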


Individual site configuration

The individual site is configured using two files


<VirtualHost *:80>
 ServerAlias *
 DocumentRoot "/opt/bitnami/apps/"
 Include "/opt/bitnami/apps/"
</VirtualHost>

The ServerAlias is used because I am actually using WordPress Multisite. This requires both a server alias and a wildcard entry in the DNS to support users’ individual websites.
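The DNS side of this needs both the base record and the wildcard. A sketch of the relevant zone-file entries, using the hypothetical domain example.com and address 203.0.113.10:

```
; Base record plus wildcard so every Multisite subdomain resolves
example.com.     IN  A  203.0.113.10
*.example.com.   IN  A  203.0.113.10
```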

If the site is SSL enabled there will be an additional entry like,

<VirtualHost *:443>
 ServerAlias *
 DocumentRoot "/opt/bitnami/apps/"
 SSLEngine on
 SSLCertificateFile "/opt/bitnami/apps/"
 SSLCertificateKeyFile "/opt/bitnami/apps/"
 SSLCertificateChainFile "/opt/bitnami/apps/"
 Include "/opt/bitnami/apps/"
</VirtualHost>

and you will need to put your certificates in the location shown.


This contains

  • Directory level security options
  • rewrite rules to support multisite
  • PHP-FPM options

Testing the Server Configuration

Remember to restart Apache after each change.

When implementing this configuration, put a simple “hello world” page as index.html in your virtual host’s htdocs directory and verify that it is working before trying to get the WordPress application working.
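Creating the placeholder page is a one-liner; a minimal sketch, using a stand-in directory (substitute your own site's htdocs path):

```shell
# ./htdocs-demo stands in for the real path, e.g. a site's htdocs
# directory under /opt/bitnami/apps/ (path is hypothetical).
HTDOCS=./htdocs-demo
mkdir -p "$HTDOCS"
echo '<html><body>hello world</body></html>' > "$HTDOCS/index.html"
cat "$HTDOCS/index.html"
```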

Installing WordPress

When you fire up a Bitnami stack you get everything prebuilt for you. For subsequent servers you will have to do it manually (unless you write or find a script for it). The steps are:

  1. Create a database (unless you are going to use the same one with a different table prefix)
  2. Download a fresh install of WordPress
  3. Copy the WordPress files into the htdocs directory
  4. run http://(your sitename here)/wp-admin/install.php
  5. WordPress is now running

(detailed instructions)
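Steps 2 and 3 above can be sketched as shell commands. Everything here is illustrative: the site path is hypothetical, and the download is simulated with a locally built archive so the copy step can be shown; on a real server you would fetch https://wordpress.org/latest.tar.gz instead.

```shell
# Hypothetical per-site directory; on Bitnami this would live under /opt/bitnami/apps/
SITE=./apps-demo/mysite
mkdir -p "$SITE/htdocs"

# Step 2: download a fresh install of WordPress.
# Simulated here with a local archive; on a real server, fetch
# https://wordpress.org/latest.tar.gz and extract it the same way.
mkdir -p wordpress
echo '<?php /* WordPress placeholder */' > wordpress/index.php
tar -czf latest.tar.gz wordpress
tar -xzf latest.tar.gz

# Step 3: copy the WordPress files into the htdocs directory
cp -R wordpress/. "$SITE/htdocs/"

# Step 4 is done in the browser: visit http://<your sitename>/wp-admin/install.php
```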

Installing WordPress Multisite

WordPress Multisite cannot be manually installed directly. You must first install WordPress and then upgrade the site by following these instructions.

This process adds additional tables to the database. Failure to follow it exactly may result in infinite loops and other WordPress horrors.

To enable file uploads and the installation of plugins it is necessary to make daemon the owner and group of


with the permissions 0755, and to apply those permissions recursively. If you still have problems uploading, verify that these permissions have been set correctly.
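A sketch of the permission changes, run against a local stand-in directory; on the server you would point it at the real htdocs path and run the chown as root.

```shell
# ./wp-demo stands in for the WordPress htdocs directory (path hypothetical).
WP_DIR=./wp-demo
mkdir -p "$WP_DIR/wp-content/uploads"
# On the Bitnami server you would run: sudo chown -R daemon:daemon "$WP_DIR"
# (not run here because the daemon user only exists on the server)
chmod -R 0755 "$WP_DIR"
stat -c '%a' "$WP_DIR"
```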


This post shows how to create a WordPress web farm, but once the flow of control is understood it is easy to add virtual hosts running static websites or other applications.

Understanding WordPress multisite users

– this article was updated on 4 April 2014 after a fix was identified –

I have spent the whole day trying to understand WordPress users in a multisite environment. I’m not sure if it is very buggy, or I am thick, or a bit of both.

A multisite instance, henceforth called the “network”, has its own domain of users. Every member of a blog on the network is automatically a member of the network.

  • Users register with the network, not an individual blog
  • A single email address can only be associated with one network username

It is impossible to operate the network without the users knowing that you are operating a network. For example, this blog is presently hosted by . A user cannot register directly with this blog; they must be invited to register with , whereupon the Network Administrator (Mr WordPress) or the Site Administrator (me) can give them permissions on this blog.

The network admin can create users in the network administration dashboard, but by default users must be added to each blog by hand. However, by installing the Multisite User Management plug-in it is possible to add a user to the network and for them to receive a default role on each blog.

Gotcha for plugin authors and others

In a non-multisite install, if a user is authenticated then they will have the role “Subscriber” or greater on the blog. This is not true of a multisite install: they may be authenticated to the network but have the role “none” on the blog that they are trying to access. Authors of security-related plugins and themes need to be aware of this.

Bugs in WordPress 3.8.1 and BuddyPress 1.9.2

In my scenario I do not allow self-registration; users must be created by either the site administrator or the network administrator. (The network settings option “allow site administrators to add new users to their site” is enabled.)

  • Users created by site administrators using the option “do not send confirmation email” skip the activation step, but their welcome email does not include their password, rather the characters [user set]. This makes registration by site administrators pointless.


Microsoft OneDrive is better than Google Docs



I needed to share an Excel document, so I investigated Microsoft OneDrive. Microsoft is late to the file-syncing party, Dropbox and Google Docs having got well ahead.

I’ve used Dropbox a lot because it just works; it is a really effective and painless way of sharing files quickly. This is their core business and they are really good at it.

Google Docs is different: although it has sync functionality, its core value has always been collaborative editing of documents online. Microsoft was very late to this arena, perhaps facing the innovator’s dilemma of cannibalising their own very successful office suite. However, Microsoft now have a fully functional collaborative document editing suite called Office Apps, which is nicely integrated into their OneDrive product.

I simply opened a free online account with Microsoft, uploaded my Excel document, shared a link, and now my collaborators can all edit the document simultaneously.

Why is it better than Google Docs? Because it is Excel – doh!

I have only used OneDrive for a few minutes so if you have any experiences you would like to share or comments on the comparison with Google Docs please comment below.

Better than Apple Time Machine but for a PC



Over the last month I’ve learnt a lot about backup. There have been significant structural changes over the last few years: disk is cheap, bandwidth is inexpensive, and remote storage is coming down in price dramatically.

It is necessary to address two different issues, protecting your data files and enabling rapid recovery of a laptop or workstation when the hard drive fails.

Protecting data files

It is no longer necessary or desirable to store backups of current files in proprietary containers such as “backup sets”. You should simply use a copy utility to echo local files to remote storage. The requirements are

  • It should be possible to schedule jobs for unattended execution
  • An email should be sent if the job fails

I am trialling the product GoodSync from Siber Systems. This supports many protocols, so it is possible to back up to local servers or any number of cloud servers.

I find it acceptable to rely on my backup archives to recover historic copies of my files; however, it is possible to find sync utilities that will keep a number of previous versions for you.

Enabling rapid recovery when a hard drive fails

To recover a laptop from a crashed drive it is necessary to have a disk image. This not only has all the data at the time of the backup but also the necessary boot records.

I trialled two products: Acronis True Image 2014, which I found unreliable, and StorageCraft ShadowProtect Desktop.

ShadowProtect is an excellent product; it is designed by techies for techies. Key features include:

  • No impact on performance during incremental backups
  • Logging is excellent
    • progress bars are reasonably accurate
    • emails can be sent on success or failure
  • The user interface is very well designed
    • Documentation is very good
  • Very fast incremental backups
    • Hourly incremental backups take about 30 seconds
  • Automatic consolidation of incremental backups using the ImageManager tool
    • It is possible to reproduce the Apple Time Machine behaviour
      • One backup every hour for the last day
      • One backup every day for the last week
      • One backup every week for the last year
      • (This is only one of many possible backup regimes)

Using Windows built-in disk manager I partitioned my laptop’s 1 TB drive into an operating system partition (C:) and backup partition (E:). I then set up ShadowProtect to create incremental backups to the E: drive as described above.

I use GoodSync to copy the backup files from the laptop to my home server once a day. If I’m away from home and then return I can run a GoodSync job manually or simply wait for its next scheduled execution.

ShadowProtect also provides a bootable ISO image so that you can create a recovery disk to launch the ShadowProtect software when your drive fails. Rather than create a CD I simply burnt the ISO image to a USB memory stick using the excellent Rufus utility.

ShadowProtect’s website is very ugly; I suspect that rather than hiring a web designer they decided to spend more time making the product better. You also have to fill out a form to get a trial copy, and remember to ask for a trial of the recovery environment as well. However, it is worthwhile getting through these hurdles because the product is very effective.

You may read earlier articles in this series here.

Tiny 64 GB USB stick is like another laptop hard drive



One problem with laptops is that they only have one drive. This limits space and data redundancy. This tiny USB drive is designed to be installed permanently in your laptop.

An added bonus is that it could be bootable, giving you access either to a Linux machine or to recovery media for the laptop while travelling. (See other posts for details of my backup journey.)

Why I removed Acronis True Image 2014



I have just removed Acronis True Image 2014 from my laptop and breathed a sigh of relief. Backup software should make you feel that your computer is safer than it was before; Acronis did not.

Acronis’ interface and messages are so chaotic that one is in a perpetual state of anxiety about where one’s data is and what progress has been made with either a backup or recovery.

Acronis has perhaps the most comprehensive set of features of any backup product. Unfortunately they don’t seem to work, or more exactly they don’t work very well for me.

Non-stop backup is unreliable

I created an Acronis Secure Zone (i.e. hidden partition) on my laptop’s C drive and set non-stop backup running. This creates an incremental backup every 5 minutes and is intended to allow the recovery of accidentally deleted files. It can also be used to restore the hard drive if the operating system is corrupted and won’t boot but there is nothing wrong with the disc. This is particularly valuable for people like myself who travel with their laptop and don’t have access to the backup archive.

I “tested” the file restore functionality by accidentally deleting 18 GB of files, but after many hours the restore crashed.

Acronis non-stop backup frequently stopped during the day for no apparent reason, and would randomly either start or not start at boot.

When it is running it seems to impair performance.

Restore from archive is buggy and slow

The primary use case for backup software is to create a remote archive. I set a nightly job to create differential backups to my Windows Home Server.

Having failed to restore the files from the non-stop backup archive, I used the restore functionality to restore them from the nightly archive. Although Acronis may have been restoring files, there was very little CPU or disk activity and after an hour or so I cancelled the restore. I used the Acronis image mounting facility to mount the archive as an image and then copied the files out using Windows. This worked well and showed a data transfer rate of 5 Mb per second. Although I have nothing to compare this with, it seems a bit slow.

Removing the Acronis Secure Zone

When I attempted to remove the Acronis Secure Zone the PC simply locked up for half an hour with no feedback about progress. I held my nerve and went and had a cup of tea. Eventually I did get control back and the zone had been removed, but the software never prompted me to return the unallocated storage to the primary partition.


Acronis could argue with justification that they cannot be wholly responsible for the performance of backup and restore operations in a world where hardware and networks are so varied. However, the user interface and user feedback are appalling, and these are under their control. I think it is reasonable to assume that this careless attitude may apply to their core backup and restore functions as well.

I will now trial StorageCraft ShadowProtect.

How to install a SSL certificate on a Bitnami WordPress server



In a previous post I described how to enforce SSL; in this post I show how to replace the self-signed certificate with one from a Certificate Authority such as StartSSL.

The standard configuration of Bitnami servers is that the Apache configuration is overridden by an application configuration. The Bitnami server is provided with a self-signed certificate, and there are two copies of this: one in the application configuration directory and one in the Apache configuration directory.

I have chosen to replace both with my own certificates.

Changing the Apache Certificates

The Bitnami documentation for Apache tells you how to change your certificates at the Apache level.

Back up and replace the certificates stored here


and update the configuration file


to include the following line below the SSLCertificateKeyFile:

SSLCertificateChainFile "/opt/bitnami/apache2/conf/server-ca.crt"
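As a precaution, keep copies of the existing files before overwriting them. A sketch using a local stand-in directory; on the server the directory is the Apache conf directory shown in the path above, the file names server.crt/server.key are assumed to match the Bitnami defaults, and the commands would need sudo.

```shell
# ./apache-conf-demo stands in for the Apache conf directory on the server.
CONF=./apache-conf-demo
mkdir -p "$CONF"
echo 'old self-signed certificate' > "$CONF/server.crt"
echo 'old private key'             > "$CONF/server.key"
# Keep backups of the originals before replacing them
cp "$CONF/server.crt" "$CONF/server.crt.bak"
cp "$CONF/server.key" "$CONF/server.key.bak"
# Now copy in the CA-issued certificate (and likewise the key and chain file)
echo 'new CA-issued certificate' > "$CONF/server.crt"
```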

Changing the Application Certificates

To replace the certificates at the application level you must put them in


and update


I have used:

<VirtualHost *:443>
 ServerAlias *
 DocumentRoot "/opt/bitnami/apps/wordpress/htdocs"
 SSLEngine on
 SSLCertificateFile "/opt/bitnami/apps/wordpress/conf/certs/server.crt"
 SSLCertificateKeyFile "/opt/bitnami/apps/wordpress/conf/certs/server.key"
 SSLCertificateChainFile "/opt/bitnami/apps/wordpress/conf/certs/server-ca.crt"
 Include "/opt/bitnami/apps/wordpress/conf/httpd-app.conf"
</VirtualHost>

and restart the Apache server

$ sudo /opt/bitnami/ restart apache

Note on using StartSSL Certificates

The SSLCertificateChainFile is

ca.crt is not required.

How to enforce SSL on a Bitnami WordPress server



Bitnami servers come with Apache and WordPress SSL-enabled with a self-signed certificate. I will address replacing this certificate in a later post.

As I understand it, Bitnami configures the AWS cloud image so that the application configuration overrides the Apache configuration. This is described in their documentation.

The relevant section is


This will not work because it is overridden by the WordPress configuration, as highlighted. Instead you must follow these instructions:

Recent versions of BitNami apps ship three configuration files in the “/installdir/apps/myapp/conf/” folder: httpd-app.conf, httpd-prefix.conf and httpd-vhosts.conf.


After making the required change to the application configuration file you must restart the server.

$ sudo /opt/bitnami/ restart apache

Free SSL Certificates from StartSSL



Whenever I have to replace a server certificate I curse, because I have to pay so much for so little. The whole industry is a conspiracy against the small businessman.

In the past I have purchased certificates from Verisign (very expensive), Comodo (not recognised on Android a few years ago) and most recently Digicert. However, the cost of Digicert certificates is still high.

I read about StartSSL, who have broken ranks and issue simple certificates for a single domain and one subdomain for free.

I created a certificate for and

and installed it into Apache on my Bitnami Ubuntu Server running on AWS and it ran fine.

The StartSSL website and its user interface are designed for techies by techies. You need to understand the concepts involved in creating and installing certificates. Where you see disparaging comments about StartSSL online, I suspect that these have been made by occasional users. Indeed, I suspect the majority of the cost of issuing a certificate is supporting clueless users.

StartSSL also sell a range of higher-value certificates at reasonable prices.

I strongly recommend StartSSL’s services for technical users.

