If you’ve worked in an agency that has its processes sorted, you’ll have come across some sort of automated or well-managed deployment system. It’s definitely a step up from manually uploading files via FTP (or even editing them directly on the server via FTP). Whilst that makes my toes curl, we all had to start somewhere.
From manually uploading files, the next step is to start working with version control systems (VCSs). SVN and Git are two very well-known VCSs which are used to save all your changes to a repository. Some use this as their deployment method, simply cloning the repository on the server and pulling (generally from a staging or production branch) directly via the command line. This takes away the annoyance of having to upload all your files by only deploying changed files, but it’s still not the right way to go about it.
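To make the pull-based approach concrete, here is a local simulation of it — all the paths, branch names and repository locations below are hypothetical stand-ins, with a bare repository playing the part of the hosted remote and a second clone playing the part of the web root on the server:

```shell
set -e
rm -rf /tmp/deploy-demo && mkdir -p /tmp/deploy-demo && cd /tmp/deploy-demo

# "origin.git" stands in for the hosted repository
git init --bare origin.git

# A developer clones it, commits, and pushes a production branch
git clone origin.git work
cd work
git config user.email dev@example.com
git config user.name dev
echo 'v1' > index.html
git add index.html
git commit -m 'first release'
git push origin HEAD:production
cd ..

# The "server" is just another clone of the production branch;
# deploying a new release is simply a matter of pulling
git clone -b production origin.git server

# A new commit lands upstream...
cd work
echo 'v2' > index.html
git commit -am 'second release'
git push origin HEAD:production

# ...and the "server" deploys it by pulling
cd ../server
git pull origin production
```

It works, but the repository metadata lives on the server, there are no release snapshots to roll back to, and a failed pull can leave the site in a half-updated state — hence the appeal of the more structured systems below.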
There are third-party solutions out there such as DeployHQ and DeployBot (formerly known as Dploy), both of which are great tools that enable you to set up deployments via a hook on a specific branch or by manually deploying (either by adding some text to the commit message or pushing the ‘deploy’ button on their website). Personally, I have used both of these systems and found DeployBot has much more to offer as a deployment service; however, there are still a few bottlenecks (one of which is described in this post).
All of these systems use SFTP/FTP to upload files to the server. Although there are other types of deployments you can set up yourself (such as SSH deployments), the FTP variants are the only ones that actually upload the files for you.
Something which I think these systems have missed out on is the use of a command called rsync. For more information about this command, take a look at the manual page by typing man rsync at your command line. Directly from the manual, the part we want to take advantage of is:
The rsync remote-update protocol allows rsync to transfer just the differences between two sets of files across the network connection, using an efficient checksum-search algorithm described in the technical report that accompanies this package.
How great is that?
This means if we’re deploying a file we already have on the server (chances are we do, since we’re generally updating files as well as uploading new ones), we’re only transferring the difference in the file, not the whole file itself. If we had a 90MB file and changed a few words, rsync would upload far fewer bytes than the full 90MB, which FTP cannot do. Since FTP is pretty slow anyway (other copy commands such as scp will be much faster than FTP), rsync seems like a great solution for deploying web applications.
Let’s take DeployBot’s setup as it currently is and develop from there to make the perfect deployment system. If you’re not familiar with DeployBot, take a look at their guide on deploying a Laravel app to Digital Ocean and read from ‘Configuring DeployBot’.
If we take the technology from rsync and the setup DeployBot already has, we’re heading in the right direction. One of the biggest downfalls I found using DeployBot is that it has to re-upload the entire project via FTP. On a large site, that can take quite a while. It does create revision directories and use a symlink to switch the public directory to the new revision (once it’s been deployed and is ready).
Rather than re-uploading the entire project, why not copy the previous revision directory (which will be quick on a good server – much faster than uploading everything again) and then use rsync to transfer only the differences for that particular revision?
If we were to deploy something to our server from our local version, we would use rsync like this:
rsync -avz --delete /var/www/handle/. user@server:/var/www/handle/.
Here -a preserves permissions, ownership and timestamps, -v gives verbose output, -z compresses data during transfer, and --delete removes any remote files that no longer exist locally.
With all that in mind, if DeployBot were to copy the previous release, upload changes to that directory via rsync, and continue with its use of symlinks and containers, we’d have the perfect deployment system.