The Perfect Blogging Workflow

I periodically talk about this site and how it is run. Well, after nine months my baby has gotten an upgrade. I’ll try to detail the programs I use to render this site thoroughly enough that the steps can be followed, without being too tedious. Before getting started, it should be noted that I have a test web server and a machine where the web site is built. These are two different machines, and files from the development machine have to be copied to the test web server.

Let’s start with the basics. I still use Jekyll, the static blog generator, to turn my writing into web pages and to give the site its form. The most tedious part of Jekyll is setting up how the site is to be generated. I have Jekyll build the site to _site, which allows me to view the majority of the web site unchanged. Any dynamic part of the site relying on the back-end written in WCF, like comments, isn’t rendered correctly, but the impact is minimal.
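For reference, the relevant Jekyll setting is a single line in _config.yml; the destination key is standard Jekyll configuration (and _site happens to be its default output directory anyway):

```yaml
# _config.yml (excerpt)
# Build the rendered site into _site so it can be previewed as-is.
destination: _site
```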

Having Jekyll solves the biggest problem of getting a blog up and running, but there is still plenty of automation missing. If there is any amount of JavaScript in the project (I say ‘any’ because there is a minimal amount of JavaScript on this site and the following advice is still useful), then one will want minification and JSHint. JSHint is for JavaScript error detection, and minification reduces load times. The minification is a courtesy to those who do not have a blazing fast internet connection or a large data plan. On the styling side of the website, I use LESS, which compiles into CSS and is then minified. The banners that are prevalent at the top of recent blog posts aren’t stored in the web repository at the size they appear. These images are served at exactly the right size, thereby reducing wait time and the CPU cycles spent resizing the image. The site is then copied from the Jekyll server to the web server, and if it looks good, I push the website live using FTP.

I could manually invoke the tools necessary to achieve the desired result, or improve on that with some sort of Bash script, which I actually wrote and used for quite a while. The problem with this approach arises if the development machine changes. If one remembers what versions of the tools were used, then there isn’t a problem; however, if one forgets, then new versions must be downloaded, and that could lead to dependency issues (true story).
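To make ‘minification’ concrete, here is a toy illustration of the idea: strip comments and collapse whitespace. Real minifiers like UglifyJS or Closure actually parse the code and go much further, renaming variables and rewriting expressions, so treat this only as a sketch of the concept:

```javascript
// Toy minifier: strips // comments and collapses runs of whitespace.
// Real tools (UglifyJS, Closure Compiler) parse the source; this is only
// an illustration of why minified files are smaller.
function toyMinify(source) {
  return source
    .replace(/\/\/[^\n]*/g, "") // drop line comments
    .replace(/\s+/g, " ")       // collapse whitespace runs to one space
    .trim();
}

const input = "var x = 1; // counter\nvar y  =  2;";
console.log(toyMinify(input)); // "var x = 1; var y = 2;"
```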

What fixes this? Node. Those familiar with just the name and its recent rise might groan. Indeed, when I told my boss at my first job that the web application I had developed needed Node to be able to ‘compile’, he became skeptical, doubtful, and had to be reassured that the application wasn’t running on top of Node. It is only needed for development purposes, so don’t worry. I know it may seem overkill to download and install server software just to achieve a better workflow, but the end result is worth it. Trust me, no one hates software more than software engineers, so coming from me, you know that if the end result weren’t good, I wouldn’t have converted my entire website to the new workflow.
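Concretely, Node’s role here is just to record and install the build tools through a package.json manifest; pinning versions in devDependencies is what prevents the dependency surprises described above. A sketch (the package names are real Grunt plugins, but the version numbers are illustrative, not the ones this site actually pins):

```json
{
  "name": "blog-build",
  "private": true,
  "devDependencies": {
    "grunt": "~0.4.5",
    "grunt-contrib-jshint": "~0.10.0",
    "grunt-contrib-less": "~0.11.0",
    "grunt-contrib-uglify": "~0.5.0"
  }
}
```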

How does Node fix this problem? Grunt. Grunt is an automation tool that runs on top of Node and has allowed me to throw away the Bash script in favor of a method where transferring development to a new machine involves a simple dependency resolution of npm install --no-bin-links. The --no-bin-links flag is needed because the development machine is a virtual machine. Grunt is the solution to all the problems posed above. I am still amazed that I can point Grunt at a directory of images on the Jekyll server and resize them so that their widths are all exactly 720 pixels. This means source control keeps the higher quality pictures, just in case I change the width of my blog in the future. Resizing images and automating FTP deployment are the two biggest advantages I found with Grunt. Basically everything about this site is now automated: all but the writing!
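The resizing arithmetic itself is simple: fix the width at 720 and scale the height to preserve the aspect ratio. A small sketch of that calculation (the function name is mine, not part of any Grunt plugin):

```javascript
// Compute output dimensions for resizing an image to a fixed width
// while preserving its aspect ratio, i.e. the arithmetic behind
// resizing every banner to exactly 720 pixels wide.
function scaleToWidth(width, height, targetWidth) {
  const scale = targetWidth / width;
  return { width: targetWidth, height: Math.round(height * scale) };
}

// A 1440x900 source banner becomes 720x450.
console.log(scaleToWidth(1440, 900, 720)); // { width: 720, height: 450 }
```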

Here is the workflow: lint the JavaScript with JSHint, minify it, compile the LESS to CSS and minify that, resize the banner images to 720 pixels wide, build the site with Jekyll, copy it to the test web server, and, once everything looks good, deploy over FTP.
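That workflow can be expressed as a Gruntfile along these lines. The task names come from real Grunt plugins (grunt-contrib-jshint, grunt-contrib-less, grunt-contrib-uglify, grunt-responsive-images, grunt-ftp-deploy), but the paths and option values are illustrative assumptions, not this site’s actual configuration:

```javascript
// Gruntfile.js -- illustrative sketch, not the site's real configuration.
// Plugin names are real; paths and option values are assumptions.
module.exports = function (grunt) {
  grunt.initConfig({
    jshint: { all: ["js/**/*.js"] },                          // JS error detection
    less: { site: { files: { "css/style.css": "css/style.less" } } },
    uglify: { site: { files: { "js/site.min.js": ["js/site.js"] } } },
    responsive_images: {                                      // resize banners to 720px wide
      banners: {
        options: { sizes: [{ width: 720 }] },
        files: [{ expand: true, src: ["img/banners/*.jpg"], dest: "img/out/" }]
      }
    },
    "ftp-deploy": {                                           // push the built site live
      live: { auth: { host: "example.com", port: 21 }, src: "_site", dest: "/" }
    }
  });

  grunt.loadNpmTasks("grunt-contrib-jshint");
  grunt.loadNpmTasks("grunt-contrib-less");
  grunt.loadNpmTasks("grunt-contrib-uglify");
  grunt.loadNpmTasks("grunt-responsive-images");
  grunt.loadNpmTasks("grunt-ftp-deploy");

  grunt.registerTask("build", ["jshint", "less", "uglify", "responsive_images"]);
  grunt.registerTask("deploy", ["build", "ftp-deploy"]);
};
```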

For minification, I must mention that I have moved away from Google’s Closure Compiler and instead use UglifyJS via the RequireJS Optimizer. The main reason is compilation speed. As shown here, Closure is, on average, five times slower than UglifyJS, and if I’m going to be frequently testing my site, I don’t want to sit around waiting.
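Switching minifiers is a one-line change in the RequireJS Optimizer’s build profile: the optimize option accepts "uglify" (its default) or "closure". The file paths below are hypothetical:

```javascript
// build.js -- RequireJS Optimizer build profile (paths are hypothetical).
({
  baseUrl: "js",
  name: "main",
  out: "js/main.min.js",
  optimize: "uglify" // "closure" would use Closure Compiler instead
})
```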

Comments

If you'd like to leave a comment, please email [email protected]