Aug 31 2020

Light as a Feather: Improving CSS Page Weight

Performance is a cornerstone of software that tends to be left behind. It’s easy to get distracted while pushing through countless features. Let me be clear about something though: performance is a feature too! Not only is it a feature, but it also leads to a great user experience. The UX of this blog will hands-down beat the UX of any ad-bloated recipe site.

Web performance falls under two categories:

  1. algorithmic complexity
  2. filesize * latency

Many JavaScript-heavy applications ought to be concerned with both #1 and #2. However, since my blog is static, I don’t have to concern myself with algorithmic complexity. This allows me to focus solely on the page weight of the website. And while I could lower latencies through a CDN, I will put that on the todo list for another day.

For too long I’ve been putting performance of this website on the backburner. This made sense in the beginning when I just wanted to get something shipped. Now that there is a good chunk of content out, I can begin refining some of the rough edges.

Let’s first start with one of the things that I did right. This website used to ship with Tachyons via CDN. Why was this a great choice? I’m glad you asked. You see, I had forgotten to configure Nginx to gzip CSS responses. So had I hosted my own copy of Tachyons, I would have been sending 74kB instead of the 10kB gzipped equivalent. Plus, most of those styles were completely unused!

With this being said, let’s start with the quick wins. There’s nothing quicker than gzipping your responses. We just need to slightly modify the nginx config:

server {
  # ...
  gzip on;
  gzip_types      text/css;
  gunzip on;
}

Now we have set ourselves up for success when we ship our own CSS stylesheet. This change alone decreased the size of the homepage by about 7kB.
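If you want a rough sense of what gzip buys you before touching the server, here is a quick local experiment. It synthesizes some repetitive utility-class CSS as a stand-in for Tachyons (the class names are made up for illustration) and compares raw versus compressed byte counts:

```shell
# Synthesize repetitive utility-class CSS, similar in spirit to Tachyons.
css=$(for i in $(seq 1 200); do echo ".mt$i { margin-top: ${i}px; }"; done)

# Compare raw vs gzipped sizes (tr strips wc's leading spaces on BSD).
raw=$(printf '%s' "$css" | wc -c | tr -d ' ')
zipped=$(printf '%s' "$css" | gzip -9 | wc -c | tr -d ' ')
echo "raw: ${raw}B, gzipped: ${zipped}B"
```

Repetitive CSS compresses extremely well, which is why the gzip win is so dramatic. To confirm the live server is actually compressing, something like `curl -sI -H 'Accept-Encoding: gzip' <your-css-url> | grep -i content-encoding` should report gzip.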

If you are looking to optimize CSS on your own website, we need to begin with a little philosophy. The sharpest thinker on CSS philosophy I know of is Adam Morse. Go ahead and read his very insightful post about keeping it DRY. In fact, all of his blog articles are high quality and worth reading. Scaling CSS is an uphill battle without the mindset of functional, atomic CSS.

Now that you have a background on CSS style reuse, we can jump into removing unused styles. The last thing you want to do is give somebody a resource that they didn’t ask for. So instead of shipping the entire Tachyons library to my users, I will need to pluck out only the styles in use. With the blessing of open source, I can utilize PurgeCSS and uglifycss. I am also going to download a copy of Tachyons and replace the CDN.
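For reference, a sketch of the setup this requires. Both tools are published on npm; the Tachyons download URL below is an assumption (unpkg mirrors npm packages), so grab your copy however you prefer:

```shell
# Install the two CSS tools globally (package names from the npm registry).
npm install --global purgecss uglifycss

# Download a local copy of Tachyons to replace the CDN link.
# The exact unpkg path may vary between Tachyons versions.
curl -o assets/tachyons.css https://unpkg.com/tachyons/css/tachyons.css
```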

uglifycss will minify the CSS content, while PurgeCSS will remove any unused styles. The bash command (not gulp, webpack, or brunch!) that I use for generating the CSS is:

purgecss --css assets/tachyons.css --content _site/**/*.html,_site/*.html \
  | ./bin/ \
  | uglifycss > assets/styles.min.css

The second stage of the pipeline is a quick and dirty script that aggregates all of the CSS content from the JSON output of PurgeCSS:

#!/usr/bin/env python3

# Read PurgeCSS's JSON output from stdin and concatenate the "css" fields.
import json
import sys

parsed = json.load(sys.stdin)

css = ''
for entry in parsed:
    css += entry['css']

# Write the aggregated CSS to stdout so it can be piped into uglifycss.
sys.stdout.write(css)

Now we have a small CSS file containing only what we need! I can go ahead and update all of the HTML templates to use styles.min.css. Now here is the drum roll… What could have been 74kB through shipping a non-gzipped Tachyons library… is now…

2.1kB! A 97% reduction.
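For the skeptical, the arithmetic checks out (a one-liner, purely to double-check the percentage):

```shell
# (1 - 2.1/74) * 100 ≈ 97% reduction in stylesheet size.
echo "74 2.1" | awk '{ printf "%.0f%%\n", (1 - $2/$1) * 100 }'
# prints: 97%
```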

What’s next on the docket for performance improvements? It looks like Google Analytics is consuming 52.8kB, which is 70% of the current page weight. With this in mind, I plan on either cutting out analytics or finding a lighter solution.