Homepage Optimisation

After redesigning my site recently, I realised that loading a bunch of images and text client-side increases the loading time, as the browser has to make multiple requests, including for external CSS and JS/JSON files. Not only that, but those files were uploaded exactly as I wrote them: full of whitespace and long variable names.

Smush the text

For a quick improvement in loading times, simply run your CSS and JS files through a minifier to get smaller versions of them! This doesn't reduce your page size by much, but if you consistently get a large amount of traffic the savings soon add up!
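In practice you'd use an established minifier (terser for javascript, or clean-css/cssnano for CSS, for example), but the core idea can be sketched in a few lines. The function below is a naive whitespace-and-comment stripper, here only to show where the byte savings come from; it is not safe for production use, since real minifiers actually parse the code:

```javascript
// Naive CSS "minifier" sketch: strips comments and collapses whitespace.
// Real minifiers (clean-css, cssnano, etc.) parse the stylesheet properly;
// this only illustrates where the byte savings come from.
function naiveMinifyCss(css) {
    return css
        .replace(/\/\*[\s\S]*?\*\//g, "")  // drop /* ... */ comments
        .replace(/\s+/g, " ")              // collapse runs of whitespace
        .replace(/\s*([{}:;,])\s*/g, "$1") // no spaces around punctuation
        .trim();
}

var input = "body {\n    margin: 0; /* reset */\n    color: #333;\n}\n";
var output = naiveMinifyCss(input);
console.log(output); // body{margin:0;color:#333;}
console.log(input.length + " -> " + output.length + " bytes");
```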

File        Original Size (bytes)    Minified Size (bytes)
style.css   15,133                   12,102
main.js     2,647                    1,483
data.json   8,266                    6,633
Total       26,046                   20,218

That's a saving of nearly 6KB, or a decrease of 22%, on every single page load for every visitor!
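To spell out the arithmetic, using the totals from the table above:

```javascript
// Savings implied by the table above: total bytes before and after minifying.
var original = 26046;
var minified = 20218;
var saved = original - minified;                     // 5,828 bytes, just under 6KB
var percent = Math.round((saved / original) * 100);  // 22
console.log(saved + " bytes saved (" + percent + "% smaller)");
```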

Smush the images

Now, of course, we can and will go one step further. If your webpage is full of images like mine, they need to be dealt with ASAP. Images are usually the largest part of your website, and if you're naively uploading them exactly as they were created, you could be wasting more disk space and bandwidth than you'd believe.

For PNGs containing mostly solid graphics, you can safely use services like TinyPNG to reduce the filesize for you, or use RIOT and get the best results yourself.

For large photographic images, you're best off first resizing the image to something more manageable, say 1920 pixels wide (which could still be considered too wide; it really depends on your site's content). This covers most desktop screens and HiDPI mobile screens, keeping things nice and crisp. Once resized, reduce the quality from what is probably 80-90% to something like 50-60%. If you see any horrible, visible artefacts, increase the quality until you find a good compromise between filesize and image quality.


In my example, I have 14 images of 640x640, making up a grid of album covers. The average filesize for each of these is only 27KB, a tiny amount compared to most mainstream sites, and you can't even tell they've been compressed just by looking at them!

My previous site made use of a CMS which would generate thumbnails for me. Great. While these were only 400x400 pixels, the average filesize was 88KB, so you can save a lot of data (a whopping 69% decrease, with larger-resolution images, in my case) by smushing the images yourself!

Mobile users will thank you for doing this. It's likely they have a bandwidth cap, just as you might, so it's a win-win situation!

Internalise Everything!

It is well known that loading CSS, JS, fonts, images, etc. from outside the main document slows things down. Obviously, you don't want to copy and paste the same CSS into every new document you create, but there is a solution!

If your webserver runs PHP, the following example saves users from making multiple requests to your server to load a single page. Snippet 1 is the standard way, loading the CSS as you normally would; snippet 2 applies exactly the same stylesheet to your document while making only one request (from the user's point of view):

// Snippet 1
    <link rel="stylesheet" type="text/css" href="style.css">

// Snippet 2
    <style type="text/css"><?php echo file_get_contents("style.css"); ?></style>

The same technique can be applied to javascript files:

<script type="text/javascript"><?php echo file_get_contents("script.js"); ?></script>

Remember: there are multiple ways of achieving the same thing; this example just happens to use PHP. You could get the same effect using Node.JS if you wanted to.

Server-side vs Client-side Data Loading

Where necessary, your site's main content might be stored in a JSON file for quick, dynamic updates. I designed my website like this so I can easily add new content in a standardised format, and can later reuse the data should I want to redesign the site again or use the same data somewhere else.

When I started designing my site, I had everything load on the client-side, which meant a lot of DOM manipulation happening as the page loaded. This is great for content that changes frequently, but in my case (and perhaps yours too) the content very rarely changes. As a result, I found I could increase the page load speed by creating these elements on the server before sending the resulting page to the client.

This can be done very easily with libraries like Express on Node.JS, or simply by using PHP if that's all your service supports.
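As a sketch of what that server-side step looks like in plain javascript: the function below turns a JSON array into an HTML string before the page is sent, instead of building the DOM in the browser. The album data shape (title/image fields) is invented for illustration, and real code should also HTML-escape the values:

```javascript
// Server-side rendering sketch: turn a JSON array into HTML on the server,
// instead of manipulating the DOM in the browser as the page loads.
// The data shape (title/image fields) is invented for illustration;
// production code should HTML-escape these values.
function renderAlbumGrid(albums) {
    return albums.map(function (album) {
        return '<figure><img src="' + album.image + '" alt="' + album.title + '">' +
               '<figcaption>' + album.title + '</figcaption></figure>';
    }).join("");
}

var data = [
    { title: "First Album", image: "first.jpg" },
    { title: "Second Album", image: "second.jpg" }
];
console.log(renderAlbumGrid(data));
```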

Load IFrames Later

If your webpage contains iframes for whatever reason, loading them after the page has finished downloading and rendering will greatly decrease the time it takes to actually render the page, as the browser doesn't have to download those slow, comparatively unimportant resources at the same time.

Where your existing iframes are, replace the src attribute with data-src. This prevents the iframe from loading at first, but we'll add some javascript to the end of the document that loads in the iframe when necessary.

// From this:
<iframe src="https://www.youtube.com/embed/dQw4w9WgXcQ"></iframe>

// To this:
<iframe data-src="https://www.youtube.com/embed/dQw4w9WgXcQ"></iframe>

The simple script below waits until the page has loaded, then finds every iframe in the document and copies its data-src attribute into src, restoring what we started with:

window.onload = function() {
    var iframes = document.querySelectorAll("iframe[data-src]");
    for (var i = 0; i < iframes.length; i++) {
        iframes[i].src = iframes[i].dataset.src;
    }
};

Remove anything unnecessary

This should be at the top of this article, shouldn't it? If we're talking about optimising your website, the perfect optimisation method would be to just remove things you don't need!

Strip down your HTML until you've got the bare minimum required to display the information your viewers want to see. You probably don't need jQuery, and you could probably write a lightbox plugin yourself in a couple of kilobytes. I know there's no need to reinvent the wheel, but most existing solutions aim to please as wide an audience as possible, so they'll have features you're never going to touch, inevitably slowing your page down.


Use Cloudflare wisely

Cloudflare is a blessing and a curse. It will automatically minify your site, cache it, and protect it from DDoS attacks, and its DNS is incredibly fast to update. Definitely use it if you can. However, some of its features will, on most small sites, likely slow things down.

For example, one of their features is Rocket Loader, which according to their site optimises your javascript to run more efficiently. Look closely, however, and you'll see this script is around 80KB in size; when you've only got 10KB of javascript in the first place, you're loading far more than is required, defeating the purpose of most of the previous steps!


Summary

This article can be summed up in these steps to get the best performance out of your website. It's very likely you won't need to consider something like AMP if you follow them!

  1. Minify the CSS/JS/JSON/etc.
  2. Resize / compress the images
  3. Instead of making the user load CSS & JS externally, prepare them internally using a preprocessor such as PHP
  4. Consider also preparing the page content on the server's side rather than the client's side
  5. Load in iframes after the page has finished loading, to stop them from hogging the main thread when it's needed most
  6. Don't include things you don't need...
  7. Use Cloudflare, but wisely


