Improving frontend web performance using RequireJS optimiser

By Richard Powell, Frontend Engineer at Server Density.
Published on the 27th June, 2013.

Performance is important. It’s so important, in fact, that both Google and Bing agree that slow pages lose users. But where does performance come from?

“80% of the end-user response time is spent on the front-end. Most of this time is tied up in downloading all the components in a page.” — Best Practices for Speeding Up Your Web Site, Yahoo

And that’s why, when we started analysing the performance of Server Density v2, the quickest win by far was to reduce the number of HTTP requests RequireJS was making. We did that using the RequireJS optimiser. This article looks at that journey, specifically:

  • How we broke down the problem
  • How and why we used RequireJS & the problems it introduced
  • How we used the RequireJS Optimiser to solve these problems
  • The challenges we encountered
  • The effect on performance

HTTP Requests: The Breakdown

To understand that we had a request problem we looked at the performance of the application using Chrome’s Network panel and console.time(). Both tools helped us understand the application’s performance, but by far the most illuminating evidence came from the Network panel. It showed us how many requests were made and let us filter them by type:

A Network Graph showing requests for modules

A Network graph showing 83 requests for templates

Here we can see that 102 requests are made to load the application’s JavaScript (each file ending in .js) and 82 requests are made to load our Mustache.js templates (files ending in .html). These files are all requested by RequireJS, but how does this compare to the number of requests for files like images and CSS?

Graphs showing percentages of requests

Out of approximately 210 requests (enough to make any front-end developer’s eyes water), RequireJS is responsible for around 89% of them.

Yahoo, Google and O’Reilly have all written good articles explaining why requests are expensive. But if we take it as read that it’s best to avoid them, the question must be asked: why use RequireJS at all? Why organise code in a way that is inherently expensive to the performance of an application? The first answer is that once we run the code through the optimiser all those requests won’t exist, but let’s look at that question in more detail now.

Managing Dependencies With RequireJS

As a modern web application, Server Density v2 uses the Backbone stack (Backbone, Underscore & jQuery) and RequireJS to manage dependencies. Backbone handles our models, collections and views, jQuery handles our DOM interactions, but it’s RequireJS that glues it all together at an architectural level. For those that aren’t familiar with this tool:

“RequireJS is a JavaScript file and module loader. It is optimized for in-browser use, but it can be used in other JavaScript environments, like Rhino and Node.” — The RequireJS website

This means you group your JavaScript (or in our case CoffeeScript) into modules. Each module is a block of code that declares its own dependencies before returning its functionality. Below is an example from Server Density v2 (you can paste the CoffeeScript into js2coffee if you wish):

define (require) ->

    Backbone = require('backbone')
    PluginPostModel = require('models/plugin-post')
    App = require('app')

    class PluginPosts extends Backbone.Collection

        model: PluginPostModel
        urlRoot: App.getHost('alerts/plugins')

    return PluginPosts
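
For readers who don’t write CoffeeScript, the module above compiles to an AMD wrapper along these lines (a rough sketch of the generated JavaScript, not our actual build output):

```javascript
// Roughly what the CoffeeScript module above compiles to. The define()
// wrapper is what lets RequireJS (and later its optimiser) trace the
// three require() calls as dependencies of this module.
define(function(require) {
    var Backbone = require('backbone');
    var PluginPostModel = require('models/plugin-post');
    var App = require('app');

    var PluginPosts = Backbone.Collection.extend({
        model: PluginPostModel,
        urlRoot: App.getHost('alerts/plugins')
    });

    return PluginPosts;
});
```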

In the example above we define a collection module that declares three dependencies: Backbone, a model and our application class. Our application can now use the collection like so:

define (require) ->

    $   = require('jquery')
    _   = require('underscore')
    Backbone = require('backbone')
    PluginPosts = require('collections/plugin-posts')

    class DeviceView extends Backbone.View

        initialize: ->
            @pluginPosts = new PluginPosts

        listenForEvents: ->
            @listenTo @pluginPosts, "loading:start loading:end", () => @toggleLoadingSlate()

    return DeviceView

Here the DeviceView uses the collection from the first example, which resolves its own dependencies. Using RequireJS allows any part of the application to be loaded in any order, without us having to worry about whether a dependency like jQuery or a specific model has been loaded first. This makes dependency management much easier: we don’t have to think about dependencies at an application level, just at a module level.
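Bootstrapping works the same way: a single top-level require() call kicks the application off and RequireJS walks the rest of the dependency tree on its own. A minimal sketch (the module path is illustrative):

```javascript
// Sketch of an application entry point. RequireJS fetches the view,
// sees that it requires collections/plugin-posts, which in turn requires
// models/plugin-post and app, and loads each of them before running
// this callback. No module has to know about the loading order.
require(['views/device'], function(DeviceView) {
    var view = new DeviceView();
    view.render();
});
```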

RequireJS: The Sting in the tail

As we’ve already seen, RequireJS has a sting in its tail: the number of requests it makes. Even though it loads modules asynchronously, the requests soon add up. In our simplified example above there are still six requests:

  • jQuery
  • Underscore
  • Backbone
  • The PluginPosts collection, which in turn requires:
  • The PluginPostModel
  • The application class

If we scale this simple example up to an application the size and complexity of Server Density v2, we soon have around 200 requests. Thankfully we can reduce these without too much hassle.

The RequireJS optimiser

The RequireJS optimiser is a tool that traces the dependencies of RequireJS modules so that they can be concatenated and minified. It means developers can manage dependencies in a way that’s familiar to them without incurring excess requests; the optimiser just becomes an additional step before deployment.

In Server Density v2 we use Node.js to run the RequireJS optimiser over two separate build files before we deploy the application. One build file defines how to optimise our JavaScript, the other defines how to optimise our stylesheets. Both build files live in a build folder alongside the optimiser script, and both output to a separate distribution folder. The final piece of the jigsaw is a few conditional checks in our index and bootstrap files to ensure users on production download the optimised code. Our JavaScript build file looks like this:


({
    // This tells the RequireJS optimiser where to find our application code
    baseUrl: '../app/compiled/',

    // This tells the optimiser where our application starts. Normally this
    // file loads all the dependencies itself, but here it becomes the entry
    // point for the optimised file
    name: 'bootstrap',

    // This file contains all the paths to the modules and libraries that our
    // application uses. The optimiser uses it to resolve every dependency in
    // the application and concatenate them all into one file. Alternatively
    // you can configure the modules in the build file using paths, shim etc.
    mainConfigFile: '../app/compiled/require-config.js',

    // This is where the optimised file is built to. It becomes the single
    // file that includes all the above modules and libraries
    out: '../distribution/bootstrap-built.js',

    // Without this the optimiser will not look for modules or libraries used
    // within other modules, and the optimised application will still rely on
    // HTTP requests to load them
    findNestedDependencies: true,

    // If we set up a path to the require lib, we can include it as part of
    // the build and avoid a separate HTTP request
    paths: {
        requireLib: '../assets/js/libs/require/require-2.1.5.min'
    },
    include: 'requireLib',

    // Specify which minifier to use and any minifier-specific configuration.
    // We chose not to mangle the output to avoid problems with variable
    // names and the Rickshaw library
    optimize: 'uglify2',
    uglify2: {
        mangle: false
    }
})


Each configuration option is commented to explain what it does and why we use it. If you’d like to see more of the available options then I recommend the sample build file. The important bit to understand is that mainConfigFile defines most of our modules and name is the entry point for our application. The second build file minifies and concatenates our CSS:


({
    cssIn: '../app/assets/css/site.css',
    out: '../distribution/assets/css/site.css',
    optimizeCss: 'standard'
})


We run the requireJS optimiser using both these build files with the following commands:

$ node r.js -o build-js.js
$ node r.js -o build-css.js

The result is two files: one containing all our application’s JavaScript, the other containing all our application’s CSS.
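The conditional check mentioned earlier can be as simple as branching on the environment when the script tag is written. A sketch (the IS_PRODUCTION flag and paths are illustrative, not our actual bootstrap):

```javascript
// In development, load require.js and let data-main pull in the unbuilt
// entry point, so every module arrives as its own request and stays
// debuggable. In production, load the single optimised bundle, which
// already contains require.js via the requireLib include in the build
// file above.
var script = document.createElement('script');
if (window.IS_PRODUCTION) {
    script.src = '/distribution/bootstrap-built.js';
} else {
    script.src = '/assets/js/libs/require/require-2.1.5.min.js';
    script.setAttribute('data-main', 'app/compiled/bootstrap');
}
document.head.appendChild(script);
```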

The problems we encountered

Adding the RequireJS optimiser to our build process was not without its pitfalls. For the most part the problems were minor, but I’ll document them here as it may save you some time:

  • Absolute URLs: Initially all the dependency URLs in our mainConfigFile were absolute. This created enough problems relating to baseUrl that it was easier to use relative URLs in this file. Fortunately we could still use relative URLs outside of this file for our module definitions.
  • Incompatible configuration options: Initially we tried to use just one build file. We found that the configuration options we needed to optimise our CSS were not compatible with the options needed to optimise our JavaScript. Creating two build files was the logical way around this.
  • Aggressive minification: Initially, we found that the default minification options were a little too aggressive for Rickshaw, the graphing library we use. This led to unpredictable JavaScript errors. Changing to uglify2 with mangle set to false solved this problem, at the unfortunate cost of some file size. (Update: this issue is now fixed in Rickshaw.)
  • Window-specific variables: Our mainConfigFile contains some lines that are specific to the browser environment. The RequireJS optimiser runs in the Node environment, meaning those lines throw an error. We solved this by moving browser-specific code into a second require.config call, as detailed here. You could also solve this by moving the contents of mainConfigFile into the build file, but that seemed like a duplication of effort to us.
  • CSS minification: We use a font file in our CSS to manage our icons. The CSS minifier was mangling these declarations, which was easily fixed by using character codes in our CSS rather than the symbols those codes represent.
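The config split described in the window-specific variables point above looks something like this (the settings shown are illustrative, not our actual configuration):

```javascript
// require-config.js: environment-neutral settings only, so the
// optimiser can evaluate this file in Node without touching window.
require.config({
    paths: {
        jquery: 'libs/jquery',
        underscore: 'libs/underscore',
        backbone: 'libs/backbone'
    }
});

// A second require.config call, made only from the browser bootstrap
// (which the optimiser never evaluates), holds anything that depends
// on window.
require.config({
    config: {
        app: {
            host: window.location.protocol + '//' + window.location.host
        }
    }
});
```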

Before & After

So, was it worth it? Well, let’s compare the number of requests before and after.

            RequireJS   CSS   Images   Other   Total
Before      185         11    7        11      214
After       3           1     7        11      22
Reduction   98%         91%   0%       0%      89%

The key statistic here is an 89% reduction in the number of requests made when the page loads. We can also expect a reduction in the number of requests after a user interacts with the page: those extra dependencies no longer need to be resolved, as they are concatenated into the single file the user downloads. On top of these impressive stats we’ve received some great feedback from users:

I must say V2 is a pleasure to use and I really admire the clean and fast UI. Keep it up! – Geoff Wagstaff of GoSquared

So now Server Density makes only 22 requests when the page loads, which is much lower than average. Furthermore, users can tell the difference: feedback before this process was that v2 could be faster; now the feedback is that it’s fast.


The RequireJS optimiser is not the only tool we’ve used to reduce requests: we’ve also combined images into our style sheets, used font icons and optimised each page to make as few API calls as possible. It won’t be the last either: we’ll be monitoring and improving performance regularly. But the steps we have already taken with the RequireJS optimiser demonstrate the importance of understanding performance bottlenecks and focusing on the correct areas of optimisation. It can be tempting to focus on complex problems such as the speed at which loops execute or how fast a jQuery selector is, when the actual performance bottlenecks are much less exciting.

The use of requireJS and its optimiser also demonstrates that the tools we use often have pros and cons. RequireJS is great for making dependencies easier to manage, but it costs requests. Thankfully the optimiser exists, and I hope this article has explained why it’s an essential partner for requireJS.

