The Complete Guide To Angular Load Time Optimization

In the past, I have written a lot about performance tuning Angular apps. Now it is time to dive into one of the big and important topics: optimizing Angular load time performance.

Recently, I helped a big e-commerce site with optimizing their load time, as every ms of waiting time decreases conversions. This forced me to really understand how the browser works when loading websites and in the process I developed a systematic approach to making your Angular application as fast as possible.

One of the motivations for writing this post is that there is basically only vague advice available online on how to optimize Angular apps. That is why there are so many slow Angular apps out there. The advice usually sounds something like:

  • “Just use lazy loading”
  • “Check your bundle sizes”
  • “Just use Angular Universal”
  • “Upgrade to Angular 8 and use differential loading”

This advice is, well, not that useful or practical. If you can’t advise in a step-by-step manner, your advice is not that actionable.

Since, according to Google, 40% of visitors abandon a website if it takes more than 3 seconds to load, this is a seriously important topic that we should have good and practical advice on.

Anyway, this post takes care of all of this and will be your go-to, step-by-step guide to making your Angular app’s load time world-class. The tips in this post have been used to make the first paint 6x faster on a real app, so trust me, it works.

The areas to optimize for when improving load time

After watching numerous BlinkOn videos, reading EVERY blog post on performance tuning Angular apps (yes, literally every one, even my own!), and testing all of this knowledge in practice, I have located the areas in an Angular app we need to tune to make the load time faster. They are ordered by biggest impact first:

  1. Optimizing the size of the main bundle
  2. Optimizing load of static content
  3. Optimizing load of API resources

You might be like, wait there is more! What about Angular Universal?

Angular Universal does not make the app load considerably faster compared to the previous points. In fact, it makes the time to interactive (TTI) slower because the browser first has to render the server-rendered page and then download and bootstrap the client bundles. It might give you a faster first paint as well as some SEO benefits, but it definitely has a price.

The thing that will make the biggest impact on your Angular app’s load time (of the initial route) is making the main bundle as small as possible. The time for downloading, parsing and executing JS all blows up quickly as the main bundle size increases. If you just do this and manage to get your main bundle small, then you don’t have load time problems anymore. You don’t need the next fancy performance tool to make your Angular app load fast, you just need to make the main bundle small. Using service worker caching can then make subsequent requests faster by skipping the download phase.

Back to basics: Meet the Angular doctor

If you have read my other post about performance, you know I use this universal principle when performance tuning Angular apps:

  1. Investigate what the problem is and where it is located. Create a hypothesis based on this, e.g. “moving a big library out of the main bundle will make the app load faster”
  2. Fix it by implementing the hypothesis, e.g. lazy load the big library
  3. Measure how well the performance tuning worked (using Audits/Lighthouse), e.g. measure that the app loads faster

Performance tuning is not a linear process. It is about iterating over these three steps until you have reached the desired performance level. If you want better performance, you do more iteration cycles and optimize the areas we touch upon in this post.

Optimizing the size of the main bundle

Basically, the reason everyone talks about lazy loading in a performance context is that it is all about making the main bundle as small as possible. Lazy loading helps with that: we keep the visible parts relevant to the initial route in the main bundle and lazy load everything else in separate bundles. Once the main bundle size starts to grow, performance degrades quickly, because every extra kB in the main bundle contributes to slower:

  1. Download
  2. Parsing
  3. JS execution

I don’t like to give vague shitty advice so let’s dive in and have a closer look at how bundling works in an Angular CLI app.

Get an overview of the bundles and load times

Use Webpack bundle analyzer to find out how things are bundled. We want to keep our main bundle as small as possible and lazy load everything that is not relevant to the initial route. To accomplish this we need to know how Angular CLI (using Webpack) is going to bundle our application.

The different bundles in an Angular CLI app

These are the different bundles in an Angular CLI app (ordered by load priority):

  • Runtime
  • Polyfills
  • Main
  • Common
  • Lazy loaded bundles

When the app loads, they are fetched in that order.

Runtime bundle

The runtime bundle contains the Webpack runtime, which is used to load the subsequent bundles.

Polyfills bundle

The polyfills bundle contains the polyfills and shims needed for the desired browser support. With differential loading, modern browsers don’t need to load all the polyfills, which makes this bundle smaller for them.

Main bundle

Main contains:

  • Vendor – Node modules, including Angular, that are included in main or shared between multiple bundles
  • Source code that is not lazy-loaded

Note that Angular CLI can also be configured to create a separate vendor bundle. For the most part, it is most performant (on the first load) to keep everything together in one main bundle. On the other hand, if your vendor code is huge and rarely changes while your source code changes often, returning users might benefit from having the vendor bundle cached (e.g. with service workers). This is a tradeoff between first-load and subsequent-load performance.
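
If you want to experiment with the separate vendor bundle, the CLI supports it out of the box; a minimal sketch (this can also be set permanently as the vendorChunk option under the build options in angular.json):

ng build --prod --vendor-chunk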

Common chunk bundle

The common chunk contains the code that is shared between lazy-loaded modules. Because it is only used by lazy-loaded modules, it is loaded in the background after the main bundle.

Note that Angular CLI might create multiple common bundles. Let’s say we have 5 lazy-loaded bundles and two of them reference Lodash. Then Lodash will go to a shared lazy-loaded bundle using the naming convention
*lazy-loaded-bundle1-name*+*lazy-loaded-bundle2-name*.*hash*.js. This is smart because it makes sure the code is only loaded if one of the dependent lazy bundles is loaded.

Lazy loaded bundle

Lazy-loaded bundles are bundles that are loaded on demand. They can either be loaded right after the application is initialized (using a preloading strategy) or when requested (the default behavior).

Separating over the fold from under the fold

Above the fold is what is visible to the user when the initial page loads. Since they can only see the upper part of the screen, everything else doesn’t need to be in the main bundle and can instead be lazy-loaded.

If we want to maximize our load performance we split our code:

  • Above the fold for the main bundle
  • Below the fold for the lazy loaded bundles

Maintaining main bundle size by having a main-shared.module

Now I have some next-level ninja shit that will help you tremendously with maintaining the main bundle size (while maintaining your hairline!).

In the same way that you have a shared module to be used only in lazy-loaded feature modules, the dependencies in main become easier to maintain if you simply create a shared module only for main (main-shared.module.ts).

Make sure you NEVER import the shared module in the main modules (over the fold) and ALWAYS use the main-shared module instead. That will make sure the shared source code goes to the common bundle instead of going to the main bundle and slowing down the load time for all routes.

Having a main-shared.module makes it easier to get an overview of and manage the dependencies in the main bundle, as well as keeping your code more DRY by not having to pick imports separately in every main module.
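
Here is a minimal sketch of what such a main-shared module could look like (the Angular Material modules are just hypothetical examples of dependencies needed above the fold; import MainSharedModule only from eagerly loaded modules):

import { NgModule } from '@angular/core';
import { CommonModule } from '@angular/common';
// Hypothetical examples of dependencies needed above the fold on the initial route
import { MatToolbarModule } from '@angular/material/toolbar';
import { MatButtonModule } from '@angular/material/button';

const MAIN_SHARED_MODULES = [CommonModule, MatToolbarModule, MatButtonModule];

@NgModule({
  imports: MAIN_SHARED_MODULES,
  exports: MAIN_SHARED_MODULES,
})
export class MainSharedModule {}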

Lazy load routes

We want to lazy load all routes except the initial route. Why? We want the initial route to load as fast as possible, so by not lazy loading it, we allow it to get into the main bundle which is loaded before the lazy-loaded bundles.

Luckily, Angular CLI makes lazy loading using routes easy. We just use the loadChildren property in the routes like this:

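(A minimal sketch; the route and module names are hypothetical, and on Angular versions before 8 the loadChildren value would be the string './products/products.module#ProductsModule' instead of a dynamic import.)

import { Routes } from '@angular/router';
import { HomeComponent } from './home/home.component';

const routes: Routes = [
  // The initial route stays eagerly loaded so it ends up in the main bundle
  { path: '', component: HomeComponent },
  // Everything else is lazy loaded into separate bundles
  {
    path: 'products',
    loadChildren: () => import('./products/products.module').then(m => m.ProductsModule),
  },
];
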
This will both tell Webpack to bundle the lazy-loaded module as a separate bundle and make Angular only load it when the route is requested (or via a preloading strategy).

Component level lazy loading

For lazy loading content under the fold, creating dummy routes just for the under-the-fold section would be an awkward approach. Instead, it is possible to lazy load it directly without routing.

There are two libraries for this, which are both based on Angular’s NgModuleFactoryLoader (through its SystemJsNgModuleLoader implementation) under the hood.

These are:

  1. ngx-loadable
  2. hero-loader

Let’s look at some pros and cons of these:

ngx-loadable

PROS

  • More popular on Github
  • Has more options for configuring what to show on events such as loading, loading failed and loading succeeded

CONS

  • Specifying the lazy-loaded bundle is either convention-based or done by setting a path mapping in forRoot, which adds some boilerplate

hero-loader

This library is created by Aaron Frost (originally under the name lazy-af!)

PROS

  • Simpler and less boilerplate than ngx-loadable

CONS

  • Doesn’t have the options for showing different content on the lazy loading events

For this guide, I will go with ngx-loadable, but almost the same steps would apply for hero-loader.

Lazy load under the fold

Again, optimizing the load time is all about lazy loading everything that is not shown initially to the user. We do that by extracting everything under the fold, that is, everything outside the viewport on the initial load, and lazy loading it. This gives the benefit of making the main bundle smaller as well as parsing and executing less JS on the initial load.

Let’s say we have identified our footer as a big contributor to the main bundle size and hence want to lazy load it.

First, we install ngx-loadable:

npm i ngx-loadable

Then we add it to lazyModules in angular.json so Webpack will bundle this module as a separate lazy-loaded bundle:

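(A minimal sketch of the relevant part of angular.json; the project name my-app and the footer module path are placeholders that must match your own project.)

{
  "projects": {
    "my-app": {
      "architect": {
        "build": {
          "options": {
            "lazyModules": ["src/app/footer/footer.module"]
          }
        }
      }
    }
  }
}
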
We then create a module for the footer, and make sure that it bootstraps the footer component:

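(A minimal sketch, assuming a FooterComponent lives next to the module.)

import { NgModule } from '@angular/core';
import { CommonModule } from '@angular/common';
import { FooterComponent } from './footer.component';

@NgModule({
  imports: [CommonModule],
  declarations: [FooterComponent],
  // Bootstrapping the component is what lets ngx-loadable render it once the module is loaded
  bootstrap: [FooterComponent],
})
export class FooterModule {}
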
Now we make sure the footer component is only declared in the footer module, or we will get a build error.

We can now use the ngx-loadable component to lazy load this bundle on a given condition.

First, we need to set up the module file mapping for the footer so ngx-loadable knows where to find it.

You might recognize the file string as being the same format as the one used for the string-based loadChildren property in routes.

Now, this is set up in app.module as:

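(A minimal sketch; the moduleConfigs shape follows the ngx-loadable README at the time of writing, so double-check it against the version you install. The footer path is the same one added to angular.json above.)

import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { LoadableModule } from 'ngx-loadable';

import { AppComponent } from './app.component';

@NgModule({
  imports: [
    BrowserModule,
    LoadableModule.forRoot({
      moduleConfigs: [
        {
          name: 'footer',
          // Same string format as a string-based loadChildren in the router
          loadChildren: 'src/app/footer/footer.module#FooterModule',
        },
      ],
    }),
  ],
  declarations: [AppComponent],
  bootstrap: [AppComponent],
})
export class AppModule {}
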
Now we can use the ngx-loadable component in our template to lazy load the footer:

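(A minimal sketch; the module and show inputs follow the ngx-loadable README, and showFooter is a hypothetical flag you flip when you want the footer to load, e.g. when it scrolls into view.)

<ngx-loadable module="footer" [show]="showFooter"></ngx-loadable>
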
We can now do:
npm start

and we should see the footer being bundled as a separate bundle in the build output.

Avoiding flash of invisible text (FOIT)

You can avoid text not showing while fonts load in either of two ways: using font-display: swap or preloading the fonts.

The easy way to ensure this will not impact your load performance is to use font-display: swap in your font CSS. This tells the browser to use a system font as a placeholder if the custom font is not downloaded yet and swap it in once it is ready. You can go with this if you can live with the font swap.
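
A minimal sketch of an @font-face rule with font-display (the font name and file path mirror the preload example further down):

@font-face {
  font-family: 'Zantroke';
  src: url('fonts/zantroke-webfont.woff2') format('woff2');
  /* Show a system fallback font immediately and swap in the webfont when it arrives */
  font-display: swap;
}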

Even after doing this, Lighthouse may still show a warning about the fonts.

The estimated savings it reports are not accurate, as the real savings will come from making the main bundle smaller.

Angular CLI doesn’t have a built-in way to preload fonts. Normally, you would simply add this tag to your index.html:

<link rel="preload" href="fonts/zantroke-webfont.woff2" as="font" type="font/woff2" crossorigin>

You need to make sure that the font files don’t get hashed file names when they are built (otherwise the preload href would break). You can ensure this by adding this to your angular.json:

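(A minimal sketch, assuming your fonts live in src/assets/fonts; the glob asset syntax copies them to a stable, unhashed fonts/ folder matching the preload href above.)

"assets": [
  "src/assets",
  {
    "glob": "**/*.woff2",
    "input": "src/assets/fonts",
    "output": "/fonts/"
  }
]
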
Instead of the default:

"assets": ["src/assets"]

Load scripts after Angular is initialized (not in index.html)

Unless the script is necessary for the execution of Angular, you might want to load it after the Angular app’s views have been initialized, to make the app load as fast as possible. If the script is only used in a lazy-loaded module, you should just move it there, so you get the benefit of cohesion as well as getting the script lazy-loaded.

As an example, you might only need to activate Google Tag Manager after the Angular app has loaded, and you don’t want GTM to slow down the paint time.
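
Here is a minimal sketch of deferring a third-party script until after the initial view has rendered (the script URL is just a placeholder):

import { AfterViewInit, Component } from '@angular/core';

@Component({
  selector: 'app-root',
  templateUrl: './app.component.html',
})
export class AppComponent implements AfterViewInit {
  ngAfterViewInit(): void {
    // Append the script only after Angular has rendered the initial view
    const script = document.createElement('script');
    script.src = 'https://example.com/third-party-script.js'; // placeholder URL
    script.async = true;
    document.body.appendChild(script);
  }
}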

Avoiding unnecessary big libraries in the main bundle

You might like libraries such as Lodash and Moment.js, but they are often among the biggest libraries in the app. Definitely not something you want in your main bundle if load performance is critical to you.

Getting a library out of the main bundle

As mentioned before, we simply do these steps to get a library out of the main bundle:

  1. Find out if it is needed over the fold on the initial route. If it is, we might look for a more performant alternative. If it is not, we lazy load it.
  2. Make sure it is not referenced in any eagerly loaded modules and is only referenced in the module where it is needed, so it will be lazy-loaded.

Use ES6 imports

The best practice for loading libraries is to use ES6 imports, so you can import only what you need from the library in a tree-shakable way. The last thing you want is to import the whole library when you are only using one function. E.g. lodash-es gives you a version of Lodash with tree-shakable imports, so you only pull in what you use.
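
For example (debounce is just an arbitrary function used to illustrate the difference):

// Pulls the whole library into the bundle
import * as _ from 'lodash';

// Tree-shakable: only debounce (and what it depends on) ends up in the bundle
import { debounce } from 'lodash-es';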

Only use one UI library in your Angular app

Even with tree-shakable ES6 imports, having e.g. Material, Bootstrap, and Covalent in the same Angular app will crank up your main bundle size a lot. Also, mixing all of these UI libraries tends to make for an inconsistent design. Pick one and get rid of the others. An exception can be if you need one specific part from another tree-shakable library; then sure, import it, but if it is not shown on the initial load, lazy load it.

Find more lightweight alternatives or DIY/copy-paste

If ES6 imports are not supported by the library, you might want to look for a more lightweight alternative. If you only need one method from a library, you could just implement it yourself or copy-paste it from the GitHub repo.

You might want to find a smaller alternative to the library you are using. E.g. if you are using Moment, why not use moment-mini instead?

Make sure the big library is only imported where it is used

To avoid the big library ending up in the main or common bundle, make sure it is only imported where it is used. It is common for Angular apps to have a shared module for easy import of all the shared stuff, but that shared module will likely end up in either the main or the common chunk. Instead, avoid importing the big library in the shared module and import it directly where it is used. Once it truly is shared, you can move it to the shared module.

If a third-party library is only used in one lazy module, it should only be imported there, so it ends up in that lazy bundle. If it is used in a non-lazy module, it will go to the main bundle, and if it is used in more than one lazy module, it will go to a common lazy-loaded bundle.

The same applies to your shared application code: it will go to main if it is referenced from a non-lazy-loaded module, and it will go to a common bundle (or sub-common bundle) if it is referenced by more than one lazy-loaded module.

Measure after implementing these improvements

Now, I invite you to make these improvements in your application and return to this section once you have done it.

As I said at the beginning of the post, performance tuning is an iterative process. Now that we have implemented this, it is time to measure again, using Webpack bundle analyzer and Lighthouse:

Measuring bundle sizes with Webpack bundle analyzer

We measure the Webpack bundles again to make sure that we have gotten rid of dependencies in the main bundle and that they have ended up in the lazy bundles. We also look at the main bundle again to see if we can find any libraries that are not critical to the initial route. If we do, we move the usage to a lazy-loaded module or create a new lazy-loaded bundle for it. We only want libs that are necessary for the initial load in the main bundle! Alright? 🙂

First, we install Webpack bundle analyzer globally with:

npm i -g webpack-bundle-analyzer

Then we can do a prod build with named chunks and stats json with:

npm run build -- --prod --named-chunks --stats-json

And finally use Webpack bundle analyzer as:

webpack-bundle-analyzer dist/*name-of-project*/stats.json
Note: depending on the setup, you might get a stats-es5.json and stats-es2015.json. Just reference the stats-es5 instead.
 
We should now see the bundle treemap.
 
Looking at this, it becomes clear that our biggest libraries are Angular Material and the CDK. This is a pretty basic app, so there is not much more to take out here. Also, we see that the footer has been extracted to a separate bundle.
 

Measure with Audit/Lighthouse

We measure again with Audit/Lighthouse in Chrome DevTools. You can also use WebPageTest, which is another good tool for measuring load performance.

We check whether our hypothesis from the investigation phase holds up and whether we see performance improvements.

You can use the “Audit” tab in Chrome DevTools to generate a performance report.

Basically, to get a good performance score you need to have fast:

  1. First contentful paint (FCP) – The time it takes to paint the real content (not just a spinner overlay, which would be the first paint)
  2. First meaningful paint (FMP) – The time it takes to paint what the user came for
  3. Time to interactive (TTI) – How long it takes for the site to become interactive, that is, when the CPU is idle

If we are happy with the performance now, we can go straight to our boss and ask for a raise or a paid vacation. If not, we go back to step 1 and do another iteration with a new hypothesis.

Optimizing subsequent requests

Subsequent requests can be optimized using service worker caching.

For this, I recommend you check out these posts:

  1. Caching static content
  2. Caching API resources

A word about hosting

Of course, I need to talk about hosting and compression in a load performance post.

Hosting

For hosting, I recommend using a cloud provider for easy scaling, with at least two replicas of the application and a load balancer in front. Also, use a fast CDN like Cloudflare or CloudFront to make sure the static resources are always loaded fast and to take load off your front-end servers.

Compression

For compression, there are basically two options: gzip and Brotli.

If your server supports Brotli, use it, as it results in smaller transfer sizes.

Brotli typically makes JS files around 14% smaller than gzip, so it is definitely worth looking into if you are currently using gzip. If you use Cloudflare, it is as simple as turning it on in the dashboard.

Prevent bad performance

You might have heard my saying:

“If it can’t be automated, don’t bother”

That applies here as well.

If we can’t automate a way to make developers write performant Angular apps, the app will soon become slow again and eventually fall back to the baseline.

To avoid this, there is a set of tools we can set up to run on the CI and protect performance:

  • Angular CLI bundle budgets. Here I am especially talking about the main bundle: keep it as small as possible and set a budget for it (see the sketch after this list). Everything else can go to lazy bundles.
  • Run lighthouse on the CI and set a threshold for the performance score: https://github.com/GoogleChromeLabs/lighthousebot
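
Here is a minimal sketch of a bundle budget under the build options in angular.json (the thresholds are placeholder values to adapt to your app):

"budgets": [
  {
    "type": "initial",
    "maximumWarning": "300kb",
    "maximumError": "500kb"
  }
]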

What is left out?

In this post I didn’t touch upon:

  • Load speed improvements with differential loading
  • Load speed with Ivy

I will look into these later and give an update on the results.

Angular load performance checklist

Here is the performance checklist to go through

  1. Did you make sure to NOT lazy load the initial route (/)?
  2. Did you make sure to lazy load all routes except initial route?
  3. Did you make sure to lazy load everything under the fold on the initial route and on pages that have load performance problems?
  4. Did you look in Webpack bundle analyzer for libs not needed in the initial over-the-fold load and lazy load everything else?
  5. Did you preload fonts or use font-display: swap to avoid invisible text?
  6. Are you using Angular PWA to cache static resources?
  7. Did you only import the shared module in lazy-loaded modules (and the main-shared module in main)?
  8. Are you using solid and fast hosting and CDN including compression (Brotli if possible)?
  9. Do you have a green score in Audit/Lighthouse? (if not, do another iteration from step 1)

This checklist is something you can keep going back to. Sites like Amazon that load in 0.4 seconds are not optimized in one go. They spend millions on exceptionally good engineers who keep iterating on performance and squeezing every improvement out of it. Also, they don’t use a SPA, which brings me to…

The harsh truth about load performance and single-page applications

Even with the amazing performance improvements Angular has seen over the last couple of years, a static website will always load faster than an Angular app. This is because of its simple nature, which is very easily digestible for the browser.

When the browser loads a SPA it needs to do more work:

  1. Download JS bundles
  2. Parse JS bundles
  3. Execute JS bundles to render the page dynamically

That is one of the reasons Netflix uses static pages for their performance-sensitive pages (such as signup and homepage) and uses React (their own fork which is tuned for performance) for the actual application.

Conclusion

In this post, we saw how to optimize the load time of Angular apps. We saw the importance of keeping the bundle size small and how a big main bundle is the root of all evil when it comes to load performance. We saw how to optimize the main bundle size using route- and component-level lazy loading. We looked at using tools such as Webpack bundle analyzer to get rid of unnecessary code in the main bundle by extracting it into lazy-loaded bundles. We saw how to do an audit with Lighthouse as well as how to prevent bad performance from recurring using automation.

In the end, I presented a checklist so you can start doing iterations to improve your load performance today!

If you want any help with improving the performance on your Angular apps, you can reach out to me here.

Do you want to become an Angular architect? Check out Angular Architect Accelerator.
