Self-hosting external scripts

Varun Chauhan
Published in NE Digital
Oct 10, 2022

How to improve website performance by self-hosting external scripts

Frontend optimization (photo by ButterCMS)

At NE Digital we are constantly working on improving user-centric metrics like LCP (Largest Contentful Paint), FID (First Input Delay), TBT (Total Blocking Time) and SI (Speed Index). As an e-commerce platform, one of the challenges we face is that loading external scripts affects our overall page load time. As the product and business teams enhance the buying experience by integrating scripts from external partners, the performance of the website takes a hit.

We constantly evaluate these scripts to measure their impact. One of the common issues with loading any external script is that the browser must first establish a connection to the vendor's server, which involves a DNS lookup (domain name to IP translation), a new TCP connection (handshake) and an SSL handshake. Another cost of fetching external scripts is that you miss the opportunity to serve the asset over the existing HTTP/2 multiplexed connection, a much more efficient way for the browser and server to communicate. The image below captures the time taken to establish a connection with an external domain, measured with a performance test on WebPageTest.

WebPageTest waterfall result

In this test, the DNS lookup took 204 ms, the initial connection 352 ms and the SSL negotiation 260 ms. In total, the browser spends 816 ms connecting to a single external domain before it even starts fetching the script. If we host these external scripts on our own CDN, we can save a reasonable amount of the time spent connecting to external domains, since the browser already has a connection to our domain. To achieve this, we can upload the external script to a storage bucket and serve it from the CDN. The problem with this approach is that we will miss any updates the vendor makes to the script.
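Once the script lives behind our own CDN, the page simply references the self-hosted copy instead of the vendor's domain. A minimal illustration of the swap (both URLs are placeholders, not our actual paths):

```html
<!-- Before: fetched from the vendor's domain, paying DNS + TCP + SSL setup costs -->
<script src="https://vendor.example.com/widget.js" defer></script>

<!-- After: served from our own CDN, avoiding a new third-party connection -->
<script src="https://cdn.example.com/external-scripts/widget.js" defer></script>
```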

To solve this problem, we need an intermediate service that keeps the self-hosted copy in sync with the script on the external domain. Our initial proposed solution looked something like this.

Proposed Architecture

In the proposed solution, we used an AWS Lambda function as the intermediate service. It loops over a list of scripts, connects to each external domain to fetch the latest version and uploads the files to an S3 bucket. The bucket is connected to AWS CloudFront, which serves the files via CDN.

Lambda Script

The script worked well on a local machine, but once uploaded to Lambda it failed because “request” is an npm package and not part of the Node.js standard library. Lambda doesn’t let us install node modules directly in the function. On further exploring the documentation, I found that a layer can be created and attached to the Lambda function to provide any dependency we need. From this, we worked towards version 2 of the architecture.

Proposed Architecture V2

I created a layer containing the request package and attached it to the Lambda function. With this layer in place, the function can import request, loop over the array of script URLs and upload each file to a specific folder inside the S3 bucket.
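Below is a minimal sketch of what such a FileSync function could look like, assuming the aws-sdk v2 client bundled with the Node.js Lambda runtime and the request package provided by the layer. The script list, bucket name and folder are placeholders, not the values we use in production.

```javascript
// FileSync Lambda sketch — placeholder URLs, bucket and keys for illustration only.
const request = require('request'); // provided to the function via the Lambda layer
const AWS = require('aws-sdk');     // bundled with the Node.js Lambda runtime
const s3 = new AWS.S3();

const BUCKET = 'my-script-mirror';  // hypothetical bucket served through CloudFront
const SCRIPTS = [
  { url: 'https://vendor.example.com/widget.js', key: 'external-scripts/widget.js' },
  { url: 'https://analytics.example.com/tag.js', key: 'external-scripts/tag.js' },
];

// Download one script from the vendor's domain.
function fetchScript(url) {
  return new Promise((resolve, reject) => {
    request({ url, encoding: null }, (err, res, body) => {
      if (err) return reject(err);
      if (res.statusCode !== 200) return reject(new Error(`HTTP ${res.statusCode} for ${url}`));
      resolve(body);
    });
  });
}

exports.handler = async () => {
  // Loop over the list of external scripts and copy each one into the S3 folder.
  for (const { url, key } of SCRIPTS) {
    const body = await fetchScript(url);
    await s3
      .putObject({
        Bucket: BUCKET,
        Key: key,
        Body: body,
        ContentType: 'application/javascript',
      })
      .promise();
  }
  return { synced: SCRIPTS.length };
};
```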

Even with this approach, we still had to execute the function manually, which is not realistic. After exploring more options, I found that Amazon EventBridge is best suited for the task. EventBridge invokes the Lambda function at a defined frequency; for our use case we trigger it hourly, ensuring our scripts are updated regularly. The frequency can be changed whenever needed.
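The rule can be created from the AWS console, but for illustration the same hourly schedule could be defined with the AWS SDK roughly as follows; the rule name and Lambda ARN below are placeholders.

```javascript
// Sketch of an hourly EventBridge rule targeting the FileSync Lambda.
const AWS = require('aws-sdk');
const eventBridge = new AWS.EventBridge();

async function scheduleFileSync() {
  // Create (or update) a rule that fires once every hour.
  await eventBridge
    .putRule({
      Name: 'file-sync-hourly',
      ScheduleExpression: 'rate(1 hour)', // adjust the rate to change the frequency
    })
    .promise();

  // Point the rule at the FileSync Lambda function.
  // (The function also needs a resource-based permission allowing
  //  events.amazonaws.com to invoke it.)
  await eventBridge
    .putTargets({
      Rule: 'file-sync-hourly',
      Targets: [
        {
          Id: 'file-sync-lambda',
          Arn: 'arn:aws:lambda:us-east-1:123456789012:function:FileSync', // placeholder ARN
        },
      ],
    })
    .promise();
}

scheduleFileSync().catch(console.error);
```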

Our final architecture looks like this.

Final Solution

Summary

Improving website performance is an ongoing process, and one has to keep looking for areas worth focusing on. One such area for us was the time the client spends connecting to each external domain associated with a third-party script. In this article, we resolved this problem by hosting the external scripts on AWS CloudFront, backed by an AWS S3 bucket. We solved the problem of keeping the files in sync with the FileSync Lambda function, which loops over a list of scripts from the external domains and uploads them to our S3 bucket. Finally, we use an EventBridge rule to execute the Lambda function every hour.

Hopefully, this article provides you with a clear approach for hosting external scripts on a CDN.

Thank you for reading!

References

  1. AWS Lambda function
  2. Amazon EventBridge
  3. Schedule expressions using rate or cron
  4. Creating an EventBridge rule that runs on a schedule
  5. Self-hosting scripts
  6. Amazon CloudFront
