Gaming Google — The perfect Lighthouse score

Jonathon Grantham
May 18, 2023 · 9 min read


I’ve always been hesitant to share my code. It’s not that I believe it’s subpar, nor am I plagued by imposter syndrome. It’s just that sharing code offers a glimpse into the madness that is my mind, and that feels somehow cruel for everyone else. Allow me to guide you on a journey that nearly drove me to the brink of sanity. It all began with Google’s Lighthouse update.

[Image: a developer pulling out his hair]

For the SEO uninitiated, Google’s Lighthouse update represented a substantial shift in search ranking, hinging on a website’s performance and adherence to best practices. If you’d like to test your website, simply open it in Chrome on a desktop, press F12 to open the Chrome developer tools, and then click the “Lighthouse” tab. Choose either Desktop or Mobile and click “Analyze”. After about a minute, you’ll receive five scores, each ranging from 0 to 100, for Performance, Accessibility, Best Practices, SEO, and Progressive Web App (PWA).
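
As an aside, if you want to track scores over time rather than clicking through DevTools, Lighthouse can also be run from Node. Below is a minimal sketch using the lighthouse and chrome-launcher npm packages; the exact API can vary between versions, so treat it as a starting point rather than the way we actually run it:

```javascript
// Minimal sketch: running Lighthouse from Node instead of DevTools.
// Assumes `npm install lighthouse chrome-launcher`; details may differ by version.
import fs from 'fs';
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
const options = { output: 'html', port: chrome.port };

// Audit the page and save the HTML report next to the script.
const runnerResult = await lighthouse('https://www.nexoid.com/en', options);
fs.writeFileSync('lighthouse-report.html', runnerResult.report);

// Category scores come back as 0–1; multiply by 100 to match the DevTools UI.
for (const [name, category] of Object.entries(runnerResult.lhr.categories)) {
  console.log(`${name}: ${Math.round(category.score * 100)}`);
}

await chrome.kill();
```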

Before delving into the technical details, let me introduce myself. My name is Jonathon Grantham, and I’m the proud owner of a small B2B SaaS company, Nexoid, which specializes in ERP and ITSM software. The website I’m discussing in this piece is www.nexoid.com. Feel free to have a look and open up the source code. You can also follow me on LinkedIn; my profile is https://www.linkedin.com/in/jonathongrantham/.

Upon the release of the Google Chrome Lighthouse update, I did what any conscientious business owner would do: I checked my website. Much to my dismay, the results were far from satisfactory. With scores of 21/100 for Performance, 30/100 for Accessibility, 45/100 for Best Practices, 11/100 for SEO, and a failure for PWA, I was shocked. I had personally built the website, a fairly standard Single Page Architecture using React.js. The scores were a crushing blow.

Undeterred by the initial setback, I launched VS Code and began addressing the issues one by one, with Performance being my first target. The guides provided within the Lighthouse tool proved quite useful. I converted all images from JPEGs and PNGs to modern WebP files, and ensured that every img tag had width and height attributes to prevent layout shifts. These modifications alone boosted my score from 21/100 to 60/100. It was a significant improvement, but far from perfect. The only suggestion remaining was to “reduce unused JavaScript,” which wasn’t particularly helpful. The only JavaScript present was the React.js framework, as everything else had been eliminated.

Despite my persistent efforts to rectify the issue, I was met with constant roadblocks. I attempted to remove parts of React.js, explored “lazy loading,” and tested various optimizers and compressions. However, the issue stemmed from React.js itself, which was approximately half a megabyte in size.

I can almost hear seasoned web developers shouting, “Don’t use React for a website! It’s meant for building web applications!” I’m well aware of this now.

What began as a seemingly simple task of converting a few images had now morphed into a complete website overhaul using a new framework. Frustrated and uttering a few choice words under my breath, I set out in search of a suitable replacement. I first considered Vue and later Angular, arguably the biggest competitors of React.js. However, they both presented the same issue.

In an attempt to simplify things, I decided to look into older technologies, and gave jQuery a shot. Yet, I was met with the same problem. It became abundantly clear that there wasn’t an off-the-shelf Single Page Architecture framework that could appease the Google deities.

It seemed my only remaining option was to resort to vanilla JavaScript.

My series of experiments began with a basic HTML page without any JavaScript. Then, I tried an HTML page with a div whose contents could be replaced. I quickly realized that making multiple simultaneous changes to the page via JavaScript incurred penalties from Lighthouse. The solution was to manipulate the contents of the body tag as a string and then reintegrate it, thereby creating only a single visible DOM change.
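
To make that penalty concrete, here is an illustrative sketch (not the actual site code) contrasting many separate DOM writes with a single innerHTML swap:

```javascript
// Illustrative only: `sections` is a hypothetical array of HTML fragments
// that together make up the page body.
const sections = ['<header>…</header>', '<main>…</main>', '<footer>…</footer>'];

// Penalized: each appendChild is a separate visible DOM mutation.
for (const fragment of sections) {
  const wrapper = document.createElement('div');
  wrapper.innerHTML = fragment;
  document.body.appendChild(wrapper);
}

// Preferred: assemble the whole body as one string, then swap it in once.
document.body.innerHTML = sections.join('');
```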

I now had a minimalist HTML page with an empty body tag, complemented by a small onload function in the head tag. This function inspected the URL and executed an HTTP GET request to retrieve a text file containing the page’s body HTML. One would think this would be a suitable solution. Unfortunately, it fell short when I attempted to dynamically load JavaScript functionality.
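
A minimal sketch of that onload loader might look like this; the “.body.html” naming is my own, purely for illustration:

```javascript
// Sketch of the loader: work out which page was requested, fetch its body
// HTML as text, and apply it in one go.
window.onload = async () => {
  // e.g. "/en/platform" -> "/en/platform.body.html" (hypothetical naming)
  const path = window.location.pathname.replace(/\/$/, '') || '/en';
  const response = await fetch(`${path}.body.html`);
  const bodyHtml = await response.text();

  // One string assignment, so the browser sees a single DOM change.
  document.body.innerHTML = bodyHtml;
};
```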

Unlike other tags, if you add a script tag with a simple alert("yes this fired") into the body contents string, it won’t execute. Although not ideal, one workaround was to parse the body string, identify all the script tag contents, and place them in a JavaScript eval function. The approach was somewhat effective but stumbled when dealing with namespaces, and the developer console was flooded with unsightly warnings. The better solution was to extract the script tags from the HTML and append them as script elements after the DOM had rendered; for some reason, Google does not penalize this.
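
A sketch of that script-extraction workaround, assuming DOMParser (available in all modern browsers):

```javascript
// Scripts inside an innerHTML string never run, so pull them out and
// re-add them as real <script> elements once the markup is in place.
function applyBody(bodyHtml) {
  const doc = new DOMParser().parseFromString(bodyHtml, 'text/html');

  // Remove the scripts from the parsed document before injecting the markup.
  const scripts = [...doc.querySelectorAll('script')];
  scripts.forEach(script => script.remove());

  // Single visible DOM change for the markup itself.
  document.body.innerHTML = doc.body.innerHTML;

  // Re-create each script element so the browser actually executes it.
  for (const original of scripts) {
    const script = document.createElement('script');
    if (original.src) script.src = original.src;
    else script.textContent = original.textContent;
    document.body.appendChild(script);
  }
}
```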

Progress was being made, and I had a basic Single Page Architecture solution. But not so fast. Google is efficient at indexing Single Page Architecture pages (it opens the page in a browser, lets all the JavaScript run, and then scans the resulting DOM), and Bing, Yahoo, and the other major search engines use a similar, if simpler, method. However, most other platforms, such as Facebook, Reddit, LinkedIn, and WhatsApp, only fetch the raw HTML, so all they would see was a small file with a blank body. My solution was not viable as it stood. I now had to generate a proper HTML file for every page on the website and include the JavaScript to switch into Single Page Architecture mode once a user clicked a link.
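
Switching into Single Page Architecture mode on click can be sketched roughly as follows. Here loadPage is a hypothetical helper that fetches and applies the page content as shown earlier, and the “.json” naming mirrors the file layout described later:

```javascript
// Intercept internal link clicks and load the next page client-side.
document.addEventListener('click', event => {
  const link = event.target.closest('a');
  if (!link || link.origin !== window.location.origin) return; // leave external links alone

  event.preventDefault();
  history.pushState({}, '', link.pathname);   // update the URL without a full reload
  loadPage(`${link.pathname}.json`);          // fetch and render the new page content
});

// Handle the browser back/forward buttons the same way.
window.addEventListener('popstate', () => {
  loadPage(`${window.location.pathname}.json`);
});
```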

I needed a tool capable of generating HTML for each page, based on my solution. It occurred to me that I had the perfect resource at my disposal: my own ERP system, Nexoid. I created a Nexoid model encompassing a website and web page data objects. The website record facilitated the creation of a generic template webpage, while the web page records contained the content for each individual page. The final piece of the puzzle was a workflow function or script that could read the website record and all related web page records to generate the HTML files. After a few days, it was operational. I had created a basic Content Management System (CMS). Developing a CMS to this point is not overly complex; the real challenge arises when integrating other CMS workflows, approvals, localizations, previews, etc.
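
Greatly simplified, the publish step amounts to something like the sketch below. The record shapes and placeholder names are hypothetical; Nexoid’s actual workflow scripts are considerably more involved:

```javascript
// Simplified publish sketch: a website record holds the page template,
// and each web page record holds the content for one page/language.
import fs from 'fs';
import path from 'path';

function publish(website, webPages, outDir) {
  for (const page of webPages) {
    // Assumes the template contains "{{title}}" and "{{body}}" placeholders.
    const html = website.template
      .replace('{{title}}', page.title)
      .replace('{{body}}', page.bodyHtml);

    // e.g. page.path "en/platform" -> "<outDir>/en/platform" (extensionless, as on S3).
    const target = path.join(outDir, page.path);
    fs.mkdirSync(path.dirname(target), { recursive: true });
    fs.writeFileSync(target, html);
  }
}
```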

A key requirement for the new website was localization; we aimed to launch it in 11 languages. Running an IT company, I naturally leaned toward a technological solution, so rather than hiring a translator for every page, I opted for AWS Translate. While AI translators are decent, they’re not perfect, and the errors are noticeable enough to reveal a non-human origin. A French-speaking staff member evaluated the AI translation and gave it a 6/10, describing it as “understandable, but not proper French.”

However, we stumbled upon a valuable trick: feeding the English text through ChatGPT first and asking it to ‘tidy this up’. ChatGPT rewords the text in a way that is still natural English but far more compatible with machine-translation models. Using the ChatGPT-reworded English as the base for translation significantly improves the result, elevating it to a 9 or even a perfect 10 out of 10.
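
The translation call itself is straightforward with the AWS SDK. A sketch using the v3 Translate client (the package, region, and variable names here are assumptions; tidiedEnglish is the ChatGPT-reworded source text):

```javascript
// Sketch of the translation step with the AWS SDK v3 Translate client.
// Assumes `npm install @aws-sdk/client-translate` and AWS credentials in the environment.
import { TranslateClient, TranslateTextCommand } from '@aws-sdk/client-translate';

const client = new TranslateClient({ region: 'eu-west-2' }); // region is an assumption

async function translate(tidiedEnglish, targetLanguage) {
  const command = new TranslateTextCommand({
    SourceLanguageCode: 'en',
    TargetLanguageCode: targetLanguage, // e.g. 'fr', 'de'
    Text: tidiedEnglish,
  });
  const response = await client.send(command);
  return response.TranslatedText;
}
```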

Having developed a solid technological foundation for creating the website, I was making progress. However, a new challenge emerged as we began to build more complex pages. Every extra render-blocking request for a script or stylesheet hurt the new Lighthouse scores, so it became necessary to consolidate all JavaScript, CSS, and HTML into a single file. The same applied to the Single Page Architecture versions.

We resorted to inlining all JavaScript and CSS directly into the HTML as inline script and style tags. A similar strategy was required for the Single Page Architecture version, so we created a JSON file per page containing all of its scripts, styles, and HTML.
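
The per-page JSON ends up looking roughly like this; the field names are illustrative rather than the actual Nexoid schema:

```javascript
// Illustrative shape of a per-page JSON file: everything needed to render
// the page travels in a single request.
const pageJson = {
  title: 'Platform | Nexoid',
  head: '<meta name="description" content="…">',
  styles: '/* all page CSS, inlined */',
  scripts: '// all page JavaScript, inlined',
  body: '<main>…page markup…</main>',
};
```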

Lighthouse identified the next problem as the size of the assets; the HTML and JSON page files were excessively large. I resolved this issue using ‘minify,’ a Node.js library specifically designed to compress HTML, CSS, and JavaScript files. This solution resulted in a reduction of text file size by over 40%. Additionally, minify offered a mild obfuscation benefit, making the raw code more difficult to casually read.
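
For reference, a sketch of the minification step using the minify package; the API and options may differ between versions, and the file paths are only examples:

```javascript
// Sketch of the minification step (`npm install minify`).
import { minify } from 'minify';
import fs from 'fs';

const options = {
  html: { removeComments: true, collapseWhitespace: true }, // example options, not exhaustive
};

// Minify a published page in place (path is illustrative).
const minified = await minify('./dist/en/platform.html', options);
fs.writeFileSync('./dist/en/platform.html', minified);
```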

Let’s delve into the topic of hosting. Traditionally, a Content Management System (CMS) operates via an application server that handles the user’s HTML request. It interprets the page request from the URL, locates the corresponding assets in a database, retrieves the database record (possibly alongside others), processes the information to assemble the page, and finally delivers it to the end user as a flat HTML document. This description primarily pertains to the initial HTML request when a user visits a new website, although I am aware of AJAX and other similar technologies.

However, this conventional model presents certain drawbacks in the context of the new Lighthouse world. Firstly, the back-and-forth communication between the application server and the database server, as well as the page compilation, introduces delays. Secondly, in its simplest form, an application server and a database server are only physically available in a single location. This setup is excellent if you’re in the same building or city, but significantly less efficient if you’re attempting to access the site from the other side of the world. For instance, the average ping latency between Australia and the UK is approximately 250 milliseconds.

Our solution to these challenges involves utilizing AWS S3 for hosting the static files generated by the previously mentioned publish script, and AWS CloudFront for global content distribution. At the time of writing, AWS CloudFront was distributing content to over 90 cities in 47 countries. For someone in Melbourne, Australia accessing a UK website, AWS CloudFront reduced the ping latency from 250 milliseconds to a mere 13 milliseconds (the round-trip time between Melbourne and the AWS edge servers in Sydney).

We now arrive at the Progressive Web Application (PWA) component of the Lighthouse test, which was not something I had previously given much thought. For those unfamiliar, a PWA involves a JavaScript service worker that manages the website as a web application. If that sounds a bit abstract, think of it as an automatic downloading and caching tool: when a user visits your website, the goal is to make their subsequent requests as fast and seamless as possible. The service worker lets you download the next assets to the user’s machine in advance, eliminating the need for another network round trip.

At the time of writing this article, the Nexoid website is relatively small, containing only 19 pages. However, those 19 pages are translated into 11 different languages, making a total of 209 pages. Initially, I tried to download every asset into the service worker, which amounted to around 5MB. This size was too large for an initial load, and Lighthouse penalized me for it. I settled on downloading only the English page JSON files, which include all the necessary CSS, HTML, and JavaScript to display each page.
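
In simplified form, the resulting service worker is just an install-time pre-cache of the English JSON files plus a cache-first fetch handler; the cache name and file list below are illustrative:

```javascript
// service-worker.js — simplified sketch of the approach described above.
const CACHE_NAME = 'nexoid-pages-v1'; // hypothetical cache name
const ENGLISH_PAGES = ['/en.json', '/en/platform.json', '/en/contact_us.json' /* …the rest */];

// Pre-cache only the English page JSON files at install time.
self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(CACHE_NAME).then(cache => cache.addAll(ENGLISH_PAGES))
  );
});

// Serve from the cache when possible, falling back to the network.
self.addEventListener('fetch', event => {
  event.respondWith(
    caches.match(event.request).then(cached => cached || fetch(event.request))
  );
});
```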

The final structure is as follows: An S3 bucket houses the compiled HTML files, named without the .html extension. For instance, www.nexoid.com/en represents the English homepage HTML, www.nexoid.com/de is for the German homepage HTML, and www.nexoid.com/en/platform refers to the English platform HTML, and so on. Additionally, there are JSON files that contain the parts of the body and head that change when navigating between pages, such as https://www.nexoid.com/en.json, https://www.nexoid.com/de.json, and https://www.nexoid.com/en/platform.json, among others.

In conclusion, comprehending Lighthouse posed a significant challenge. I am skeptical that traditional, out-of-the-box CMS products can effectively tackle this task. Reflecting on my experience with platforms like WordPress and Drupal, I find it hard to believe that they could be optimized to achieve a perfect Lighthouse score. Overall, I believe the effort is worthwhile, and Google is justified in placing more emphasis on performance. However, this shift is and will continue to be a considerable pain point for web designers and agencies.

If you’re interested in learning more about Lighthouse or if you’d like to discuss Nexoid’s products and services, please don’t hesitate to get in touch. You can reach out via LinkedIn or through the ‘Contact Us’ page on our website.

www.nexoid.com/en/contact_us

www.linkedin.com/in/jonathongrantham

--

Jonathon Grantham

Proud owner of Nexoid, a B2B SaaS company specializing in ERP and ITSM software. Passionate about improving business performance through clever IT solutions.