
Thoughts on Avoiding an Excessive DOM Size

I recently read Web Platform News #40 and saw this:

The Lighthouse tool for auditing web pages suggests avoiding an excessive DOM size… Lighthouse recommends keeping a page’s DOM size below 1,500 DOM elements.

I had never seen this before. I knew a large DOM could create performance problems, but I’d only encountered them when trying to script against a large DOM. In my own (anecdotal) experience, loading an HTML page with lots of DOM nodes was never necessarily a performance hit.

I wanted more color around this recommendation, so I read through the page Lighthouse links to:

Lighthouse flags pages with DOM trees that:

Have more than 1,500 nodes total.
Have a depth greater than 32 nodes.
Have a parent node with more than 60 child nodes.

Interesting.

I wanted to see this in action for myself, so I pulled up a classic website that surely has lots of DOM nodes: Wikipedia. Specifically, I looked at the World Wide Web entry and found Lighthouse taking exception with the size of the DOM:

Screenshot of the World Wide Web entry on Wikipedia in Chrome with the Lighthouse dev tools open showing a warning about the size of the DOM.

At the time of this writing, the recommended limit for DOM elements is 1,500. This Wikipedia page came in at 4,606.

(How exactly does DOM size get calculated? I’m not sure. I ran document.querySelectorAll("*").length in the console and got 4,641, which is pretty close to what Lighthouse reported, but this method isn’t very scientific or reproducible across different web pages. Looking at the source code for Lighthouse, you could probably derive how DOM size is calculated.)
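For what it’s worth, here’s a rough sketch of how you might approximate all three of those stats yourself from the console: total elements, maximum depth, and the element with the most children. This is just the naive “walk every element” approach, and Lighthouse’s actual calculation may well differ:

```js
// Naive approximation of Lighthouse's three DOM-size stats,
// runnable in the browser console. Lighthouse itself may count
// nodes differently (e.g. inside shadow roots).
const elements = document.querySelectorAll("*");

// 1. Total number of elements
console.log("Total elements:", elements.length);

// 2. Maximum depth of the tree
function depthOf(el) {
  let depth = 0;
  while (el.parentElement) {
    depth++;
    el = el.parentElement;
  }
  return depth;
}
let maxDepth = 0;
for (const el of elements) {
  maxDepth = Math.max(maxDepth, depthOf(el));
}
console.log("Max depth:", maxDepth);

// 3. Element with the most direct children
let mostChildren = document.documentElement;
for (const el of elements) {
  if (el.childElementCount > mostChildren.childElementCount) {
    mostChildren = el;
  }
}
console.log("Most children:", mostChildren.childElementCount, mostChildren);
```

That third check is essentially the callout Lighthouse itself makes, which comes up again below with the “Human” entry.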

What’s intriguing about this warning on DOM size is that it doesn’t appear (at least at the moment) to have any bearing on the performance score, as this Wikipedia page came in at 100 (for a “Desktop” performance audit).

Screenshot of the World Wide Web entry on Wikipedia in Chrome with the Lighthouse dev tools open showing a performance score of 100.

Out of curiosity, I wanted to try and find a Wikipedia page whose DOM size would be even larger, so I tried the entry for Human. It weighed in at 12,075 DOM elements.

What I found intriguing about the DOM size warning in Lighthouse was the callout to the DOM element with the most children (likely intended as a hint to an element in the DOM you should consider refactoring). In this case, the DOM element with the most children was the references!

Screenshot of the Wikipedia entry for ‘Human’ with the Lighthouse dev tools open showing a warning about the number of DOM elements.

Wouldn’t want to cite too many sources now, would we?

I’m being flippant here, but only in part.

A guideline that a web page’s DOM should avoid being too large? I can buy that. The codification of that guideline into an industry-standard tool? That gives me pause.

The fuzziness of human language allows us to recommend a “best practice” like “don’t let your DOM size get too big” while still providing room for nuance, which may alter or even void the recommendation entirely.

However, codifying a guideline or best practice into an automated tool requires that lines be drawn. A metric, however arbitrary, must be chosen in order to make a measurement and provide a judgement.

In this particular case, it’s choosing a maximum number of DOM elements (1,500) to represent a threshold past which your DOM has become too big. Why 1,500? I don’t know. But the binary nature of that number boggles my brain a bit. 1,499 DOM elements? Green check, you’re ok. 1,501 DOM elements? Red alert, you’re doing something wrong!

A closer look at the official rationale behind avoiding a large DOM outlines three reasons why a large DOM can contribute to slower performance, two of which hinge on JavaScript running. No JavaScript? There’s only one (stated) reason to limit your DOM size: “network efficiency and load performance”:

A large DOM tree often includes many nodes that aren't visible when the user first loads the page, which unnecessarily increases data costs for your users and slows down load time.

Ok, I can agree with that in principle. Makes sense: the bigger anything is on a computer, the more compute resources will be required to handle it.

But in practice, it seems to me there’s more nuance to the question of DOM size than having 1,500 elements or less.

However nice as a guideline, I’m not sure I buy an arbitrary limit on DOM nodes codified into a performance tool everyone uses. It might seem harmless, but we are shaping our performance tools, which will in turn shape us and how we think about what the web is, how it should work, and what it ultimately should be. Limiting DOM size is a specific point of view about the nature of a web page and is therefore limiting, perhaps exclusionary, in its vision of what a web page can be.

As a simple illustration of what I’m trying to get at, consider the book Moby Dick. The fact that you can access that book digitally is quite astounding, a feat many would’ve marveled at thirty years ago. The entire book available at a URL as HTML. No need for an app that optimizes for performance by only allowing access to one chapter at a time while requiring a “save for offline reading” feature to download the entire book. Just fast, performant, searchable text delivered as HTML with no JavaScript, and no need to keep the DOM size small.

Screenshot of the Lighthouse dev tool flagging the large DOM size for ‘Moby Dick’ on Project Gutenberg.

The beauty of the web is that it’s bigger than any rules we try to draw around it. As Rich Harris stated in his talk on transitional apps, “[the web is] a medium that by its very nature resists definitional boundaries.”

Perhaps thereā€™s more room for nuance and range in our performance tools and metrics.

Update 2021-10-19

@anthony_ricaud hit me up on Twitter noting that the “Human” page on Wikipedia pales in comparison to the DOM size and depth of the HTML spec. It’s a single page of HTML with 278,148 elements!