But why's the browser DOM still so slow after 10 years of effort?

Javascript, DOM

Javascript Problem Overview


The web browser DOM has been around since the late '90s, but it remains one of the largest constraints in performance/speed.

We have some of the world's most brilliant minds from Google, Mozilla, Microsoft, Opera, W3C, and various other organizations working on web technologies for all of us, so obviously this isn't a simple "Oh, we didn't optimize it" issue.

My question is: if I were to work on the part of a web browser that deals specifically with this, why would I have such a hard time making it run faster?

My question is not asking what makes it slow, it's asking why hasn't it become faster?

This seems to go against the grain of what's happening elsewhere, such as JS engines whose performance approaches that of C++ code.

Example of quick script:

for (var i = 0; i <= 10000; i++) {
    someString = "foo";
}

Example of slow because of DOM:

for (var i = 0; i <= 10000; i++) {
    element.innerHTML = "foo";
}
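
To see why the slow example above hurts, note that every assignment to innerHTML forces a re-parse and re-render. A common workaround (a sketch, not the only fix; the "target" element id here is hypothetical) is to build the string in plain JavaScript and touch the DOM exactly once:

```javascript
// Build the markup in plain JavaScript first -- cheap, JIT-friendly work.
var parts = [];
for (var i = 0; i <= 10000; i++) {
    parts.push("foo");
}
var html = parts.join("");

// ...then touch the DOM exactly once (guarded so the string-building part
// also runs outside a browser; "target" is a hypothetical element id).
if (typeof document !== "undefined") {
    document.getElementById("target").innerHTML = html;
}
```

This turns 10,001 DOM updates into one, which is usually the difference between the "quick" and "slow" benchmarks above.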

Some details as per request:

After benchmarking, it looks like this isn't an unsolvably slow issue; often the wrong tool is used, and which tool is right depends on what you're doing cross-browser.

It looks like DOM efficiency varies greatly between browsers, but my original presumption that the DOM is slow and unsolvable seems to be wrong.

I ran tests against Chrome, FF4, and IE 5-9; you can see the operations per second in this chart:

[Chart: operations per second for each browser and test]

Chrome is lightning fast when you use the DOM API, but vastly slower using the .innerHTML operator (on the order of 1000x slower). FF, however, is worse than Chrome in some areas (for instance, the append test is much slower than Chrome's), but its innerHTML test runs much faster than Chrome's.

IE actually seems to be getting worse at DOM append and better at innerHTML as you progress through versions since 5.5 (e.g., append drops from 73 ops/sec in IE8 to 51 ops/sec in IE9).

I have the test page over here:

http://jsperf.com/browser-dom-speed-tests2

What's interesting is that different browsers all seem to face different challenges when generating the DOM. Why is there such disparity here?

Javascript Solutions


Solution 1 - Javascript

When you change something in the DOM it can have myriad side-effects to do with recalculating layouts, style sheets etc.

This isn't the only reason: when you set element.innerHTML=x you are no longer dealing with ordinary "store a value here" variables, but with special objects which update a load of internal state in the browser when you set them.

The full implications of element.innerHTML=x are enormous. Rough overview:

  • parse x as HTML
  • ask browser extensions for permission
  • destroy existing child nodes of element
  • create child nodes
  • recompute styles which are defined in terms of parent-child relationships
  • recompute physical dimensions of page elements
  • notify browser extensions of the change
  • update Javascript variables which are handles to real DOM nodes

All these updates have to go through an API which bridges Javascript and the HTML engine. One reason that Javascript is so fast these days is that we compile it to some faster language or even machine code; masses of optimisations happen because the behaviour of the values is well-defined. When working through the DOM API, none of this is possible. Speedups elsewhere have left the DOM behind.
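
To make the "special objects" point concrete, here is a toy sketch (plain JavaScript, not actual browser internals) of a property whose setter performs observable work on every assignment. This is exactly the shape that prevents a JIT from optimising the stores away:

```javascript
// A toy illustration: innerHTML behaves like a property with a setter
// that does heavy work, not like a plain "store a value here" field.
var fakeElement = {};
var workDone = 0;

Object.defineProperty(fakeElement, "innerHTML", {
    set: function (markup) {
        // Stand-ins for: parse HTML, destroy children, rebuild,
        // recompute styles, recompute layout, notify extensions.
        workDone += 1;
        this._children = markup.split(""); // pretend "parse"
    }
});

for (var i = 0; i < 1000; i++) {
    fakeElement.innerHTML = "foo"; // every assignment runs the whole setter
}
// workDone is now 1000: the engine cannot collapse these assignments,
// because each one has observable side effects.
```

A plain variable assigned in a loop can be hoisted or dead-code-eliminated; a setter with side effects cannot, and real DOM properties are all of the latter kind.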

Solution 2 - Javascript

Firstly, anything you do to the DOM could be a user-visible change. If you change the DOM, the browser has to lay everything out again. It could be faster if the browser cached the changes and only laid out every X ms (assuming it doesn't do this already), but perhaps there's not a huge demand for this kind of feature.
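
The "cache the changes, then lay out every X ms" idea can be sketched in userland as a write batcher. This is a hypothetical illustration of the pattern, not an API any browser exposes:

```javascript
// Hypothetical sketch: queue DOM writes and flush them in one batch,
// so N writes cost one layout pass instead of N.
function createWriteBatcher(onFlush) {
    var pending = [];
    return {
        write: function (job) { pending.push(job); },
        flush: function () {
            var jobs = pending;
            pending = [];
            jobs.forEach(function (job) { job(); }); // all writes back-to-back
            onFlush(jobs.length);                    // one "layout" for the batch
        }
    };
}

// Usage: 1000 queued writes, a single flush (i.e. a single relayout).
var layouts = 0;
var batcher = createWriteBatcher(function () { layouts += 1; });
for (var i = 0; i < 1000; i++) {
    batcher.write(function () { /* e.g. element.style.width = ... */ });
}
batcher.flush();
```

In a real page you would trigger flush from a timer or requestAnimationFrame; the point is only that coalescing writes amortises the layout cost.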

Second, innerHTML isn't a simple operation. It's a dirty hack that MS pushed, and other browsers adopted it because it's so useful; but it's not part of the standard (IIRC). Using innerHTML, the browser has to parse the string and convert it to a DOM tree. Parsing is hard.

Solution 3 - Javascript

Original test author is Hixie (http://nontroppo.org/timer/Hixie_DOM.html).

This issue has been discussed on StackOverflow here and on Connect (Microsoft's bug tracker) as well. With IE10, the issue is resolved. By resolved, I mean they have partially moved on to another way of updating the DOM.

The IE team seems to handle DOM updates the way the Excel-macros team at Microsoft does, where updating live cells on the sheet is considered poor practice. You, the developer, are supposed to take the heavy-lifting task offline and then update the live tree in a batch. In IE you are supposed to do that using a document fragment (as opposed to the document). With the new emerging ECMA and W3C standards, document fragments are deprecated. So the IE team has done some pretty good work to contain the issue.
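
The offline-then-batch pattern described above is what DocumentFragment is for: build the subtree off-document, then attach it in one operation. A minimal sketch (the doc parameter is passed in rather than assumed, and the "list" element id is hypothetical):

```javascript
// Build a batch of nodes off-document; attaching the returned fragment
// to the live tree later costs a single appendChild.
function buildListOffline(doc, items) {
    var frag = doc.createDocumentFragment();
    items.forEach(function (text) {
        var li = doc.createElement("li");
        li.textContent = text;
        frag.appendChild(li);
    });
    return frag; // one appendChild(frag) = one live-DOM update
}

// In a browser (hypothetical "list" element):
// document.getElementById("list").appendChild(buildListOffline(document, data));
```

When the fragment is appended, its children move into the live tree in one step, so layout and style work happen once for the whole batch.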

It took them a few weeks to strip it down from ~42,000 ms in IE10-ConsumerPreview to ~600 ms in IE10-RTM. But it took a lot of leg-pulling to convince them that this IS an issue. Their claim was that there is no real-world example which has 10,000 updates per element. Since the scope and nature of rich internet applications (RIAs) can't be predicted, it's vital to have performance close to the other browsers in the league. Here is another take on the DOM by the OP on MS Connect (in the comments):

> When I browse to http://nontroppo.org/timer/Hixie_DOM.html, it takes ~680ms and if I save the page and run it locally, it takes ~350ms!
>
> Same thing happens if I use the button-onclick event to run the script (instead of body-onload). Compare these two versions:
>
> jsfiddle.net/uAySs/ <-- body onload
>
> vs.
>
> jsfiddle.net/8Kagz/ <-- button onclick
>
> Almost 2x difference..

Apparently, the underlying behavior of onload and onclick varies as well. It may get even better in future updates.

Solution 4 - Javascript

Actually, innerHTML is faster than createElement.

In an effort to optimize, I found that JS can parse enormous JSON effortlessly. JSON parsers can have a huge number of nested function calls without issues. One can toggle thousands of elements between display:none and display:block without issues.

But if you try to create a few thousand elements (or even if you simply clone them), performance is terrible. You don't even have to add them to the document!

Then, once they are created, inserting them into and removing them from the page works super fast again.
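
Given that creation dominates the cost while insert/remove/toggle are cheap, one way to exploit this observation is to pay the creation cost once and reuse nodes afterwards. createNodePool is a hypothetical helper, not a standard API:

```javascript
// Hypothetical sketch: pre-create nodes once, then reuse them,
// since creation is the expensive part and insert/remove are fast.
function createNodePool(make, size) {
    var pool = [];
    for (var i = 0; i < size; i++) {
        pool.push(make()); // pay the creation cost up front, once
    }
    return {
        acquire: function () { return pool.pop() || make(); },
        release: function (node) { pool.push(node); }
    };
}

// Usage in a browser (hypothetical):
// var pool = createNodePool(function () { return document.createElement("div"); }, 1000);
// var div = pool.acquire(); ...use it... pool.release(div);
```

Releasing nodes back to the pool instead of discarding them keeps the app on the fast path (insert/remove/toggle) and off the slow one (create/clone).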

It looks to me like the slowness has little to do with their relation to other elements of the page.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Incognito | View Question on Stackoverflow
Solution 1 - Javascript | spraff | View Answer on Stackoverflow
Solution 2 - Javascript | wisty | View Answer on Stackoverflow
Solution 3 - Javascript | vulcan raven | View Answer on Stackoverflow
Solution 4 - Javascript | user40521 | View Answer on Stackoverflow