Why is array.push sometimes faster than array[n] = value?

Javascript, Arrays, Performance, Firefox, Browser

Javascript Problem Overview


As a side result of testing some code, I wrote a small function to compare the speed of the array.push(value) method against direct addressing with array[n] = value. To my surprise, the push method often turned out to be faster, especially in Firefox and sometimes in Chrome. Just out of curiosity: does anyone have an explanation for this? You can find the test @this page (click 'Array methods comparison').

Javascript Solutions


Solution 1 - Javascript

All sorts of factors come into play. Most JS implementations use a flat array that converts to sparse storage if it becomes necessary later on.

Basically the decision to become sparse is a heuristic based on what elements are being set, and how much space would be wasted in order to remain flat.

In your case you are setting the last element first, which means the JS engine will see an array that needs to have a length of n but only a single element. If n is large enough this will immediately make the array a sparse array -- in most engines this means that all subsequent insertions will take the slow sparse array case.

You should add an additional test in which you fill the array from index 0 to index n-1 -- it should be much, much faster.
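For illustration, a sketch of what such an additional test could look like (this is an assumed stand-in for the benchmark page's code, not the original; absolute timings will differ per engine):

function fillAscending(n) {
  var arr = [];
  for (var i = 0; i < n; i++) {
    arr[i] = i; // indices are written in order, so the array can stay dense
  }
  return arr;
}

function fillDescending(n) {
  var arr = [];
  for (var i = n - 1; i >= 0; i--) {
    arr[i] = i; // the first write lands at index n-1, which can flip the
                // array to sparse storage immediately
  }
  return arr;
}

var N = 1000000;
console.time('ascending fill');
fillAscending(N);
console.timeEnd('ascending fill');
console.time('descending fill');
fillDescending(N);
console.timeEnd('descending fill');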

In response to @Christoph and out of a desire to procrastinate, here's a description of how arrays are (generally) implemented in JS -- specifics vary from JS engine to JS engine but the general principle is the same.

All JS Objects (so not strings, numbers, true, false, undefined, or null) inherit from a base object type. The exact implementation varies: it could be C++ inheritance, or done manually in C (there are benefits to either approach). The base Object type defines the default property access methods, e.g.

interface Object {
    put(propertyName, value)
    get(propertyName)
private:
    map properties; // a map (tree, hash table, whatever) from propertyName to value
}

This Object type handles all the standard property access logic, the prototype chain, etc. Then the Array implementation becomes

interface Array : Object {
    override put(propertyName, value)
    override get(propertyName)
private:
    map sparseStorage; // a map between integer indices and values
    value[] flatStorage; // basically a native array of values with a 1:1
                         // correspondence between JS index and storage index
    value length; // The `length` of the js array
}

Now when you create an Array in JS, the engine creates something akin to the above data structure. When you insert an element into the Array instance, the Array's put method checks whether the property name is an integer (or can be converted into one, e.g. "121", "2341", etc.) between 0 and 2^32-1 (or possibly 2^31-1, I forget exactly). If it is not, the put method is forwarded to the base Object implementation and the standard [[Put]] logic is done. Otherwise the value is placed into the Array's own storage: if the data is sufficiently compact, the engine uses the flat array storage, in which case insertion (and retrieval) is just a standard array indexing operation; otherwise the engine converts the array to sparse storage, and put/get use a map to get from propertyName to value location.
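To make the flat-versus-sparse decision concrete, here is a toy model in plain JavaScript (purely illustrative: the helper names, the Map-based sparse storage, and the waste threshold are all invented, and real engines do this in C/C++ with far more nuance):

function makeModelArray() {
  return { flat: [], sparse: null, length: 0 };
}

// Returns the numeric index if propertyName is a valid array index, else null.
function toArrayIndex(propertyName) {
  var n = Number(propertyName);
  var isIndex = String(n) === String(propertyName) &&
                Math.floor(n) === n &&
                n >= 0 && n < 4294967295; // 2^32 - 1 itself is excluded
  return isIndex ? n : null;
}

function modelPut(arr, propertyName, value) {
  var index = toArrayIndex(propertyName);
  if (index === null) {
    return; // a real engine would forward to the ordinary object [[Put]] here
  }
  // Invented heuristic: give up on flat storage if the write would leave
  // the backing store mostly empty.
  if (arr.sparse === null && index > 4 * (arr.length + 1)) {
    arr.sparse = new Map();
    arr.flat.forEach(function (v, i) { arr.sparse.set(i, v); });
    arr.flat = null;
  }
  if (arr.sparse === null) {
    arr.flat[index] = value;      // fast path: plain indexed store
  } else {
    arr.sparse.set(index, value); // slow path: a map operation per write
  }
  if (index >= arr.length) {
    arr.length = index + 1;
  }
}

// Writing the highest index first (as the descending test does) trips the
// sparse heuristic on the very first insertion in this toy model.
var a = makeModelArray();
modelPut(a, '999999', 'x');
console.log(a.sparse !== null); // true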

I'm honestly not sure if any JS engine currently converts from sparse to flat storage after that conversion occurs.

Anyhoo, that's a fairly high level overview of what happens and leaves out a number of the more icky details, but that's the general implementation pattern. The specifics of how the additional storage works, and how put/get are dispatched, differ from engine to engine -- but this is the clearest way I can describe the design/implementation.

A minor additional point: while the ES spec refers to propertyName as a string, JS engines tend to specialise on integer lookups as well, so someObject[someInteger] will not convert the integer to a string if you're looking at an object that has integer properties, e.g. Array, String, and DOM types (NodeLists, etc.).
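A quick demonstration of the spec-level point (array indices are string property keys, even though engines fast-path integer lookups internally):

var arr = [];
arr[5] = 'hello';

console.log(Object.keys(arr));    // ["5"] -- the index is stored as a string key
console.log(arr[5] === arr['5']); // true  -- both forms address the same property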

Solution 2 - Javascript

These are the results I get with your test

on Safari:

  • Array.push(n) 1,000,000 values: 0.124 sec
  • Array[n .. 0] = value (descending) 1,000,000 values: 3.697 sec
  • Array[0 .. n] = value (ascending) 1,000,000 values: 0.073 sec

on Firefox:

  • Array.push(n) 1,000,000 values: 0.075 sec
  • Array[n .. 0] = value (descending) 1,000,000 values: 1.193 sec
  • Array[0 .. n] = value (ascending) 1,000,000 values: 0.055 sec

on IE7:

  • Array.push(n) 1,000,000 values: 2.828 sec
  • Array[n .. 0] = value (descending) 1,000,000 values: 1.141 sec
  • Array[0 .. n] = value (ascending) 1,000,000 values: 7.984 sec

According to your test the push method seems to be better on IE7 (huge difference), and since on the other browsers the difference is small, it seems that the push method really is the best way to add elements to an array.

But I created another simple test script to check which method is fastest for appending values to an array, and the results really surprised me: using Array.length seems to be much faster than using Array.push. So I really don't know what to say or think anymore, I'm clueless.
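For reference, a stripped-down sketch of that kind of append comparison (this is not the author's test script, just an assumed equivalent; absolute timings depend on the browser and the run):

var N = 1000000;

console.time('arr[arr.length] = v');
var a = [];
for (var i = 0; i < N; i++) {
  a[a.length] = i; // append via the current length
}
console.timeEnd('arr[arr.length] = v');

console.time('arr.push(v)');
var b = [];
for (var j = 0; j < N; j++) {
  b.push(j); // append via push()
}
console.timeEnd('arr.push(v)');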

BTW: on my IE7 your script stops and the browser asks me if I want to let it go on (you know, the typical IE message that says: "Stop running this script? ..."). I would recommend reducing the loops a little.

Solution 3 - Javascript

push() is a special case of the more general [[Put]] and therefore can be further optimized:

When calling [[Put]] on an array object, the argument has to be converted to an unsigned integer first because all property names - including array indices - are strings. Then it has to be compared to the length property of the array in order to determine whether or not the length has to be increased. When pushing, no such conversion or comparison has to take place: Just use the current length as array index and increase it.
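Conceptually, the difference looks something like this (a rough sketch of the spec steps with made-up names, not real engine code, and a plain object standing in for the array's property storage):

// What an indexed assignment has to do, roughly:
function specPut(storage, propertyKey, value) {
  var index = Number(propertyKey) >>> 0;     // the key is a string; convert it to a uint32
  storage.properties[String(index)] = value; // store under the canonical string key
  if (index >= storage.length) {             // compare against length...
    storage.length = index + 1;              // ...and extend it if necessary
  }
}

// What push has to do, roughly: no conversion, no comparison.
function specPush(storage, value) {
  storage.properties[String(storage.length)] = value;
  storage.length = storage.length + 1;
}

var s = { properties: {}, length: 0 };
specPut(s, '0', 'a');
specPush(s, 'b');
console.log(s); // { properties: { '0': 'a', '1': 'b' }, length: 2 }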

Of course there are other things which will affect the runtime, e.g. calling push() should be slower than calling [[Put]] via [] because the prototype chain has to be checked for the former.


As olliej pointed out: actual ECMAScript implementations will optimize the conversion away, i.e. for numeric property names, no conversion from string to uint is done, just a simple type check. The basic assumption should still hold, though its impact will be less than I originally assumed.

Solution 4 - Javascript

Here is a good testbed, which confirms that direct assignment is significantly faster than push: http://jsperf.com/array-direct-assignment-vs-push.

Edit: there seems to be some problem with showing the cumulative results data, but hopefully it gets fixed soon.

Solution 5 - Javascript

array[n] = value (when ascending) is always faster than array.push if the array in the former case is initialised with a length first.

From inspecting the javascript source code of your page, your Array[0 .. n] = value (ascending) test does not initialize the array with a length in advance.

So Array.push(n) sometimes comes out ahead on the first run, but on subsequent runs of your test, Array[0 .. n] = value (ascending) actually consistently performs best (in both Safari and Chrome).

If the code is modified so it initialises the array with a length in advance, like var array = new Array(n), then the Array[0 .. n] = value (ascending) test shows that array[n] = value performs 4.5x to 9x faster than Array.push(n) in my rudimentary runs of this specific test code.

This is consistent with other tests, like the one @Timo Kähkönen reported. See specifically this revision of the test he mentioned: https://jsperf.com/push-method-vs-setting-via-key/10

Here is the modified code, so you may see how I edited it and initialised the array in a fair manner (not unnecessarily initialising it with a length for the array.push test case):

function testArr(n, doPush) {
  var now = new Date().getTime(),
      duration,
      report = ['<b>.push(n)</b>',
                '<b>.splice(0,0,n)</b>',
                '<b>.splice(n-1,0,n)</b>',
                '<b>[0 .. n] = value</b> (ascending)',
                '<b>[n .. 0] = value</b> (descending)'];
  doPush = doPush || 5;

  if (doPush === 1) {
    var arr = [];
    while (--n) {
      arr.push(n);
    }
  } else if (doPush === 2) {
    var arr = [];
    while (--n) {
      arr.splice(0, 0, n);
    }
  } else if (doPush === 3) {
    var arr = [];
    while (--n) {
      arr.splice(n - 1, 0, n);
    }
  } else if (doPush === 4) {
    var arr = new Array(n);
    for (var i = 0; i < n; i++) {
      arr[i] = i;
    }
  } else {
    while (--n) {
      var arr = [];
      arr[n] = n;
    }
  }
  /*console.log(report[doPush-1] + '...' + arr.length || 'nopes');*/
  duration = ((new Date().getTime() - now) / 1000);
  $('zebradinges').innerHTML += '<br>Array' + report[doPush - 1] + ' 1.000.000 values: ' + duration + ' sec';
  arr = null;
}

Solution 6 - Javascript

Push adds the element to the end, while array[n] has to go through the array to find the right spot. It probably depends on the browser and its way of handling arrays.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

  • Question: KooiInc (View Question on Stackoverflow)
  • Solution 1 - Javascript: olliej (View Answer on Stackoverflow)
  • Solution 2 - Javascript: Marco Demaio (View Answer on Stackoverflow)
  • Solution 3 - Javascript: Christoph (View Answer on Stackoverflow)
  • Solution 4 - Javascript: Timo Kähkönen (View Answer on Stackoverflow)
  • Solution 5 - Javascript: Magne (View Answer on Stackoverflow)
  • Solution 6 - Javascript: Stiropor (View Answer on Stackoverflow)