
Processing large arrays in JavaScript

by drphrozen on August 3rd, 2012

Today I was debugging some JavaScript code where a long-running script made Firefox cry (Chrome took it like a man ;-) ). The problem was that I had to process a large array and inject it into an existing table in the DOM. Although I did some optimizations (removed jQuery element creation and used direct id lookup), it was still not fast enough.
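The post doesn't show those optimizations, but a minimal sketch of the idea is to build the row markup as one string and touch the DOM only once (the table id and data here are made up for illustration):

```javascript
// Sketch of the "fewer DOM touches" optimization, with assumed data.
// Instead of creating one jQuery element per row, build the markup as a
// single string and inject it in one operation.
var rows = [];
for (var i = 0; i < 3; i++) {
    rows.push('<tr><td>' + i + '</td></tr>');
}
var html = rows.join('');

// In the browser you would then inject it once, e.g.:
if (typeof document !== 'undefined') {
    // 'myTable' is a hypothetical id, not from the original post
    document.getElementById('myTable').innerHTML = html;
}
```

Each jQuery element creation and append forces extra work in the browser, so collapsing them into a single innerHTML assignment is usually a big win on its own.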

Knowing that JavaScript behaves somewhat like old-school WinForms (a single thread handles all event processing), I knew I had to yield control back to the browser periodically. This is typically done using setTimeout or setInterval. Neither plays well with jQuery's $.each function, so I decided to make my own:


(function( $ ) {
    // Iterate over arr without blocking the UI thread: process `bulksize`
    // elements per timer tick, waiting `interval` ms between ticks.
    $.eachWait = function(arr, callback, interval, bulksize) {
        interval = interval || 10;  // ms between batches
        bulksize = bulksize || 1;   // elements per batch
        var index = 0;
        var intervalObj = setInterval(function() {
            for (var j = 0; j < bulksize; j++) {
                if (index < arr.length) {
                    try {
                        callback(index, arr[index]);
                    } catch (e) {
                        // Stop the timer before propagating the error,
                        // otherwise it would keep firing forever.
                        clearInterval(intervalObj);
                        throw e;
                    }
                } else {
                    // All elements processed: stop the timer.
                    clearInterval(intervalObj);
                    break;
                }
                index++;
            }
        }, interval);
    };
})( jQuery );


// Usage:
$(function() {
    var arr = [0,1,2,3,4,5,6,7,8,9];
    $.eachWait(arr, function(index, value) {
        // Sanity check: any mismatch aborts the whole run.
        if (arr[index] != index)
            throw "noooo";
        $('body').append($('<p/>').text(index));
    });
});

I'm not sure this is the correct way to do things, but it works… The additional parameters are interval (the time to wait between batches, in ms; see setInterval) and bulksize (the number of elements to process per batch).
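To get a feel for the trade-off between the two parameters, here is a plain-JavaScript sketch (no jQuery; the function name and numbers are mine, not from the plugin):

```javascript
// Rough sketch of the interval/bulksize trade-off (made-up numbers).
// Each timer tick processes `bulksize` elements, so a run over n elements
// needs about Math.ceil(n / bulksize) ticks, each at least `interval` ms apart.
function minimumDuration(n, interval, bulksize) {
    var ticks = Math.ceil(n / bulksize);
    return ticks * interval; // lower bound in ms, ignoring callback time
}

var ms = minimumDuration(10000, 10, 100); // 100 ticks * 10 ms = 1000 ms
```

A larger bulksize finishes sooner but freezes the UI for longer on each tick; a smaller one keeps the page responsive at the cost of total runtime.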

Now i need some coffee :-)
