You should load the results in smaller pieces. In pseudo code it would look something like this:
loadDataUsingAjax(url, index) {
    asynchronously load items index to index + 500 with an ajax call;
    if there are still more items
        a few milliseconds later, call loadDataUsingAjax(url, index + 500);
}
loadDataUsingAjax(url, 0);
Otherwise, most browsers, especially on slower computers, freeze for several seconds while they try to update the DOM.
UPDATE: actual jQuery code
var CHUNK_SIZE = 500;
var DELAY = 100;

function loadDataUsingAjax(ajaxUrl, index) {
    $.ajax({
        url: ajaxUrl,
        data: { startIndex: index, chunkSize: CHUNK_SIZE },
        dataType: 'json',
        success: function(response) {
            // append the items in this chunk to the DOM here
            if (response.hasMoreResults) {
                setTimeout(function() {
                    loadDataUsingAjax(ajaxUrl, index + CHUNK_SIZE);
                }, DELAY);
            }
        }
    });
}

loadDataUsingAjax("yourUrl", 0);
Your server-side script should do something like this:

startIndex = value of the startIndex request parameter;
chunkSize = value of the chunkSize request parameter;
select ... from ... where ... limit startIndex, chunkSize;
create a JSON result from the MySQL result set;
select count(...) from ... where ...;
if count(...) > startIndex + chunkSize then set hasMoreResults = true;
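The pagination arithmetic here is easy to get wrong: MySQL's LIMIT clause takes an offset and a row count, not an end index. A minimal sketch of the two computations, with helper names that are mine rather than from the original:

```javascript
// Hypothetical helpers mirroring the server-side pseudo code above.
function buildLimitClause(startIndex, chunkSize) {
    // LIMIT offset, count -- the second value is the chunk size itself,
    // NOT startIndex + chunkSize.
    return 'LIMIT ' + startIndex + ', ' + chunkSize;
}

function hasMoreResults(totalCount, startIndex, chunkSize) {
    // More rows remain if the total exceeds the end of this chunk.
    return totalCount > startIndex + chunkSize;
}
```

For example, the third chunk of 500 rows would use `LIMIT 1000, 500`, and with a total of 1200 rows `hasMoreResults(1200, 1000, 500)` is false, so the client stops requesting.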