javascript - Uninterruptable ajax request - Stack Overflow

Sometimes I have to AJAX a huge JSON payload (~20 MB). During this process the browser window appears to be constantly loading (latest Chrome, Windows 8.1). The user can click on things, like links, but the browser won't actually navigate to them until the AJAX request completes. This can be a problem: if you land on a page that requires such an AJAX call, it is difficult to navigate away. Unfortunately I cannot reduce the size of the payload much at this time (I am loading graphs, and sometimes the graphs have hundreds of thousands of nodes and edges).

Any idea why the browser won't navigate away (even though the browser is responsive)? And, if possible, any potential solutions? Thanks!

asked Dec 12, 2013 at 19:39 by tau
  • 1 Have you thought about breaking your JSON into pieces? You might also consider streaming it through something like BinaryJS and using one of those extensions that allows "streaming JSON" where you can access elements of a resolved JSON object while it continues to download. – Brad Commented Dec 12, 2013 at 19:45
  • 2 Are you using synchronous ajax? If so, stop doing that. – Pointy Commented Dec 12, 2013 at 19:47
  • 1 Consider: * Chunking the JSON * Opening a separate page for upload when data is larger than X * Upload as a more size-efficient format? – Kroltan Commented Dec 12, 2013 at 19:49
  • Yes, I have considered those things. I'm more curious to know why it happens. When AJAX'ing such a big file locally, there is a very small slowness when the request is complete. Over a network, that slowness seems to be elongated, which leads me to believe the browser is dealing with the payload over multiple "requests", each of which has more significant overhead than on my local machine. Just an idea. – tau Commented Dec 12, 2013 at 21:15
  • Maybe you can opt to try and gzip the returned JSON, if possible. Or any compression system for returning JSON. – Jekk Commented May 19, 2014 at 5:29

5 Answers

I would suggest using an HTML5 Web Worker. Browser support is not great, especially in IE.

Web workers are non-blocking, meaning you can run two scripts concurrently. This should free up your DOM while you are making the AJAX call in the background.

Here is one article I could find on using web workers with an AJAX call.
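As a minimal sketch of that idea (the function name `startGraphWorker` and the endpoint are illustrative, not from the article): the worker downloads and parses the JSON off the main thread, and only the finished object is handed back.

```javascript
// Build the worker from an inline Blob so no separate file is needed.
// Inside the worker, both the download and the (expensive) JSON.parse
// of the ~20 MB payload run off the main thread.
var workerSource = [
  "self.onmessage = function (e) {",
  "  var xhr = new XMLHttpRequest();",
  "  xhr.open('GET', e.data.url);",
  "  xhr.onload = function () {",
  "    self.postMessage(JSON.parse(xhr.responseText));",
  "  };",
  "  xhr.send();",
  "};"
].join("\n");

// startGraphWorker is an illustrative name; onDone receives the
// parsed graph once the worker posts it back.
function startGraphWorker(url, onDone) {
  var blob = new Blob([workerSource], { type: "application/javascript" });
  var worker = new Worker(URL.createObjectURL(blob));
  worker.onmessage = function (e) { onDone(e.data); };
  worker.postMessage({ url: url });
  return worker;
}
```

Usage would be something like `startGraphWorker("/yourEndpoint", renderGraph)`; the main thread stays free to handle clicks and navigation while the worker runs. Note that handing the parsed object back still incurs a structured-clone cost, but it avoids blocking during download and parse.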

I don't know how to address the exact technical problem you are facing, but this may still be useful. Consider Google or Facebook, where you can download your full profile as an archive with all your history and photos. Take note of how they do it (not through AJAX calls). I believe they chose this route because of the very same problems you have. Just something to think about.

I faced a similar problem several months ago. I found a solution that uses setTimeout(), which creates a separate event in the browser event queue. This is basically what I did:

jQuery(document).ready(function() {

  // Defer the request to a separate task in the browser event queue
  // so the page finishes its current work first.
  setTimeout(function () {

    $.ajax({
      url: "/longAjaxCall",
      dataType: "json"
    }).done(function(data) {
      // do whatever ...
    });

  }, 0);

});

Not sure whether or not you're using jQuery, but regardless the basic principle is the same.

I would initially send an MD5 sum and a chunk count to divide the data into chunks. Then I would write a JS script that slowly asks for them one by one, joins them, checks that the full payload was transferred correctly, and finally uses it.
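A sketch of that chunking idea, assuming the server supports HTTP Range requests and the payload's total size is known up front (all names here, like `CHUNK_SIZE` and `fetchInChunks`, are illustrative, not from the answer):

```javascript
var CHUNK_SIZE = 1024 * 1024; // 1 MB per request (illustrative)

// Pure helper: split a payload of `total` bytes into Range header
// values such as "bytes=0-999".
function chunkRanges(total, chunkSize) {
  var ranges = [];
  for (var start = 0; start < total; start += chunkSize) {
    var end = Math.min(start + chunkSize, total) - 1;
    ranges.push("bytes=" + start + "-" + end);
  }
  return ranges;
}

// Browser-side driver: request the chunks sequentially so no single
// request ties up a connection for the whole 20 MB, then join the
// pieces and hand the result on. The checksum verification described
// above would go in onDone, before JSON.parse.
function fetchInChunks(url, total, onDone) {
  var parts = [];
  var ranges = chunkRanges(total, CHUNK_SIZE);
  function next(i) {
    if (i === ranges.length) { onDone(parts.join("")); return; }
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url);
    xhr.setRequestHeader("Range", ranges[i]);
    xhr.onload = function () { parts.push(xhr.responseText); next(i + 1); };
    xhr.send();
  }
  next(0);
}
```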

The only two things I can think of are that either, as many have already said above, your AJAX call is not asynchronous, or you have used up all the parallel HTTP connections the browser allows for that host. The AJAX call blocks one of them, so depending on what other interactions you allow the user, you may simply be stuck downloading images and scripts in the background, effectively preventing the user from moving forward until one of them completes.

20 MB is indeed a big chunk, so I also agree with the others that, if possible, you should explore alternative ways of achieving what you need.
