Browsers keep eating memory with AJAX + setInterval

Posted 2019-03-09 22:27

I need to update a lot of data at a given interval with JavaScript. The problem is that, no matter which JS library I use (even bare-bones JS), all browsers seem to allocate memory on every AJAX request and are unable to free it afterwards. Here is a sample snippet that should reproduce the error:

    <!DOCTYPE html>
    <html lang="en">
        <head>
            <title>Memleak Test</title>
            <meta charset="utf-8" />
            <script type="text/javascript" src="https://ajax.googleapis.com/ajax/libs/jquery/1.5.2/jquery.min.js"></script>
            <script type="text/javascript">

                function readData() {
                    $.getJSON('data.php');
                }

                $(document).ready(function() {
                    setInterval(readData, 1000);
                });
            </script>
        </head>
        <body>
            <div id="content"></div>
        </body>
    </html>

An equivalent test page is available at jsbin

Here is more info on this:

  • I also tried to put the readData() function as a closure directly in the setInterval() call. This doesn't seem to make any difference.
  • I use jQuery here, but any other library would produce the same errors.
  • My data.php script just produces a fake JSON-Object with json_encode() in PHP.
  • I know that one second is a short interval; in my production script the interval is 30 seconds. I just wanted to see the effect quicker (in the production app it takes hours, but eventually the memory fills up there too).
  • The problem here is that the app will be open 24/7.

It seems so simple that I think I'm doing something really wrong here. It would be great if one of the JS gurus in here could help me out!

7 Answers
劳资没心,怎么记你
#2 · 2019-03-09 23:12

One possible problem with the code posted is that if the XHR requests take longer (on average) than the poll period, there will be an ever-growing queue of pending requests. If the web server itself starts to backlog requests, this can turn into a vicious cycle.

To avoid this, use CPS-style coding where the next action is triggered from the appropriate callback. That is, don't start the next request until it is actually required (i.e. the previous request has completed, with success or failure). This approach can still be used to build a manual request queue of controllable size if multiple outstanding requests are required.
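A minimal sketch of this chained-polling pattern: instead of firing on a fixed `setInterval`, the next poll is armed only from the completion callback of the previous one, so requests can never stack up. The names `startPolling` and `fakeRequest` are placeholders here, not part of the original code; in the question's page, `request` would wrap the real AJAX call (e.g. `$.getJSON('data.php')`).

```javascript
// Chained polling: the next request is scheduled only after the
// previous one has completed (success or failure).
function startPolling(request, intervalMs, maxPolls, onDone) {
  var results = [];
  function tick() {
    request(function (err, data) {      // previous request has completed
      results.push(err ? null : data);
      if (results.length >= maxPolls) { // bounded here only for demonstration
        onDone(results);
        return;
      }
      setTimeout(tick, intervalMs);     // only now arm the next poll
    });
  }
  tick();
}

// Stand-in for the real AJAX call: answers after a short delay.
function fakeRequest(callback) {
  setTimeout(function () { callback(null, { ok: true }); }, 5);
}

startPolling(fakeRequest, 10, 3, function (results) {
  console.log(results.length); // 3 completed polls, never overlapping
});
```

With this structure a slow server simply stretches the poll period instead of queueing up more and more in-flight requests.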

Also, make sure that unused objects are eligible for reclamation, as this is the "standard" cause of a "memory leak" in a GC language.
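For example (a hypothetical illustration, not code from the original post): accumulating every response in a long-lived collection keeps them all reachable forever, while overwriting a single reference lets each old response become garbage.

```javascript
// Hypothetical illustration of retained vs. reclaimable objects.

var history = [];            // every response stays reachable: memory grows
function onDataLeaky(data) {
  history.push(data);        // nothing here is ever eligible for GC
}

var latest = null;           // only the newest response stays reachable
function onDataFine(data) {
  latest = data;             // the previously held object becomes garbage
}
```

In a 24/7 polling page like the one in the question, even a small per-request retention like this adds up over hours.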

Happy coding.


The code in the post contains nothing that will inherently leak memory. It could possibly be an internal issue with jQuery, but that is just speculation. Additionally, tools like Firebug that monitor XHR/web requests can consume significant amounts of memory themselves, so check that the observed behavior isn't a Heisenbug.


Also, remember that increasing memory usage doesn't indicate a memory leak unless it grows unbounded. A garbage-collection cycle will only occur when the host feels like it.
