Is there any limit on the number of HTML elements that a page can hold?

Posted 2020-08-13 11:53

Question:

Basically I've got a huge table which gets even bigger as the user scrolls down (subsequent rows are preloaded automatically). At some point the browser becomes sluggish: it hangs for a moment when I click around or try to scroll, and the more rows it gets, the more sluggish it becomes. I wonder if there is any limit on the number of elements a page can hold? Or maybe it's just my JavaScript leaking somewhere (although I've got only one event handler, attached to the tbody of the table, plus a script that handles the bubbled mousedown events).

Update: The delay becomes noticeable after about a thousand loaded rows. Scrolling itself is quite bearable, but, for example, highlighting the clicked row (via the single event handler on the tbody) is painful: it takes at least 2-3 seconds, and the delay grows with the number of rows. I observe the delay in all browsers. It's not only me; almost everyone who visits the page sees it, so I guess it affects every platform to some extent.

Update: I put together a simple example here: http://client.infinity-8.me/table.php?num=1000 (you can pass whatever number you want as num). It renders a table with num rows and attaches a single event handler to the parent table. From this I have to conclude that there is actually no noticeable drop in performance caused by the number of child elements, so it's probably a leak somewhere else :(
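For reference, the delegated-handler setup described in the question boils down to one listener on the tbody that walks up from the clicked node to its enclosing row. A minimal sketch (the function name `findRow` is illustrative, not from the original page); note that the cost of handling one click depends on tree depth, not on the number of rows, which is why delegation itself should not get slower as the table grows:

```javascript
// Walk up the ancestor chain from the clicked node until we find a TR
// that is a direct child of the given tbody, or run out of ancestors.
// A single delegated listener can use this instead of one handler per row.
function findRow(node, tbody) {
  while (node && node !== tbody) {
    if (node.nodeName === "TR" && node.parentNode === tbody) {
      return node;
    }
    node = node.parentNode;
  }
  return null;
}
```

In a browser it would be wired up as `tbody.addEventListener("mousedown", e => { const row = findRow(e.target, tbody); /* highlight row */ });`. If the highlight itself is slow, the suspect is usually the style work done on the row (or on all rows), not the lookup.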

Answer 1:

I don't think there is a limit defined by the standard. There might be a limit hard-coded in each browser implementation, though I would imagine that this limit is likely to be billions of elements. Another limit is the amount of addressable memory.

To solve your problem: As well as automatically loading elements as you scroll down, you could automatically unload the ones that have scrolled up off the screen. Then your program will remain fast even after scrolling a lot.
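This load-and-unload approach is often called windowing or virtualization: only the rows near the viewport exist in the DOM at any time. Assuming fixed-height rows, the index arithmetic is simple; a rough sketch (the function name and `overscan` parameter are illustrative):

```javascript
// Compute which row indices should currently be in the DOM, given the
// scroll position. Rows outside this window can be detached; rows
// entering it are created on demand. `overscan` keeps a few extra rows
// rendered above and below the viewport to avoid flicker while scrolling.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows, overscan) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const last = Math.min(
    totalRows - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + overscan
  );
  return { first, last };
}
```

On each scroll event you recompute the range and reconcile the DOM against it; a tall spacer element (or padding) stands in for the detached rows so the scrollbar still reflects the full dataset.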

You may also want to consider an alternative interface such as paging.



Answer 2:

Another thing you should look at is table sizing. If your table uses the default automatic layout algorithm (table-layout: auto), the browser has to measure the content of every single cell to size the columns. This can get insanely slow.

Instead, give the table a fixed width, or at least style it with table-layout: fixed (and specify column widths), so the browser can lay out the columns without measuring every cell.



Answer 3:

If you have JS attached to each table row, old computers will not handle that well. The HTML itself you shouldn't worry much about.

You should worry about the fact that a normal human being doesn't like huge tables; that is what pagination was made for. Split it up with paging, for usability if nothing else.

Think of a book that has no pages, just one enormous page: would you want to read it, even if your eyes (the PC, in our case) could handle it?



Answer 4:

I don't think there is a limit. However, the longer an HTML file is, the more resources your computer will need. But the table has to be very large for that to matter...



Answer 5:

The limit is really determined by the user agent and the client machine. HTML, like XML, is a tree of nodes, so the more elements there are, the more of the tree the browser has to traverse to render the page.

I had issues adding more than 100 tables to a div (as an old workaround to IE6 not being able to create table elements dynamically).



Answer 6:

What browser are you using? Some browsers handle things better than others. For instance, if you're using IE, I wouldn't be surprised if it's sluggish; it doesn't handle JavaScript events as well as WebKit-based browsers.

The other thing is obviously your computer (RAM and CPU). But to be honest, most computers shouldn't have a problem with that unless we're talking 10000+ rows... and even then...

Can you post some code?



Answer 7:

Given that there is a multitude of browsers and rendering engines, all with different performance characteristics, and given that those engines keep improving and hardware keeps getting faster: no, there is no fixed upper limit on what a browser can handle. There are, however, practical upper limits for specific versions of specific browsers on specific hardware.

Without knowing your hardware and browser, it's hard to help you. Also, no one can say anything about the performance of your JavaScript if you don't post the code. Even a single event handler can considerably slow down rendering if it loops indefinitely or is invoked for every element.



Answer 8:

Nevermind RAM or CPU usage, what's the actual size of the file after it preloads?

If your table is really that huge, you could be forcing your users to download megabytes of data - I've found that some machines tend to hang after 2-4MB of data.



Answer 9:

Depends. IE, for example, will not start rendering a table until all of its content is loaded. So a 5,000-row table needs all 5,000 rows of data before any of it is rendered, whereas other browsers start rendering once they have partial data and just adjust a bit (if needed) as the table grows.

Generally speaking, rendering slows with the quantity of nodes but also with their complexity. Avoid nested tables and, if at all possible, try to break up huge tables into chunks, e.g. every 100 rows.
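The chunking idea above amounts to slicing the row data before rendering, so that each chunk becomes its own small table the browser can lay out independently. A minimal sketch (the helper name `chunkRows` is made up for illustration):

```javascript
// Split an array of row data into fixed-size chunks, one per sub-table.
// Re-rendering or appending one 100-row table is far cheaper than
// re-measuring a single table with thousands of rows.
function chunkRows(rows, size) {
  const chunks = [];
  for (let i = 0; i < rows.length; i += size) {
    chunks.push(rows.slice(i, i + size));
  }
  return chunks;
}
```

Each chunk would then be rendered as a separate table element; with identical fixed column widths on every chunk, they read as one continuous table.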



Answer 10:

There really isn't a reason in the world to publish an entire huge dataset on a single page. If the requirement is to give the user all that data, you should export it to a file that can be read by software better suited to it than a browser.

Instead, I suggest you make an AJAX-driven page, where the user sees a portion of the data and, when they need more, you download just that portion and replace the current dataset on the page. This is pagination; Google search is an excellent example of it.
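On either end of such an AJAX setup, pagination is just an offset/limit over the dataset. A rough sketch of the slicing logic, assuming 1-based page numbers (the function name and result shape are illustrative):

```javascript
// Return the items for one page, plus enough metadata for the client
// to render paging controls. Out-of-range page numbers are clamped.
function getPage(items, page, pageSize) {
  const totalPages = Math.max(1, Math.ceil(items.length / pageSize));
  const p = Math.min(Math.max(1, page), totalPages);
  const start = (p - 1) * pageSize;
  return {
    page: p,
    totalPages,
    items: items.slice(start, start + pageSize),
  };
}
```

The client then fetches page N and replaces the table body with only those rows, so the DOM never holds more than one page's worth of elements.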



Answer 11:

If there are any limits, they depend on the browser. But your problem is not about a limit, since the browser still displays the page.

Big tables are always a problem for browsers. Rendering a large table takes a lot of time, so it is often a good idea to split a large table into smaller ones.

Further, you probably want to specify the column widths. Without them, the browser has to download the whole table before it can calculate the width of each column and render it. If you specify the widths in the HTML code, the browser can display the page while it is still downloading. (Note: specifying the width of the whole table is not enough; you need to specify the width of each column.)

If you add a single row to a big table using JavaScript, the browser most likely has to lay out the whole table again. This is why it becomes so slow. With smaller tables, only the one small table needs to be laid out again. Better still, load one sub-table at a time instead of one row at a time.

But the most effective method is to split the data into multiple pages. The users probably prefer that, too. That is why for example Google displays only so many results on each page.