
Comment by tipiirai

3 days ago

This exact demo will crash with vanilla JavaScript (in Chrome 134.0). The same thing in React would also crash, unless the computation relies on WASM.

Make a demo with react-virtualized[0] and see if it crashes. Hint: it will not[1]. React can easily render 1 million rows with high performance without relying on WASM[2].

Here is the demo of react-virtualized[3], in which I entered 10m as the row count and scrolled to the bottom without crashing.
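
Roughly what that demo boils down to, assuming react-virtualized's standard List API (the 400x600 viewport and 30px row height below are arbitrary illustration values, not taken from the linked demo):

    import React from 'react';
    import { List } from 'react-virtualized';
    import 'react-virtualized/styles.css';

    const ROW_COUNT = 1000000;

    // Only the rows currently in view are rendered; `style` carries the
    // absolute positioning computed by the List component.
    function rowRenderer({ index, key, style }) {
        return (
            <div key={key} style={style}>
                Row #{index + 1}
            </div>
        );
    }

    export default function MillionRowList() {
        return (
            <List
                width={400}
                height={600}
                rowCount={ROW_COUNT}
                rowHeight={30}
                rowRenderer={rowRenderer}
            />
        );
    }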

[0] https://github.com/bvaughn/react-virtualized

[1] https://www.youtube.com/watch?v=1JoEuJQIJbs

[2] https://medium.com/@priyankadaida/how-to-render-a-million-ro...

[3] https://bvaughn.github.io/react-virtualized/#/components/Lis...

Update: Here I made a table with 1 million rows with search, filtering, and pagination, in plain JavaScript:

https://htmlpreview.github.io/?https://gist.githubuserconten...

Could you give a code example? Also, by crash, do you mean the mentioned stack overflow error?

If so, why would the stack be involved when we're talking about element count?

  • Because he constructs a giant JSON blob by joining individual entries. Rendering that directly into the DOM will always cause performance issues (even at 10k entries). That's why you need a virtualized list; it can be done in plain JS or with libraries like react-virtualized.

    This works in plain JS with 150k rows:

        <style>
            #viewport {
                height: 600px;
                overflow-y: scroll;
                position: relative;
                border: 1px solid #ccc;
                width: 400px;
                margin: auto;
            }
    
            .item {
                position: absolute;
                left: 0;
                right: 0;
                height: 30px;
                padding: 5px;
                box-sizing: border-box;
                border-bottom: 1px solid #eee;
                font-family: Arial, sans-serif;
            }
        </style>
    
        <div id="viewport">
            <div id="content"></div>
        </div>
    
    
        <script>
            const viewport = document.getElementById('viewport');
            const content = document.getElementById('content');
            const itemHeight = 30;       // fixed row height in px
            const totalItems = 150000;

            // All 150k records live in memory; only the visible slice is rendered.
            const items = Array.from({length: totalItems}, (_, i) => ({
                id: i + 1,
                name: `User #${i + 1}`
            }));

            // Size the inner container so the scrollbar reflects the full list.
            content.style.height = `${totalItems * itemHeight}px`;

            function render() {
                const scrollTop = viewport.scrollTop;
                const viewportHeight = viewport.clientHeight;
                // Window of rows intersecting the viewport, plus a 10-row overscan.
                const start = Math.floor(scrollTop / itemHeight);
                const end = Math.min(totalItems, start + Math.ceil(viewportHeight / itemHeight) + 10);

                // Drop the previous window and rebuild only the visible rows.
                content.innerHTML = '';

                for (let i = start; i < end; i++) {
                    const div = document.createElement('div');
                    div.className = 'item';
                    div.style.top = `${i * itemHeight}px`;
                    div.textContent = items[i].name;
                    content.appendChild(div);
                }
            }

            viewport.addEventListener('scroll', render);
            render();
        </script>

  • The exact error is "Maximum call stack size exceeded" when the WASM engine is replaced with this JS engine:

    https://github.com/nuejs/nue/blob/master/packages/examples/s...

    There is currently no demo of the crash, but you can set this up locally.

    • `events.push(...arr)` spreads every element of `arr` onto the call stack as a separate argument before the method is even called, which is what overflows the stack. Don't push tens of thousands of items in a single spread call; see the sketch below.
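
      For illustration, a minimal sketch of the failure and a chunked workaround (the element count and chunk size are arbitrary; `events` and `arr` are just the names used above):

          const events = [];
          const arr = Array.from({ length: 200000 }, (_, i) => i);

          // Each spread element becomes a separate argument to push(), so a
          // large enough array can throw "Maximum call stack size exceeded":
          // events.push(...arr);

          // Safer: append in bounded chunks so the argument list stays small.
          const CHUNK = 10000;
          for (let i = 0; i < arr.length; i += CHUNK) {
              events.push(...arr.slice(i, i + CHUNK));
          }

          // Or avoid spreading entirely:
          // for (const e of arr) events.push(e);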


    • You're solving a problem nobody has. If you encounter this problem, you shouldn't think "ah, let's yeet the JS engine because it clearly isn't good enough for my awesome SPA"; you should think "hm, maybe I shouldn't render 10000000000 records in the DOM".

      What's next? "Oh I have a memory leak, let's get a subscription on RAM modules and just keep adding them!"

No. Back when we supported IE 9, we had tables with a million rows and dozens of columns, and they ran fine.