Comment by riehwvfbk
2 months ago
It did make all those requests, but only because the author set up caching incorrectly. If the cache headers were to be corrected, site.xsl, pages.xml, and posts.xml would only need to be downloaded once.
The cache headers are correct; you can't cache those indefinitely because they might change. Maybe you could get away with a short cache time, but you can't cache them forever the way you can a JavaScript bundle.
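The distinction comes down to whether the URL changes when the content does. A rough sketch of the two header strategies (paths and values illustrative, not from the site under discussion):

```
# Fingerprinted bundle: the filename encodes a content hash, so the URL
# itself changes on every edit and the old copy can be cached forever.
GET /assets/bundle.3f9a1c.js
Cache-Control: public, max-age=31536000, immutable

# site.xsl / pages.xml / posts.xml keep the same URL across edits,
# so only a short TTL is safe.
GET /posts.xml
Cache-Control: public, max-age=300
```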
Not to mention that on a more involved site, each page will probably include a variety of components. You could end up with nesting deeper than four levels, and each page could pull in unique components, further increasing load times.
I don't see much future in an architecture that inherently waterfalls in the worst way.
There are cache times other than 0 and infinity. Ideally the XSLT would change rarely, as would things like nav menus, so "relatively short" could mean several minutes to an hour. And with ETags, an expired resource can be revalidated with a conditional request and, if unchanged, never has to be re-downloaded.
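The revalidation flow looks roughly like this (ETag value and TTL illustrative): the body only crosses the wire on the first fetch; after that, an unchanged resource costs a headers-only round trip.

```
# First fetch: server returns an ETag alongside the body
GET /site.xsl             -> 200 OK
                             ETag: "abc123"
                             Cache-Control: max-age=600

# After the TTL expires: conditional request revalidates the cached copy
GET /site.xsl
If-None-Match: "abc123"   -> 304 Not Modified (no body; cache entry refreshed)
```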
ETags still require a round trip. You could cache for longer, but then you have to deal with the complexities of cache invalidation.