Comment by wereHamster

17 days ago

CDN wouldn't help much. These days browsers partition caches by origin, so if two different tools (running on different domains) fetch the same model from the CDN, the browser would download it twice.
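The partitioning described above can be sketched as a cache keyed by (top-level site, URL) rather than by URL alone — a simplified Python model, with all names illustrative:

```python
# Sketch of double-keyed (partitioned) HTTP caching: the cache key
# includes the top-level site, so the same CDN URL fetched from two
# different sites is stored (and downloaded) twice.
cache = {}

def fetch(top_level_site: str, url: str) -> str:
    key = (top_level_site, url)  # partitioned: the embedding site is part of the key
    if key not in cache:
        cache[key] = f"downloaded {url}"  # simulate a network fetch
    return cache[key]

fetch("toolA.example", "https://cdn.example/model.bin")
fetch("toolB.example", "https://cdn.example/model.bin")  # miss: downloaded again
```

With a single-keyed cache the second call would be a hit; under partitioning both sites pay the full download.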

Did not know that. That sounds extraordinarily wasteful; there must be a file-hash-based method that would allow sharing such files between domains.
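A hash-based scheme does exist for integrity checking (Subresource Integrity), though browsers use it only to verify a file, not to share caches across sites. A minimal sketch of computing an SRI-style digest, assuming SHA-384:

```python
import base64
import hashlib

def sri_hash(data: bytes) -> str:
    """Return a Subresource-Integrity-style value, e.g. 'sha384-...'."""
    digest = hashlib.sha384(data).digest()
    return "sha384-" + base64.b64encode(digest).decode("ascii")

print(sri_hash(b"console.log('hello');"))
```

The resulting value would go in a script tag's `integrity` attribute; the same content always yields the same hash, which is what makes the cross-domain-sharing idea tempting in the first place.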

  • It offers security.

    Just like you wouldn't use the same table for all users in a multi-tenant application.

    • If the file is hashed strongly enough, then it can be no other file. I can see how information about previously visited sites could be leaked, and how that could be bad, but I think whitelisting by end users could still allow some files to be shared. E.g. the code for React.

  • It's a security feature. Otherwise my malicious site could check for cdn.sensitivephotoswebsite.com and blackmail you if it was already cached.

    • It would be nice if there were a whitelist option for non-sensitive content. I stopped using CDN links due to the overhead of the extra domain lookups, but I did think that my self-hosted content would be cached across domains.

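The leak described above follows directly from a shared, URL-keyed cache: any site can test whether a resource is present. A simplified sketch (in reality the attacker infers this from load timing, not direct membership; all names are illustrative):

```python
# Sketch of the history leak with a shared (non-partitioned) cache
# keyed by URL alone.
shared_cache = set()

def visit(url: str) -> None:
    shared_cache.add(url)  # resource is cached after a genuine visit

def malicious_probe(url: str) -> bool:
    # A real attacker would time a fetch: a fast load implies a cache
    # hit, and therefore a prior visit. Here we check membership directly.
    return url in shared_cache

visit("https://cdn.sensitivephotoswebsite.com/app.js")
malicious_probe("https://cdn.sensitivephotoswebsite.com/app.js")  # True
```

Partitioning the cache by top-level site defeats the probe: the attacker's partition never contains entries created by visits to other sites.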