Comment by yason
7 years ago
Everything that is running keeps using the old libraries. The directory entries for the shared libraries or executables are removed, but as long as a task holds a live file descriptor, the actual shared library or executable is not deleted from the disk. New processes will have the dynamic linker pick up the updated libraries. Unless the ABI or API somehow changes during the update (and it generally doesn't -- incompatible updates bump the library version), things work pretty fine.
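A minimal sketch of this, with invented file names: an already-open descriptor keeps the old inode alive even after the path is replaced, which is why running processes keep working on the old library while new opens see the new one.

```python
import os, tempfile

# Hypothetical demo (file names invented): an open descriptor pins the
# old inode even after the path is replaced, mirroring how running
# processes keep using old shared libraries across a package update.
path = os.path.join(tempfile.mkdtemp(), "libdemo.so.1")
with open(path, "w") as f:
    f.write("old version")

reader = open(path)            # a "running process" holding the library open

# The "package manager" installs the update: write the new file under a
# temporary name, then rename over the old path (atomic on POSIX).
tmp = path + ".new"
with open(tmp, "w") as f:
    f.write("new version")
os.replace(tmp, path)

old_view = reader.read()       # the descriptor still reads the old inode
new_view = open(path).read()   # a fresh path lookup sees the update
print(old_view)                # -> old version
print(new_view)                # -> new version
reader.close()
```

The same mechanism applies whether the old inode is held by a plain descriptor or by the dynamic linker's mappings; the disk blocks are only freed once the last reference goes away.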
Do they work fine though?
1. On the one hand I see folks accessing files over and over by paths/names, and on the other hand they demand features that would only work if they switched their fundamental approach to handles/descriptors. Which is it? You can't claim descriptors would fix a problem while simultaneously insisting that path-based approaches are perfectly fine. Most programs use paths to access everything (and this goes beyond shared libraries) and assume files won't have changed in between. You can blame the program for not using fds if that makes you feel better, but the question is: how do you magically fix this for the end user?
2. Do you actually see this working smoothly on a Linux desktop environment in practice, or do you just mean it's possible in a theoretical sense? Do you never get errors or crashes after an apt-get upgrade that happened to upgrade a package your desktop environment depends on (say GTK or whatever)? That happens to me frequently (and I'm practically guaranteed to see a problem if I open a new window in some program in the middle of an update), and it makes me worry about what might be getting corrupted along the way -- I'd rather it rebooted than kept crashing and throwing errors at me.
1. In general, updating the same files at the same time is not that common a problem in any practical sense. The user generally won't be editing the same document in two different editors at once. Programs use flock(2) or something similar if they have to update a shared file, or they use a directory structure that lets different instances of the program update simultaneously via many small files instead of mutually exclusive access to a single file.
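The flock(2) pattern can be sketched like this (file name invented; Python's fcntl.flock wraps flock(2) on Linux/BSD). Two opens of the same file create independent open file descriptions, so a second exclusive lock attempt is refused while the first is held:

```python
import fcntl, os, tempfile

# Minimal sketch of flock(2)-style cooperation around a shared file.
path = os.path.join(tempfile.mkdtemp(), "shared.state")
open(path, "w").close()

first = open(path, "r+")
second = open(path, "r+")

fcntl.flock(first, fcntl.LOCK_EX)        # first updater takes the lock
try:
    fcntl.flock(second, fcntl.LOCK_EX | fcntl.LOCK_NB)
    contended = False
except BlockingIOError:
    contended = True                     # expected: the file is busy

fcntl.flock(first, fcntl.LOCK_UN)        # first updater finishes
fcntl.flock(second, fcntl.LOCK_EX | fcntl.LOCK_NB)  # now succeeds
print("lock was contended:", contended)
```

Note these are advisory locks: they only work if every updater agrees to take them, which is exactly the "cooperating programs" case described above.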
I think the most common real-life problem is editing a shell script while it's still running: this happens often during development if the script takes a while to run. You edit the file and hit save before the previous run has finished. The on-disk data changes under the running shell, which reads the script incrementally from the same inode, and eventually the changed or shifted bytes break the shell's parser.
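The failure mode can be reproduced in miniature (file name invented): the "interpreter" reads by file offset, so an editor that saves in place -- same inode, new bytes -- shifts what the reader sees mid-run.

```python
import os, tempfile

# Sketch of editing a script while it's running: an unbuffered reader
# stands in for a shell parsing the script incrementally.
path = os.path.join(tempfile.mkdtemp(), "job.sh")
with open(path, "w") as f:
    f.write("step one\nstep two\n")

interp = open(path, "rb", buffering=0)   # incremental, offset-based reader
first_line = interp.readline()           # consumes b"step one\n"; offset is now 9

# Editor saves in place: truncates and rewrites the same inode.
with open(path, "w") as f:
    f.write("totally different text\n")

next_line = interp.readline()            # resumes at offset 9 of the NEW bytes
print(first_line)                        # -> b'step one\n'
print(next_line)                         # -> b'ifferent text\n' (mid-word garbage)
```

An editor that saves by writing a temp file and renaming it over the original (the atomic-replace pattern) avoids this: the running reader stays on the old inode until it finishes.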
2. I have 106 days of uptime on my laptop. It has gone through several apt upgrades, and I don't think I've restarted my X11 session in those 106 days either. Firefox sometimes restarts itself after an update because it apparently knows it needs to, but other than that I generally never restart X or reboot the machine because of updates. This has basically been the case for years, even decades. The scheme probably has to break eventually, but I generally bump into other stuff such as important kernel updates before that. Fair enough for me; I've never really run into any issues because of it.
1. User opens a document. User moves a higher-level directory not realizing it was an ancestor of that file. Then user goes back to the program and it can no longer find the file because it was using file paths. What should happen? Should the OS play any role?
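The scenario above can be sketched concretely (directory and file names invented): the program's saved path no longer resolves after the move, while an already-open descriptor still reaches the file.

```python
import os, tempfile

# Sketch: program opens a document, user then moves an ancestor
# directory out from under the saved path.
root = tempfile.mkdtemp()
os.mkdir(os.path.join(root, "docs"))
path = os.path.join(root, "docs", "report.txt")
with open(path, "w") as f:
    f.write("contents")

doc = open(path)                          # program holds the file open
os.rename(os.path.join(root, "docs"),
          os.path.join(root, "archive"))  # user moves the directory

still_readable = doc.read()               # descriptor still works
try:
    open(path)
    path_ok = True
except FileNotFoundError:
    path_ok = False                       # path-based reopen breaks
print(still_readable, path_ok)
```

So the data was never lost -- only the name. The question of what the OS or program should do about the stale path (track the inode, ask the user, fail) is exactly what's being debated here.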
2. You manage to keep X11 open, but that's hardly the point I was making. Do you also keep everything open and use your GUI programs as normal when going through an upgrade, or do you change your behavior in some way to avoid it messing up what you're running? And/or are you selective about which updates you apply to minimize their effects on what you're running?
Furthermore, are you familiar at all with the kinds of errors I referred to? Or have you never seen them in your life and don't know what I'm even talking about? If you don't think I'm just making things up when I say updates frequently cause me to get crash and error messages ("Report to Canonical?" etc.), then in your mind, why does this happen right when I update? Is it just some random cosmic bit flip or disk corruption on my new computer that shows up exactly when I update? Is it not possible that the update changed files out from under running programs?
1. Reopening a file pretty much always marks a point where it's safe for the contents to change.
2. I don't think I've ever had a problem caused by continued use between update and reboot.
In answer to 2, not the GP but I've not experienced problems doing that. Maybe I'm just lucky, though.