Citation: "When KFR is being used, the October 2018 Update will delete the original, default Known Folder locations. Microsoft imagined that this would simply remove some empty, redundant directories from your user profile. No need to have a Documents directory in your profile if you're using a redirected location, after all."
Sorry for harsh language, but this is not a bug. This is a complete brain damage of those who decide to implement such behaviour.
I have redirected Documents. And there are A LOT of programs that directly try to use C:\Users\user\Documents instead of redirected.
Sorry for harsh language, but this is not a bug. This is a complete brain damage of those who decide to implement such behaviour.
That's not "harsh", harsh is what happened to the unlucky users who lost data because no one on the development team bothered to call it out. It should be an implicitly understood rule that you NEVER remove a file you did not create, unless the user explicitly asked to, but I guess MS considers it acceptable to do anything to a user's system after they convinced everyone to take forced updates as being acceptable too.
> but I guess MS considers it acceptable to do anything to a user's system after they convinced everyone to take forced updates as being acceptable too.
This. Once you get in the mindset that you know better than the user, it's not a big jump to "I know these directories should be empty, any content is leftover garbage, let's remove them".
Absolutely this, with a followup of "and if you're 'helpfully' deleting an unused folder, check that it's unused first!"
Given that this was (stupid) desired behavior, making sure Documents didn't have stuff in it should have been a screamingly obvious step. It would still have been utterly unacceptable, it could still have created weird downstream bugs when users installed things that target the now-missing default Documents location, but at least it wouldn't have set a bunch of data on fire without any warning.
But then, I guess relying on common sense after the first terrible decision is made is never going to work. There's a reason "never break user space" is rule 1 for Linux updates...
Microsoft doesn’t consider the OS to be the “user’s system”. They clearly see it as something they rent to the user a little bit each day. If the user wants to keep the system for a little longer, he/she has to make a special request to Microsoft that they hold off for a bit before retaking control of the OS.
Recently was reading the blog of the team upgrading conhost. On it they joke several times that the developers doing the work were not even born yet.
Well, this is the kind of thing that happens when your whole team is interns and the "senior" is 28.
Reading the associated bugs on github I also learned that there were lots of complaints about the new console, not on features, but breaking compatibility that is. Guess no one thought to start a new project, rather than changing a 30 year-old one that hadn't been touched in 20.
TL;DR: At least one codger is needed on teams doing this kind of work to give perspective.
Seems like common sense, but Microsoft has been doing this for a while. In Windows 7 (not sure about later releases) the OS runs a 'Desktop Cleanup' periodically that deletes shortcuts to network locations it can no longer connect to. God forbid your network drives don't map one day and Windows decides to nuke all your desktop shortcuts... this actually happened to a user I was doing support for and they were understandably livid
Yes, seriously. I would still think it's totally unjustifiable as a 'helpful' change, but at least it'd be causing problems like "hey, this install failed, how do I fix it?" rather than "where's all my stuff?"
That, and I'm just a bit shocked that "the folder should be empty, so delete it" didn't just naturally make people think "obviously I should check if that's true first". Even if it's not a total fix, the failure to add that is it's own layer of screwup.
Windows/NTFS has links, but they aren't used for this (and software support for them is ... iffy, which can be an issue with backup tools etc, for which they aren't just transparent). These "known folders" are roughly implemented as environmental variables containing paths to the configured folder.
This seems like a common sense thing that every intern would consider. Why the Microsoft development team didn't really raises some questions not only about their QA, but about their whole development process.
Tangential to this when are operating systems going to ship w/ CoW filesystems by default? Accidental deletion of critical directories has been a solved problem for years now. I take instant snapshots of my home directory on ZFS every hour, my boot environment is snapshotted before every update.
You could literally remove my entire root directory and at worst you've minorly inconvenienced me: I now have to boot from a USB drive and run `zfs rollback` to the most recent snapshot. I'd be impervious to most ransomware as well: my home directory is filled w/ garbage now? Good thing it can't encrypt my read only snapshots, even with root access. Oh and I can serialize those snapshots across the network at the block level before I nuke the infected machine.
Of course humanity will always invent a bigger idiot, so it's possible a virus could gain root & exec some dangerous ZFS commands, or it's possible some MSFT employee could pass the `-r` flag and recursively delete a dataset & its snapshots in their post-update script. Plus let's not forget that no filesystem in the world will stop you from just zero-filling the drive. Still it's easy for me to imagine a world where I'm completely impervious from ransomware: by way of only being able to delete critical snapshots when a hardware key is present or when my TPM is unlocked.
On the whole it seems to me CoW filesystems are a major step forward in safe-guarding users from accidental deletions. At a minimum they have other benefits (serialized incremental send streams, block level compression & dedup, etc.) that make backups easier to manage. -- Yet I still can't boot from ReFS on Windows.
That'd be great, if I didn't have to turn it off. System Restore is so slow as to be borderline unusable. (In my experience creating the restore point accounts for the majority of time Windows spends doing updates, on a true CoW filesystem this should be nearly instantaneous.) Furthermore if you have VSS enabled on a volume hosting an MSSQL database it creates entries in the backup log for every snapshot, which destroys the chain of differential backups. This makes it impossible to use SQL Server Agent's maintenance plans alongside VSS in small-scale deployments (e.g: a workstation.)
I cannot stress enough: NTFS is not a CoW filesystem, any attempt to add CoW features on top of it will be poorly performant and doomed to fail, because they actually have to copy the old data blocks when you overwrite something. ZFS, btrfs, et al. are not actually copying user-data, because they don't have a concept of "ovewriting a block," every block that is written is written to unallocated regions of the disk; blocks that are not referenced by a dataset or snapshot are then returned to the allocator for later use; at no point is the old data "copied", it's just "not returned to the allocator for reuse."
What btrfs or ZFS mean by "copying" is not the user's data blocks, it's copying filesystem metadata, portions of btrfs' extent tree, or ZFS' block pointer tree. There is a world of difference between volume shadow copy, and an actual CoW filesystem. -- Microsoft knows this, that's why they are working on ReFS. (Of course in typical MSFT fashion they've made a number of critical mistakes: user data checksums are optional, and volume management is not integrated so it can't even use those integrity streams to self-heal corruption. Also last I checked ReFS can't even be used as a bootable Windows volume. -- It's worth pointing out APFS also made the mistake of checksumming metadata but not user data; which in my opinion makes both of them unsuitable as next generation filesystems.)
Last time Microsoft decided to replace NTFS with something modern, they took about 15 years to give up. If they start now, we can expect them to give up roughly in 2033.
They tried radical, that's why it was slow and sadly painful. Now implementing CoW ideas is at MS reach easily and would bring commercial value easily, so it could go very differently.
ZFS has now been ported to Windows. With some testing and polishing, maybe it will be something they could adopt in the future (I know, wishful thinking, but the technology is there).
”The Volume Shadow Copy Service (VSS) keeps historical versions of files and folders on NTFS volumes by copying old, newly overwritten data to shadow copy via copy-on-write technique. The user may later request an earlier version to be recovered.”
Out of curiosity, have you got a link to a workflow doc/tutorial/guide that could instruct someone who's a big green around the ears with using a zfs backup like this?
For a great primer on ZFS (on Linux) in general there's this website[1] by Aaron Toponce that is very well laid out. Specifically see the section on snapshots & clones, as well as sending/receiving. Also don't forget that you can make a ZFS pool out of any block device, this includes normal files mounted as a loop device! So a great way to experiment w/ learning how to manage a ZFS pool is to just create a few sparse files, add them to a pool, and just start messing around with it. (For actual production pools though always put ZFS as close to the actual disks as possible.)
What's really cool about serialized snapshots is that once native encryption support lands in ZFSonLinux you'll be able to send encrypted volumes over the wire. The data blocks don't have to be decrypted to be serialized or checked for integrity, so you don't even need the decryption key to do incremental sends! You can send your backups to an untrusted remote pool that never has to have knowledge of the decryption key!
(You can also serialize snapshots to files, which is useful for say writing them to detachable media to sneakernet them. If you want to do incremental sends though the receiver does have to be an actual zpool.)
Some other helpful hints would be using something like `mbuffer`[2] on the sender & receiver. This is handy for spinning rust pools, so that your disks don't have to wait on the network before seeking to the next blocks to be sent.
Also ZFS lets you delegate permissions so you can manage filesystems as normal users, I'd recommend doing this so you don't have to login as root on the remote pool to receive the snapshots.[3] In my case I ran into this early on because I have `PermitRootLogin no` on all my boxes.
They had that 10 years ago and called it "system restore", but they couldn't figure out how to actually...make it work.
Besides, msft makes money by selling SaaS. There's no financial incentive to protecting consumer data since the EULA indemnifies the company for any damages caused by their products.
I would have been bitten by this bug. All my Known Folders are redirected to be in a separate hard drive so I can switch them to a new system more easily when I upgrade. It also stems from when SSD's were too expensive to store anything but your OS and your applications on.
A whole bunch of applications do the bad thing Microsoft is talking about, and hardcode the path to 'My Documents'. I have a ghost 'My Documents' folder that has mostly app settings and maybe some save files.
I guess I'm glad that I don't use any of those folders... I just dump everything into category folders at the root of my data partition, e.g. G:\Code, G;\Pictures, G:\Downloads, etc.
The entire Users directory ends up being such an unimaginable cesspool on a Windows machine that has seen any significant service time, with this, that, and everything else poking and prodding and leaving its detritus inside. My desktop is essentially a single-user machine, as most Windows laptops and desktops are, but even on servers where user accounts are actually used, I spend an undue amount of time fiddlefrigging to take ownership and permissions on files because some script I need is saved to a different account's desktop. Bailing out of the whole thing and running your own filesystem shouldn't be easier, but it is.
I suppose the developer who thought it was a good idea to delete a non empty directory was high at the time he implemented it. But, how this went pass through QA is entirely a mystery to me.
It's pretty simple. Because there isn't a QA. Microsoft laid them off and uses the "Insiders" as beta testers. But they reported this issue but Microsoft ignored it because it didn't get enough votes.
Wow... This is not acceptable at all. They should have a team to go through these and at least prioritise the tickets instead of just relying on upvote. I'm pretty sure Microsoft can afford that.
I’ve read this comment about MS getting rid of their QA in a lot of places. Is that something they actually really did or is it something that people say because their QA quality has dropped significantly (which as in Apple’s case could be due to the increased release cadence).
It's the Microsoft mindset that they rule the world and everything works the Microsoft way. Not only do other operating systems not exist (yes yes Linux subsystem for win 10 whatever), but every software vendor and user is assumed to use the system exactly the way it was intended. That means no app will have the path to the original folder hardcoded instead of querying it the official way, and no user will mnavigate to the old pre-redirect location manually for any reason. By that definition, the folder can only be empty, so it's safe to delete it, recursively, since deleting an empty folder recursively is no different from deleting it non-recursively.
Sounds more like a business/product owner request given to a developer who has been told one too many times they are too negative when being given new stories to work on.
YOU are their QA department if you're using a non-enterprise version of Windows! Wake up and use Linux or buy a Mac (at least Apple only mocks your wallet!).
I don't think this was a single developer mistake (that would be easy to spot), instead this was a solution for a problem so it was a requirement for the new update.
Looks like someone didn't do their job correctly. Carrying on a destructive action (DELETE) without first checking whether that folder contains any files (except maybe desktop.ini)...
I won't try to defend their execution, but I think they're trying to solve a legitimate issue. It's really confusing to an end-user if they see two Documents folders.
And I'm certain that I've personally encountered instances where both Documents folders had the Documents special folder icon (though I'm not sure if I've seen this on Win10, specifically)
There is no “check first” in an ever-changing file system environment that has no atomic operation for something this large. Any “check” is a false sense of security, convincing you that you’re about to do the right thing; meanwhile, any background process could create important stuff in the directory tree you just “checked” and you’d destroy it anyway.
If you could lock down the whole directory tree and then check, it would be moderately safer but you are still assuming the tree contains only what you expect. It’s far wiser to have a list from the start or a file search that you can audit before individually processing files on your list.
Sounds like making the perfect the enemy of the good. Why bother to do any checks of anything when a random flip-flop could go metastable forever and brick your system?
"No longer". That's a very bad sign about QC and how confident the major retailers are that we're not going to switch to a different and more stable OS. The problem is that they're right, because Unix/Linux/etc was never and is not meant to be a single-user desktop home OS for the general public. Of course, random broken updates completely bricking your system should be no surprise to those users either.
Anybody got a suggestion for an OS I can use that exists on hard media, doesn't use kernel or base OS code that's been distributed digitally and has optional completely non-destructive updates via hard media no more than once a year, so that I'm not feeling like I'm trying to hit a moving target with the stuff I want to use, and it either works or not until a year later?
It certainly brings the suitability of Windows for production use into question. Not only this bug, which is outrageous, but the direction of Windows 10 in general before this.
I actually used it as a desktop OS in the early 00s, it was a very relaxing experience. It was years behind FreeBSD (which was years behind Linux, which was years behind Windows...) in terms of device drivers, but it was incredibly stable, minimal, beautifully documented and well-organized OS that was a joy to work with.
As an example of what "stable" means here: OpenBSD has a port collection, but it's an implementation detail that you're not supposed to use. Instead, when a release is created, all ports are built and tested to guarantee that they work. Then, they're not updated. Ever. You're supposed to upgrade to the next release yourself, and if you don't, all the packages available to you right now will be available and working the same way 20 years from now.
(Of course, there is a -current version you can use to get more liberal update policy.)
I don't remember all the details, but it sounds like it would fit your requirements very well.
I may have to try it then. How does OpenBSD name drives? Logically, like A:\ & B:\ for floppies or removable media, C:\ for main HDD, D:\ for main optical media or second HDD, E:\ for second optical media, etc? Or does it name in the incomprehensible method Unix & Linux use that have no application to what I'm trying to do with my life?
A live DVD of some Linux variant (Puppy or Alpine come to mind)? I'm not sure that I would want to run it for that long unless you figure out some way to run your browser off a different device, though; security updates aren't something that's safe to put off for a year.
Can download the binary Moz-distributed FF and unpack it wherever, then run $wherever/firefox. I think currently needs PulseAudio if you want audio, so that might limit what distro one can use. There can also be some traps with missing libraries, so very stripped down distro might be unusable. Otherwise, just check Moz website for updates every couple months?
At worst Linux updates might screw up your system and render it non-bootable. I'm not aware of any bugs in the past 20ish years where your home folder was deleted.
Well.... there are a few really nasty issues that come to mind, usually surrounding filesystems. Not too long ago, if you were using a certain version of systemd on certain laptops and did a rm -rf / (which as a new user is not hard to mistakenly do), not only would you lose your files, your hardware would become unusable as well.
(Systemd mounted an EFI partition read/write, and nuking / also nuked critical firmware information)
That said, given that Microsoft charges money for this, goes out of their way to render themselves not liable when their code breaks your shit, and doesn't test a use case that's not exactly uncommon, it's still unforgivable and not comparable to Linux.
This is what you get when you adopt the "let users do the qa/testing/beta-testing for us for free". The whole attitude behind Windows 10 "continuous updates" instead of actual releases that are actually alpha-tested and beta-tested by actual hired testers on multiple machines is disgusting and offensive to the user! An operating system needs to be a boring "rock-solid foundation", it doesn't need frequent updates and experimentation, that's what apps are for.
Use Linux or buy a Mac. Microsoft always is and always will be horrible to end-users. They maybe cool in the enterprise sphere, and when it comes to open-source, and when it comes to developer tools and languages (I use lots of MS stuff... but run then on a Linux system!), but they don't care about regular end users since most of them "can't cast a vote" in deciding the system they use and where money goes to.
That's good progress - congratulations, Microsoft!
Frankly, I can't understand why - having a dominant position on the market - they seem to do everything to drive people away from their platform. It's not like we're in the 90s and there is no other choice.
For about 10 years, from the late '90s into the early 2000s, I was a kind of Linux zealot, with anti-MS signatures on my emails that survive in some BBS and Usenet archives.
Nowadays with the exception of a travel netbook, I mostly run Windows or Android on my computers.
Because GNU/Linux never managed to get its act together on what it means to offer a full-stack experience for UI/UX-focused developers, especially on laptops.
And I just won't pay the Apple prices for less hardware than I can get with a Thinkpad/Dell/Asus workstation laptop, usually about 500 euros cheaper.
I mean, macOS requires a relatively expensive machine to run it, and much business-critical software either doesn't run on Macs or has drastically reduced functionality.
Linux isn’t even worth bothering with if one isn’t technical.
If you're not technical, just go with Mint. Looks like Windows 7, behaves like Windows 7, doesn't break. You never have to leave the GUI, neither during installation nor in day-to-day use. Doesn't break. Gives you the opportunity to optimize your workflow if you want to.
There are still not many choices out there: either Apple, with high-priced defective keyboards and no desktop solutions, or Linux, which is still a gamble, especially on laptops. In some regards, the situation is worse than a decade ago.
On the Microsoft side, as soon as they announced Windows 10 would be a rolling-release OS (i.e. a perpetual beta), I knew I was done with it on bare metal.
What other choice is there, though? I mean especially for laptops. As a 13-year Mac user I now find MacBooks completely unacceptable, and Linux still seems to have the old issue of unreliable driver support.
There are choices, better than ever before, but for vast categories of users this doesn't matter.
Some examples: corporate users (nobody big seriously considers Linux for desktops, for various reasons, and Apple would easily be 3-5x as expensive without good enough added value) and gaming (again, some good options, but subpar to Windows in probably every respect).
Everybody knows Windows; everybody can somehow get by with just clicking around. If I put Linux on my fiancée's notebook (she is a doctor), I would have to do 24x7 support for it, forever. No, thank you.
Recursive delete has always been a misfeature of computing, out of a mistaken entitlement to convenience when you are performing a fundamentally risky operation on a target you can never know the state of.
At best, it is redundant with a recursive search feature that chooses “delete” as the operation. And if you want “do something else then delete”, you can no longer call a recursive-delete command anyway so why not just learn how to enumerate files first and give your system a fighting chance to audit first?
Disk cleanup code should always create lists of known target files, attempt to delete only that list of files, then do the platform equivalent of “rmdir” at the end to attempt to remove the directory. If that fails, congratulations: your ass was saved by not deleting something you didn’t know was there.
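To make that concrete, here's a minimal shell sketch of the "delete only what you know about" approach (the directory and file names are made-up examples, not what any real updater uses): remove only the specific files you created, then try a plain rmdir and let it fail if anything unexpected is still inside.

```sh
#!/bin/sh
# Sketch only: TARGET and the file list below are hypothetical.
TARGET="$HOME/old-docs"

# Delete only the specific files this cleanup step itself is responsible for.
for f in desktop.ini thumbs.db; do
    rm -f -- "$TARGET/$f"
done

# Plain rmdir, never a recursive delete: if anything unexpected remains,
# this fails and the user's data survives.
rmdir -- "$TARGET" || echo "$TARGET not empty; leaving it alone" >&2
```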
I really wonder what goes on at MS for there to be consensus that such messages were ever a good idea. Even if the message is true, it scares the users because it's like ransomware. If it isn't, that's even worse because you're now lying to your users. In any case, they arouse suspicion and fear.
In the XP days, I believe updates would, after restarting, at most show a dialog with a more informative message ("Installing updates...") and a progress bar, and more importantly, your wallpaper and desktop would continue loading in the background --- the latter really helps with the unease, if not the annoyance. The full-screen, vague, and unnecessary messages just invoke feelings of horror.
If you want to run an older OS, have you considered just running a hypervisor like ESXi or KVM and then handling the OS through that? There are lots of good solutions there at this point, and it can be a fun way to play with a lot of other cool features and different OSes as well. You can even get near-native performance even for heavy-duty graphics applications by using PCI passthrough. The only caveat that adds for hardware choice is that you'll want a processor with an IOMMU for the hardware virtualization support (AMD calls this "AMD-Vi", Intel "VT-d"). AMD is pretty good about not artificially segmenting there; I think everything modern they make supports it (all Ryzen/EPYC at least), though it's probably worth double-checking overall system compat. Intel splits this all up more: Xeon always has everything, but support varies elsewhere and you really just have to check the specs.
Even so that gives a ton of hardware choice and flexibility, and will give you more options to protect and control the systems beyond the OS themselves which is very important if you want to run something older since security patches will stop. But if you're judicious about what you use for what tasks and how you handle I/O it offers another option, and can make hardware changes a lot easier as well by abstracting away the metal somewhat. Basically a lot of the advantages that make virtualization so popular in general for business can be just as applicable at home these days, most of us have cycles and memory to spare and can afford to burn a bit of it on making a more pleasant software experience or working around issues coming from a higher level. In this case for example you could be running your Windows VM on virtual disks on a NAS/DAS or even the same system but supporting better snapshotting, and if the data was deleted simply roll back the entire VM to pre-upgrade state.
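As a rough illustration (assuming a Linux host; nothing here is specific to any particular setup), verifying that the IOMMU is actually enabled before attempting passthrough looks something like this:

```sh
# Kernel messages mention DMAR (Intel VT-d) or AMD-Vi when the IOMMU is active;
# you may need intel_iommu=on or amd_iommu=on on the kernel command line first.
dmesg | grep -i -e DMAR -e IOMMU -e AMD-Vi

# If IOMMU groups show up here, devices can be handed to a VM for passthrough.
ls /sys/kernel/iommu_groups/
```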
Judging from the state of Server 2016/Windows 10 updates, I suspect not. How a rollup update for 2016 takes 30 mins+ to install (and often fails), yet the 2012 rollup is done in 10 mins, is still baffling. This is on 2012 machines with much longer update histories.
People have gotten XP running on Haswell so I don't think running 7 on Skylake would be a problem. msfn.org has some useful information on running (very) old OSs on new hardware.
Every once in a while an MS support thread actually has a useful answer in it... given by some random internet commenter about 15 posts after MS support has determined only a reinstall will fix the problem.
Yes, I read about this earlier and along with the support tip was the "and don't touch your PC" tip. So I'm pretty sure they'll advise some undelete tool and until then they don't want deleted data to get overwritten, but no more magical solution than that.
Just yesterday I saw that my brother's Win10 desktop was magically empty after he'd rebooted due to a Windows update. After hunting around for solutions (none of which worked), I noticed that all the missing desktop files were magically in the recycle bin.
Nice work M$.
From memory, the update was #1803, so not sure if this is relevant to the Ars Technica article... but since it was yesterday, it's clearly not 100% accurate.
(PS: No, my brother didn't cause them to be put there)
The update in question here is #1809; so it's possible your brother encountered another data loss bug, which, considering their current state of QA, doesn't seem that unlikely.
By default the recycle bin has a limit on how much it can store, and anything beyond that gets permanently deleted. It'll ask when you do the operation yourself, but if it was done automatically as part of the update, who knows. A safety net with such big holes never made sense to me, so I always change the limit to 100% of the disk size. Your brother was lucky to recover his files.
I wonder if Microsoft is able to cancel the installation of already downloaded updates - if not, something like this might happen if the erroneous update was already downloaded in the background earlier. I think the default setting is that updates will be downloaded automatically and then installed later whenever the system decides there is a suitable period of "inactive time".
The update that deleted files wasn't ever an automatic one. You would have had to manually update. Fingers crossed they didn't mess up another. Seems unlikely.
"Compounding this issue is that Microsoft's rollout of version 1809 was already unusual. For reasons unknown, Microsoft didn't release this update to the Release Preview ring, so the most realistic installation scenario—someone going from version 1803 to 1809—didn't receive much testing anyway. And all this is against the longer-term concern that Microsoft laid off many dedicated testers without really replacing the testing that those testers were doing."
And from this article:
"In response the company has promised to update the Feedback Hub tool so that the severity of bugs can be indicated. Many people reported this data loss bug, but none of the reports received many upvotes, with Microsoft accordingly disregarding those bugs. If the bugs had been marked as causing data loss—the highest severity possible—then they may have received the additional attention that they deserved. Microsoft hasn't, however, explained why this update didn't receive any kind of "release preview" distribution or testing. There are no guarantees that this would have caught the bug, but it would have meant that an extra round of people would have installed the update onto their systems, and who knows, one of their bug reports might have gotten lucky."
As a dedicated tester for a large-ish company I can't even imagine how many problems would go unreported if they even got rid of half of our department. It's hard to quantify the exact value of SQA so I can see some manager over-looking its importance, but this is Microsoft. They should know better.
I switched from macOS to Win10+WSL as my main dev machine this summer, mainly because I like Thinkpad hardware much more (and wanted to give the standard OS there a try), but I'm close to giving up on it and switching to Linux. It's crazy how much crap it throws at you on a daily basis.
* Explorer, and file operations in general, are dead slow for some reason. Expanding a zip with a couple of tens of thousands of files from Explorer can take an hour, while in WSL it takes maybe a minute. Explorer also takes its sweet time to load, including in open dialogs. This is on a near top-of-the-line 480s with 24GB RAM and a 1TB SSD.
* windows don't remember their previous position on multi-screens.
* copying in terminal sometimes seems to work, sometimes not.
* terminal beeps at you on every tab completion with more than one option; I always have to keep sound muted.
* bluetooth menu is glitchy and there's no standard quick way to connect to a previous device.
* no idea whether that's win10, spotify or thinkpad software, but hitting a media key produces a NON DISMISSIBLE big overlay for spotify that just hangs there for a good 10 seconds and blocks the stuff I want to click.
* solution for a full taskbar? just make it scroll with very small scroll buttons...
* some older Logitech mouse I connect has buggy assignment of forward/back keys - does a completely random operation instead. Windows doesn't seem to have a GUI-way to set this stuff up
* terminal has no tabs and crappy colors and I don't wanna go down the rabbit hole of trying to integrate WSL with a non-default terminal emulator. I've installed the spring update, won't touch october one for a while at least.
* there's no integration between WSL and the Windows GUI layer. You have to start an X server separately to get Linux GUI tools. If I seriously need that, I will simply switch to a Linux distro instead (which, given the above, I'm starting to suspect I should have done from the beginning).
It's been okay since I upgraded to Windows 7. The lack of updates may become an issue at some point. On the other hand, the lack of updates can also be a boon. I really want to swap to Linux, but can't let go of Visual Studio. The second WINE can get Visual Studio running, I'm out.
Imagine a car repairman standing beside your car, smiling "Your car will no longer spontaneously combust and kill those inside!", as if that were an achievement over the previous incarnation.
If you want your files to be safe, switch to Linux. This - for lack of a better term - company obviously considers deleting them something that can occasionally happen.
And all those additional backups, and the time invested in them, make Windows too expensive as a system - even for free.
> Adding insult to injury, there are ways in which Windows users could have enabled KFR without really knowing that they did so or meaning to do so. The OneDrive client, for example, can set up KFR for the Documents and Pictures folders, if you choose to enable automatic saving of documents and pictures to OneDrive.
Worse still, the OneDrive client apparently left documents in the original locations which the Windows update would then delete:
> The current OneDrive client will set up KFR and then move any files from their original location to the new OneDrive location. Older versions of the OneDrive client, however, would set up KFR but leave existing files in the old location. The October 2018 Update would then destroy those files.
Microsoft's own software which they bundled with Windows 10 and nagged Windows 10 users to set up and use triggered a data loss bug which caused Windows 10 to delete those users' documents - all because Microsoft didn't think it through, didn't test properly, and didn't take any notice of data loss reports from external beta testers.
LTSB is just lovely. Maybe the best part of having access to an MSDN subscription these days. It'd be worth paying a premium for as a consumer, if they ever offered it; manufacturers are slightly better than they once were, but it's still a necessary first step to wipe the disk and install a crapware-free Windows immediately on a new machine, and LTSB is far and away the best for that.
WSL would be nice, but all in all it's better to just spin up an Ubuntu VM in Hyper-V.
I believe it also broke the File History and Windows Backup (if you use those). I had a client I had to restore from backups. Thankfully they only lost 1 small file in the mix of it all.
Honestly, I don't care. I've disabled Windows updates for the time being since I no longer trust their update QA. I can deal with a non-functioning system. I can't deal with files being deleted outright that don't belong to the system in any way.
Speaking as someone whose undergrad degree is in Electrical Engineering from a top-rated school and who worked designing computer chips for 4 years: Microsoft is not properly engineering Windows. I also have extensive software experience.
Microsoft appears to be relying on users to do their testing instead of making their testing as part of the overall engineering process. [1] [2]
Can you imagine Boeing asking passengers to test their airframes or GE asking passengers to test their jet engines?
In Mac Office, there is a bug with the new upgrade to macOS 10.14 (Mojave) where "recent files" local to the Mac are not retained (but those in the cloud are).
Proper engineering means that there are testing protocols followed while developing the product. The fact that such a simple bug as the Recent Files one was not caught says a lot about the lack of proper testing of the software, as does this Windows 10 bug.
Neither Microsoft, nor Facebook, nor Google has people with engineering degrees, mentored in engineering, in its leadership, and I doubt that there are many board members who have the requisite engineering education and experience.
I was mentored in a particular set of processes for ensuring proper design of computer chips and this process is simply lacking in many software systems.
Not all bugs can be caught by testing, many have to be prevented in the first place through proper processes.
Citation: "When KFR is being used, the October 2018 Update will delete the original, default Known Folder locations. Microsoft imagined that this would simply remove some empty, redundant directories from your user profile. No need to have a Documents directory in your profile if you're using a redirected location, after all."
Sorry for harsh language, but this is not a bug. This is a complete brain damage of those who decide to implement such behaviour.
I have redirected Documents. And there are A LOT of programs that directly try to use C:\Users\user\Documents instead of the redirected location.
> Sorry for harsh language, but this is not a bug. This is a complete brain damage of those who decide to implement such behaviour.
That's not "harsh", harsh is what happened to the unlucky users who lost data because no one on the development team bothered to call it out. It should be an implicitly understood rule that you NEVER remove a file you did not create, unless the user explicitly asked to, but I guess MS considers it acceptable to do anything to a user's system after they convinced everyone to take forced updates as being acceptable too.
> but I guess MS considers it acceptable to do anything to a user's system after they convinced everyone to take forced updates as being acceptable too.
This. Once you get in the mindset that you know better than the user, it's not a big jump to "I know these directories should be empty, any content is leftover garbage, let's remove them".
> you NEVER remove a file you did not create
Absolutely this, with a followup of "and if you're 'helpfully' deleting an unused folder, check that it's unused first!"
Given that this was (stupid) desired behavior, making sure Documents didn't have stuff in it should have been a screamingly obvious step. It would still have been utterly unacceptable, it could still have created weird downstream bugs when users installed things that target the now-missing default Documents location, but at least it wouldn't have set a bunch of data on fire without any warning.
But then, I guess relying on common sense after the first terrible decision is made is never going to work. There's a reason "never break user space" is rule 1 for Linux updates...
Microsoft doesn’t consider the OS to be the “user’s system”. They clearly see it as something they rent to the user a little bit each day. If the user wants to keep the system for a little longer, he/she has to make a special request to Microsoft that they hold off for a bit before retaking control of the OS.
Recently I was reading the blog of the team upgrading conhost. In it they joke several times that the developers doing the work hadn't even been born when the original code was written.
Well, this is the kind of thing that happens when your whole team is interns and the "senior" is 28.
Reading the associated bugs on GitHub, I also learned that there were lots of complaints about the new console - not about features, but about breaking compatibility. Guess no one thought to start a new project rather than changing a 30-year-old one that hadn't been touched in 20.
TL;DR: At least one codger is needed on teams doing this kind of work to give perspective.
> NEVER remove a file you did not create
Seems like common sense, but Microsoft has been doing this for a while. In Windows 7 (not sure about later releases) the OS runs a 'Desktop Cleanup' periodically that deletes shortcuts to network locations it can no longer connect to. God forbid your network drives don't map one day and Windows decides to nuke all your desktop shortcuts... this actually happened to a user I was doing support for and they were understandably livid
Disable windows 10 automatic maintenance.
Windows is going into the Apple direction I guess.
How about 'delete the folder if empty'? Why on earth didn't they think to do that?
Yes, seriously. I would still think it's totally unjustifiable as a 'helpful' change, but at least it'd be causing problems like "hey, this install failed, how do I fix it?" rather than "where's all my stuff?"
That, and I'm just a bit shocked that "the folder should be empty, so delete it" didn't just naturally make people think "obviously I should check if that's true first". Even if it's not a total fix, the failure to add that is its own layer of screwup.
Even that could potentially be a breaking change, like if the folder was expected to be there by some application.
How does that even work? Does Windows not have proper hard/soft links? Don't you maybe have to do some weird things to get around them?
Windows/NTFS has links, but they aren't used for this (and software support for them is ... iffy, which can be an issue with backup tools etc., for which they aren't just transparent). These "known folders" are roughly implemented as per-user registry settings containing paths to the configured folders, which applications are supposed to query through the shell API rather than hardcode.
Links have been a feature since vista, at least
> This is a complete brain damage of those who decide to implement such behaviour.
I'd love to read what Linus Torvalds would say about it.
There are programs that have bugs before the first line of code is written. This is one such case.
This. How much lost time, panic, and rage did carelessly deleting these folders cause? It was a dire mistake.
This seems like a common sense thing that every intern would consider. That the Microsoft development team didn't consider it raises questions not only about their QA, but about their whole development process.
Tangential to this: when are operating systems going to ship with CoW filesystems by default? Accidental deletion of critical directories has been a solved problem for years now. I take instant snapshots of my home directory on ZFS every hour, and my boot environment is snapshotted before every update.
You could literally remove my entire root directory and at worst you've minorly inconvenienced me: I now have to boot from a USB drive and run `zfs rollback` to the most recent snapshot. I'd be impervious to most ransomware as well: my home directory is filled w/ garbage now? Good thing it can't encrypt my read only snapshots, even with root access. Oh and I can serialize those snapshots across the network at the block level before I nuke the infected machine.
Of course humanity will always invent a bigger idiot, so it's possible a virus could gain root & exec some dangerous ZFS commands, or it's possible some MSFT employee could pass the `-r` flag and recursively delete a dataset & its snapshots in their post-update script. Plus let's not forget that no filesystem in the world will stop you from just zero-filling the drive. Still it's easy for me to imagine a world where I'm completely impervious from ransomware: by way of only being able to delete critical snapshots when a hardware key is present or when my TPM is unlocked.
On the whole it seems to me CoW filesystems are a major step forward in safe-guarding users from accidental deletions. At a minimum they have other benefits (serialized incremental send streams, block level compression & dedup, etc.) that make backups easier to manage. -- Yet I still can't boot from ReFS on Windows.
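For readers who haven't touched ZFS, here is a rough sketch of the snapshot/rollback/send workflow described above (pool, dataset, snapshot, and host names are all placeholders):

```sh
# Cheap, near-instant snapshot; an hourly cron job would run something like this.
zfs snapshot tank/home@hourly-2018-10-14-1500

# See what's available to roll back to.
zfs list -t snapshot -r tank/home

# Discard everything written since the most recent snapshot
# (rolling back past newer snapshots requires -r, which destroys them).
zfs rollback tank/home@hourly-2018-10-14-1500

# Serialize the snapshot to another machine at the block level.
zfs send tank/home@hourly-2018-10-14-1500 | ssh backuphost zfs receive backuppool/home
```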
NTFS does support copy-on-write through the volume shadow copy service (which powers the System Restore and Previous Versions features).
That'd be great, if I didn't have to turn it off. System Restore is so slow as to be borderline unusable. (In my experience, creating the restore point accounts for the majority of the time Windows spends doing updates; on a true CoW filesystem this should be nearly instantaneous.) Furthermore, if you have VSS enabled on a volume hosting an MSSQL database, it creates entries in the backup log for every snapshot, which destroys the chain of differential backups. This makes it impossible to use SQL Server Agent's maintenance plans alongside VSS in small-scale deployments (e.g. a workstation).
I cannot stress enough: NTFS is not a CoW filesystem, and any attempt to add CoW features on top of it will perform poorly and is doomed to fail, because it actually has to copy the old data blocks when you overwrite something. ZFS, btrfs, et al. are not actually copying user data, because they don't have a concept of "overwriting a block": every block that is written goes to an unallocated region of the disk, and blocks that are no longer referenced by a dataset or snapshot are returned to the allocator for later use. At no point is the old data "copied"; it's just not returned to the allocator for reuse.
What btrfs or ZFS mean by "copying" is not the user's data blocks, it's copying filesystem metadata, portions of btrfs' extent tree, or ZFS' block pointer tree. There is a world of difference between volume shadow copy, and an actual CoW filesystem. -- Microsoft knows this, that's why they are working on ReFS. (Of course in typical MSFT fashion they've made a number of critical mistakes: user data checksums are optional, and volume management is not integrated so it can't even use those integrity streams to self-heal corruption. Also last I checked ReFS can't even be used as a bootable Windows volume. -- It's worth pointing out APFS also made the mistake of checksumming metadata but not user data; which in my opinion makes both of them unsuitable as next generation filesystems.)
Pretty sure it's also used for some aspects of MSI (rollbacks, I think).
Last time Microsoft decided to replace NTFS with something modern, they took about 15 years to give up. If they start now, we can expect them to give up roughly in 2033.
They tried something radical; that's why it was slow and sadly painful. Now implementing CoW ideas is easily within MS's reach and would bring clear commercial value, so it could go very differently.
ZFS has now been ported to Windows. With some testing and polishing, maybe it will be something they could adopt in the future (I know, wishful thinking, but the technology is there).
> Tangential to this when are operating systems going to ship w/ CoW filesystems by default?
As far as I know Ubuntu ships with ZFS kernel module.
Unfortunately, as you undoubtedly know, ZFS support on Linux will always be a problem due to licensing.
As far as I know native encryption is not yet stable in ZFS.
”The Volume Shadow Copy Service (VSS) keeps historical versions of files and folders on NTFS volumes by copying old, newly overwritten data to shadow copy via copy-on-write technique. The user may later request an earlier version to be recovered.”
https://en.m.wikipedia.org/wiki/NTFS#Volume_Shadow_Copy
Out of curiosity, have you got a link to a workflow doc/tutorial/guide that could instruct someone who's a bit green around the ears in using a ZFS backup like this?
For a great primer on ZFS (on Linux) in general there's this website[1] by Aaron Toponce that is very well laid out. Specifically see the section on snapshots & clones, as well as sending/receiving. Also don't forget that you can make a ZFS pool out of any block device, this includes normal files mounted as a loop device! So a great way to experiment w/ learning how to manage a ZFS pool is to just create a few sparse files, add them to a pool, and just start messing around with it. (For actual production pools though always put ZFS as close to the actual disks as possible.)
What's really cool about serialized snapshots is that once native encryption support lands in ZFSonLinux you'll be able to send encrypted volumes over the wire. The data blocks don't have to be decrypted to be serialized or checked for integrity, so you don't even need the decryption key to do incremental sends! You can send your backups to an untrusted remote pool that never has to have knowledge of the decryption key!
(You can also serialize snapshots to files, which is useful for say writing them to detachable media to sneakernet them. If you want to do incremental sends though the receiver does have to be an actual zpool.)
Some other helpful hints would be using something like `mbuffer`[2] on the sender & receiver. This is handy for spinning rust pools, so that your disks don't have to wait on the network before seeking to the next blocks to be sent.
Also ZFS lets you delegate permissions so you can manage filesystems as normal users, I'd recommend doing this so you don't have to login as root on the remote pool to receive the snapshots.[3] In my case I ran into this early on because I have `PermitRootLogin no` on all my boxes.
[1]: https://pthree.org/2012/04/17/install-zfs-on-debian-gnulinux... [2]: https://dan.langille.org/2014/05/03/zfs-send-on-freebsd-over... [3]: https://dan.langille.org/2015/02/16/zfs-send-zfs-receive-as-...
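Pulling the pieces above together, a throwaway practice pool plus a buffered, delegated incremental send might look roughly like this (file paths, pool, user, and host names are all invented for the example):

```sh
# A disposable pool backed by sparse files, purely for experimenting.
truncate -s 1G /tmp/zdisk1 /tmp/zdisk2
zpool create testpool mirror /tmp/zdisk1 /tmp/zdisk2
zfs create testpool/data
zfs snapshot testpool/data@first
zpool destroy testpool          # throw it away when done

# One-time, on the receiving box: let an unprivileged user receive streams.
zfs allow backupuser create,mount,receive backuppool

# Incremental stream between two snapshots, buffered on both ends so the
# disks don't stall waiting on the network.
zfs send -i tank/home@monday tank/home@tuesday \
  | mbuffer -q -m 1G \
  | ssh backupuser@backuphost 'mbuffer -q -m 1G | zfs receive backuppool/home'
```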
They had that 10 years ago and called it "system restore", but they couldn't figure out how to actually...make it work.
Besides, msft makes money by selling SaaS. There's no financial incentive to protect consumer data, since the EULA indemnifies the company for any damages caused by their products.
I would have been bitten by this bug. All my Known Folders are redirected to a separate hard drive so I can carry them over to a new system more easily when I upgrade. It also stems from when SSDs were too expensive to store anything but your OS and your applications on.
A whole bunch of applications do the bad thing Microsoft is talking about, and hardcode the path to 'My Documents'. I have a ghost 'My Documents' folder that has mostly app settings and maybe some save files.
You could use an NTFS junction; then nothing has to understand the settings.
I was actually surprised that this isn't how redirection is implemented. Seems like an easy enough fix that would catch everything.
Is there a way to create/manage an NTFS junction other than CLI?
I guess I'm glad that I don't use any of those folders... I just dump everything into category folders at the root of my data partition, e.g. G:\Code, G:\Pictures, G:\Downloads, etc.
The entire Users directory ends up being such an unimaginable cesspool on a Windows machine that has seen any significant service time, with this, that, and everything else poking and prodding and leaving its detritus inside. My desktop is essentially a single-user machine, as most Windows laptops and desktops are, but even on servers where user accounts are actually used, I spend an undue amount of time fiddlefrigging to take ownership and permissions on files because some script I need is saved to a different account's desktop. Bailing out of the whole thing and running your own filesystem shouldn't be easier, but it is.
I don’t use the directory myself - but plenty of programs dump data into it.
I held off updating because I didn’t want Windows wiping out all my game saves for example.
I suppose the developer who thought it was a good idea to delete a non-empty directory was high at the time he implemented it. But how this got past QA is entirely a mystery to me.
It's pretty simple: there isn't a QA department anymore. Microsoft laid them off and uses the "Insiders" as beta testers. Insiders did report this issue, but Microsoft ignored it because it didn't get enough votes.
https://mobile.twitter.com/WithinRafael/status/1048473218917...
They must have been storing the vote database in C:\Users\Documents
Wow... This is not acceptable at all. They should have a team to go through these and at least prioritise the tickets instead of just relying on upvotes. I'm pretty sure Microsoft can afford that.
I've read this comment about MS getting rid of their QA in a lot of places. Is that something they actually did, or is it something people say because their QA quality has dropped significantly (which, as in Apple's case, could be due to the increased release cadence)?
It's the Microsoft mindset that they rule the world and everything works the Microsoft way. Not only do other operating systems not exist (yes yes, Linux subsystem for Win 10, whatever), but every software vendor and user is assumed to use the system exactly the way it was intended. That means no app will have the path to the original folder hardcoded instead of querying it the official way, and no user will navigate to the old pre-redirect location manually for any reason. By that definition, the folder can only be empty, so it's safe to delete it recursively, since deleting an empty folder recursively is no different from deleting it non-recursively.
Sounds more like a business/product owner request given to a developer who has been told one too many times they are too negative when being given new stories to work on.
But I might just be projecting.
YOU are their QA department if you're using a non-enterprise version of Windows! Wake up and use Linux or buy a Mac (at least Apple only mocks your wallet!).
Yay and become a beta tester for life.
I don't think this was a single developer's mistake (that would be easy to spot); instead, this was the solution to a problem, so it was a requirement for the new update.
The title really feels like something from The Onion - not fitting for 2018, more like 1988. Something terrible must have happened to the Universe's hypervisor lately...
Looks like someone didn't do their job correctly. Carrying out a destructive action (DELETE) without first checking whether that folder contains any files (except maybe desktop.ini)...
Just because a folder is currently empty doesn't mean it's not used and assumed to exist by some program. Delete shouldn't have been called, period.
I won't try to defend their execution, but I think they're trying to solve a legitimate issue. It's really confusing to an end-user if they see two Documents folders.
And I'm certain that I've personally encountered instances where both Documents folders had the Documents special folder icon (though I'm not sure if I've seen this on Win10, specifically)
There is no “check first” in an ever-changing file system environment that has no atomic operation for something this large. Any “check” is a false sense of security, convincing you that you’re about to do the right thing; meanwhile, any background process could create important stuff in the directory tree you just “checked” and you’d destroy it anyway.
If you could lock down the whole directory tree and then check, it would be moderately safer but you are still assuming the tree contains only what you expect. It’s far wiser to have a list from the start or a file search that you can audit before individually processing files on your list.
Sounds like making the perfect the enemy of the good. Why bother to do any checks of anything when a random flip-flop could go metastable forever and brick your system?
"No longer". That's a very bad sign about QC and how confident the major retailers are that we're not going to switch to a different and more stable OS. The problem is that they're right, because Unix/Linux/etc was never and is not meant to be a single-user desktop home OS for the general public. Of course, random broken updates completely bricking your system should be no surprise to those users either.
Anybody got a suggestion for an OS I can use that exists on hard media, doesn't use kernel or base OS code that's been distributed digitally and has optional completely non-destructive updates via hard media no more than once a year, so that I'm not feeling like I'm trying to hit a moving target with the stuff I want to use, and it either works or not until a year later?
It certainly brings the suitability of Windows for production use into question. Not only this bug, which is outrageous, but the direction of Windows 10 in general before this.
Try OpenBSD.
I actually used it as a desktop OS in the early 00s, and it was a very relaxing experience. It was years behind FreeBSD (which was years behind Linux, which was years behind Windows...) in terms of device drivers, but it was an incredibly stable, minimal, beautifully documented, and well-organized OS that was a joy to work with.
As an example of what "stable" means here: OpenBSD has a ports collection, but it's an implementation detail that you're not supposed to use directly. Instead, when a release is created, all ports are built and tested to guarantee that they work. Then they're not updated. Ever. You're supposed to upgrade to the next release yourself, and if you don't, all the packages available to you right now will be available and working the same way 20 years from now.
(Of course, there is a -current version you can use if you want a more liberal update policy.)
I don't remember all the details, but it sounds like it would fit your requirements very well.
I may have to try it then. How does OpenBSD name drives? Logically, like A:\ & B:\ for floppies or removable media, C:\ for the main HDD, D:\ for the main optical drive or second HDD, E:\ for the second optical drive, etc.? Or does it use the incomprehensible method Unix & Linux use, which has no application to what I'm trying to do with my life?
A live DVD of some Linux variant (Puppy or Alpine come to mind)? I'm not sure that I would want to run it for that long unless you figure out some way to run your browser off a different device, though; security updates aren't something that's safe to put off for a year.
You can download the binary Moz-distributed FF, unpack it wherever, and then run $wherever/firefox. I think it currently needs PulseAudio if you want audio, so that might limit which distro you can use. There can also be some traps with missing libraries, so a very stripped-down distro might be unusable. Otherwise, just check the Moz website for updates every couple of months?
At worst Linux updates might screw up your system and render it non-bootable. I'm not aware of any bugs in the past 20ish years where your home folder was deleted.
https://www.phoronix.com/scan.php?page=news_item&px=MTIxNDQ http://tldp.org/LDP/lame/LAME/linux-admin-made-easy/crash-re...
Well.... there are a few really nasty issues that come to mind, usually surrounding filesystems. Not too long ago, if you were using a certain version of systemd on certain laptops and did an rm -rf / (which, as a new user, is not hard to do by mistake), not only would you lose your files, your hardware could become unusable as well.
(systemd mounted the EFI variables filesystem read/write, so nuking / also nuked critical firmware data)
Some cursory googling reveals some other issues, but everything I can find is either very old (ex. https://lkml.org/lkml/2012/10/23/690) or reference utilities that an average user won't have (ex. https://www.spinics.net/lists/linux-bcache/msg05290.html)
That said, given that Microsoft charges money for this, goes out of their way to render themselves not liable when their code breaks your shit, and doesn't test a use case that's not exactly uncommon, it's still unforgivable and not comparable to Linux.
This is what you get when you adopt the "let users do the QA/testing/beta-testing for us, for free" approach. The whole attitude behind Windows 10's "continuous updates", instead of actual releases that are properly alpha- and beta-tested by actual hired testers on multiple machines, is disgusting and offensive to the user! An operating system needs to be a boring, rock-solid foundation; it doesn't need frequent updates and experimentation, that's what apps are for.
Use Linux or buy a Mac. Microsoft always is and always will be horrible to end users. They may be cool in the enterprise sphere, in open source, and in developer tools and languages (I use lots of MS stuff... but run it on a Linux system!), but they don't care about regular end users, since most of them "can't cast a vote" in deciding the system they use and where the money goes.
That's good progress - congratulations, Microsoft!
Frankly, I can't understand why - having a dominant position on the market - they seem to do everything to drive people away from their platform. It's not like we're in the 90s and there is no other choice.
For about 10 years in the late '90s and early 2000s I was a kind of Linux zealot, with anti-MS signatures on my emails, which survive in some BBS and Usenet archives.
Nowadays with the exception of a travel netbook, I mostly run Windows or Android on my computers.
Because GNU/Linux never managed to get its act together on what it means to offer a full-stack experience for UI/UX-focused developers, especially on laptops.
And I just won't pay the Apple prices for less hardware than I can get with a Thinkpad/Dell/Asus workstation laptop, usually about 500 euros cheaper.
I've heard it said, with Linux, you pay with your time, with Apple, with your money, and with Windows, you pay with your dignity.
For most people there is no other choice.
I mean, macOS requires a relatively expensive machine to run it, and a lot of business-critical software either doesn't run on Macs or has drastically reduced functionality.
Linux isn’t even worth bothering with if one isn’t technical.
So really, we are left with Windows.
The Linux one is just untrue.
If you're not technical, just go with Mint. It looks like Windows 7, behaves like Windows 7, and doesn't break. You never have to leave the GUI, neither during installation nor in everyday use. It doesn't break. And it gives you the opportunity to optimize your workflow if you want to.
I know completely computer illiterate people who use Ubuntu. Your opinion says a lot about you.
There are still not many choices out there: either Apple, with high-priced defective keyboards and no desktop solutions, or Linux, which is still a gamble, especially on laptops. In some regards, the situation is worse than a decade ago.
On Microsoft side, as soon as they announced Windows 10 would be rolling release OS (i.e. a perpetual beta one) I knew I was done with it on bare metal.
What other choice is there, though? I mean especially for laptops. I find MacBooks now completely unacceptable, as a 13-year Mac user, and Linux still seems to have the old issue of unreliable driver support.
There are other choices, but they all have limitations as well, so it always ends up being a case of better the devil you know.
There are choices, better than ever before, but for vast swaths of users this doesn't matter.
Some examples: corporate users (nobody big seriously considers Linux for desktops, for various reasons, and Apple would easily be 3-5x as expensive without enough added value) and gaming (again, some good options, but subpar to Windows in probably every respect).
Everybody knows Windows; everybody can somehow get by with just clicking around. If I put Linux on my fiancée's notebook (she is a doctor), I would have to do 24x7 support for it, forever. No, thank you.
>Apple would be easily 3-5x that expensive for no good enough added value
Well, that's straight up not true; in fact, IBM has over 100,000 Macs in the field and estimates they're saving $535 per machine over four years.
https://www.computerworld.com/article/3131906/apple-mac/ibm-...
There is no other choice that runs Win32 and DirectX well.
Wine is hit-or-miss, but when it works it tends to be a good option.
XBox. :)
Recursive delete has always been a misfeature of computing, out of a mistaken entitlement to convenience when you are performing a fundamentally risky operation on a target you can never know the state of.
At best, it is redundant with a recursive search feature that chooses "delete" as the operation. And if you want "do something else, then delete", you can no longer call a recursive-delete command anyway, so why not just learn how to enumerate files first and give your system a fighting chance to audit what it's about to remove?
Disk cleanup code should always create lists of known target files, attempt to delete only that list of files, then do the platform equivalent of “rmdir” at the end to attempt to remove the directory. If that fails, congratulations: your ass was saved by not deleting something you didn’t know was there.
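A rough Python sketch of that approach, with `known_files` standing in for whatever manifest the cleanup code maintains (names here are illustrative, not any actual Windows internals):

```python
import os

def cleanup(directory, known_files):
    # Delete only the files we know we created...
    for name in known_files:
        try:
            os.remove(os.path.join(directory, name))
        except FileNotFoundError:
            pass  # already gone, fine
    # ...then try to remove the directory itself. rmdir refuses to touch
    # a non-empty directory, so anything unexpected stays untouched.
    try:
        os.rmdir(directory)
    except OSError:
        print(f"Leaving {directory} alone: it still contains files we didn't create")
```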
"All your files are exactly where you left them"
"...we just marked them as deleted."
That was the first thing I thought of when I heard the news about the bug --- the infamous creepy message just got an even scarier meaning.
https://www.reddit.com/r/windows/comments/3x88yj/all_your_fi...
I really wonder what goes on at MS to have consensus to think such messages were ever a good idea. Even if the message is true, it scares the users because it's like ransomware. If it isn't, that's even worse because you're now lying to your users. In any case, they arouse suspicion and fear.
In the XP days, I believe updates would, after restarting, at most show a dialog with a more informative message ("Installing updates...") and a progress bar, and more importantly, your wallpaper and desktop would continue loading in the background --- the latter really helps with the unease, if not the annoyance. The full-screen, vague, and unnecessary messages just invoke feelings of horror.
Is there a commercial PC for sale which supports Windows 7, e.g. is there an OEM with a desktop offering with a Skylake CPU?
If you want to run an older OS have you considered just running a hypervisor like ESXi or KVM and then handling OS through that? There are lots of good solutions there at this point, and it can be a fun way to play with a lot of other cool features and different OS as well. You can even get near-native performance even for heavy duty graphics applications by using PCI passthrough. The only caveat that adds for hardware choice is that you'll want a processor with an IOMMU for the hardware virtualization support (AMD calls this "AMD-Vi", Intel "VT-d"). AMD is pretty good about not artificially segmenting there, I think everything modern they make supports it (all Ryzen/EPYC at least) though probably worth double checking overall system compat. Intel splits this all up more, Xeon always has everything but support varies elsewhere and you really just have to check the specs.
Even so that gives a ton of hardware choice and flexibility, and will give you more options to protect and control the systems beyond the OS themselves which is very important if you want to run something older since security patches will stop. But if you're judicious about what you use for what tasks and how you handle I/O it offers another option, and can make hardware changes a lot easier as well by abstracting away the metal somewhat. Basically a lot of the advantages that make virtualization so popular in general for business can be just as applicable at home these days, most of us have cycles and memory to spare and can afford to burn a bit of it on making a more pleasant software experience or working around issues coming from a higher level. In this case for example you could be running your Windows VM on virtual disks on a NAS/DAS or even the same system but supporting better snapshotting, and if the data was deleted simply roll back the entire VM to pre-upgrade state.
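If you do go down the passthrough road on Linux, a quick way to see what the IOMMU is actually exposing is to walk the sysfs groups; a small sketch (the /sys path is the standard kernel layout, but verify on your own setup):

```python
import glob
import os

# Print each IOMMU group and the PCI devices it contains. No groups at all
# usually means the IOMMU is disabled in firmware or not enabled on the
# kernel command line.
groups = sorted(glob.glob("/sys/kernel/iommu_groups/*"),
                key=lambda p: int(os.path.basename(p)))
for group in groups:
    devices = sorted(os.listdir(os.path.join(group, "devices")))
    print(f"group {os.path.basename(group)}: {', '.join(devices)}")
```

Devices that share a group generally have to be passed through together, which is the usual gotcha on consumer boards.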
Is Windows still a big business for MS? Maybe they don't care any more, having Azure.
Judging from the state of Server 2016/Windows 10 updates, I suspect not. How a rollup update for 2016 takes 30+ minutes to install (and often fails), yet the 2012 rollup is done in 10 minutes, is still baffling. This is on 2012 machines with much longer update histories.
Extended support for Windows 7 ends in a year and 3 months. Probably not a great idea to jump to an OS at EOL.
Also I doubt OEMs are allowed to sell Windows 7 any more (though I don’t know for sure).
Disclosure: Microsoft employee
You could try checking outlet.dell.com and filtering by 6th gen CPUs. You'll have to get your Win7 install media and key separately, though.
You don't need to punish yourself like this. Most PCs on sale support Linux or ChromeOS (which is, after all, Linux).
People have gotten XP running on Haswell so I don't think running 7 on Skylake would be a problem. msfn.org has some useful information on running (very) old OSs on new hardware.
Next year will be the year of Linux on the desktop.
s/Linux/ReactOS/
"Microsoft is advising anyone affected by the bug to contact support."
Yeah right, I'm sure that will be fruitful.
Every once in a while an MS support thread actually has a useful answer in it... given by some random internet commenter about 15 posts after MS support has determined only a reinstall will fix the problem.
Might be that they provide an undelete tool to recover the files.
Yes, I read about this earlier and along with the support tip was the "and don't touch your PC" tip. So I'm pretty sure they'll advise some undelete tool and until then they don't want deleted data to get overwritten, but no more magical solution than that.
If the DoJ couldn't sue them, I wish the poor person who lost all their family photos good luck.
The DoJ could and did, but backed off when a friendlier administration took over.
Elections have consequences, even if they turn on issues most voters aren't considering.
I would dispute this fact.
Just yesterday I saw that my brother's Win10 desktop was magically empty after he'd rebooted due to windows update. After hunting around for solutions, (none of which worked), I noticed that all the missing desktop files were magically in the recycle bin.
Nice work M$.
From memory, the update was #1803, so I'm not sure if this is relevant to the Ars Technica article... but since it happened yesterday, the claim is clearly not 100% accurate.
(PS: No, my brother didn't cause them to be put there)
The update in question here is #1809; so it's possible your brother encountered another data loss bug, which, considering their current state of QA, doesn't seem that unlikely.
By default the recycle bin has a limit on how much it can store, and if you try to put more in there, the excess gets deleted outright. It'll ask when you do the operation yourself, but if it was done automatically as part of the update, who knows. A safety net with such big holes never made sense to me, which is why I always change the limit to 100% of the disk size. Your brother was lucky to recover his files.
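If you want to see the current caps without clicking through each volume's Recycle Bin Properties, they also show up in the registry; a read-only Python sketch, assuming the per-volume layout I've seen on Win10 (BitBucket\Volume\&lt;GUID&gt; with a MaxCapacity value in MB), so treat the key names as an assumption and change the limit through the Properties dialog rather than here:

```python
import winreg

# Assumed layout: one subkey per volume GUID, each with a MaxCapacity
# value giving the Recycle Bin cap for that volume in megabytes.
BASE = r"Software\Microsoft\Windows\CurrentVersion\Explorer\BitBucket\Volume"

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, BASE) as volumes:
    index = 0
    while True:
        try:
            guid = winreg.EnumKey(volumes, index)
        except OSError:
            break  # no more volumes
        with winreg.OpenKey(volumes, guid) as volume:
            capacity, _ = winreg.QueryValueEx(volume, "MaxCapacity")
            print(f"{guid}: {capacity} MB")
        index += 1
```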
I wonder if Microsoft is able to cancel the installation of already downloaded updates - if not, something like this might happen if the erroneous update was already downloaded in the background earlier. I think the default setting is that updates will be downloaded automatically and then installed later whenever the system decides there is a suitable period of "inactive time".
The update that deleted files wasn't ever an automatic one. You would have had to manually update. Fingers crossed they didn't mess up another. Seems unlikely.
Amazing that such a simple bug made it all the way through testing. Just goes to show that testing isn’t foolproof!
From an earlier Ars article:
"Compounding this issue is that Microsoft's rollout of version 1809 was already unusual. For reasons unknown, Microsoft didn't release this update to the Release Preview ring, so the most realistic installation scenario—someone going from version 1803 to 1809—didn't receive much testing anyway. And all this is against the longer-term concern that Microsoft laid off many dedicated testers without really replacing the testing that those testers were doing."
And from this article:
"In response the company has promised to update the Feedback Hub tool so that the severity of bugs can be indicated. Many people reported this data loss bug, but none of the reports received many upvotes, with Microsoft accordingly disregarding those bugs. If the bugs had been marked as causing data loss—the highest severity possible—then they may have received the additional attention that they deserved. Microsoft hasn't, however, explained why this update didn't receive any kind of "release preview" distribution or testing. There are no guarantees that this would have caught the bug, but it would have meant that an extra round of people would have installed the update onto their systems, and who knows, one of their bug reports might have gotten lucky."
As a dedicated tester for a large-ish company, I can't even imagine how many problems would go unreported if they got rid of even half of our department. It's hard to quantify the exact value of SQA, so I can see some manager overlooking its importance, but this is Microsoft. They should know better.
Their QA is nothing short of shitty at the moment, so I suspect this is business as usual going forward.
Windows 10 is literally death by a thousand paper cuts for me.
I switched from macOS to Win10+WSL as my main dev machine this summer, mainly because I like Thinkpad hardware much more (and wanted to give the standard OS there a try), but I'm close to giving up on it and switching to Linux. It's crazy how much crap it throws at you on a daily basis.
* Explorer, and file operations in general, are dead slow for some reason. Expanding a zip with a couple of tens of thousands of files from Explorer can take an hour, while in WSL it takes maybe a minute. Explorer also takes its sweet time to load, including in open dialogs. This is on a near top-of-the-line 480s with 24GB RAM and a 1TB SSD.
* windows don't remember their previous position on multi-screens.
* copying in terminal sometimes seems to work, sometimes not.
* the terminal beeps at you on every tab completion with more than one option; I always have to keep sound muted.
* bluetooth menu is glitchy and there's no standard quick way to connect to a previous device.
* no idea whether that's win10, spotify or thinkpad software, but hitting a media key produces a NON DISMISSIBLE big overlay for spotify that just hangs there for a good 10 seconds and blocks the stuff I want to click.
* solution for a full taskbar? just make it scroll with very small scroll buttons...
* an older Logitech mouse I connect has buggy assignment of the forward/back buttons: it does a completely random operation instead. Windows doesn't seem to have a GUI way to set this stuff up.
* the terminal has no tabs and crappy colors, and I don't wanna go down the rabbit hole of trying to integrate WSL with a non-default terminal emulator. I've installed the spring update; I won't touch the October one for a while at least.
* there's no integration between WSL and the Windows GUI layer. You have to start an X server separately to run Linux GUI tools. If I seriously need that, I should simply switch to a Linux distro instead (which, given the above, I'm starting to suspect I should have done from the beginning).
It's been okay since I upgraded to Windows 7. The lack of updates may become an issue at some point. On the other hand, the lack of updates can also be a boon. I really want to swap to Linux, but can't let go of Visual Studio. The second WINE can get Visual Studio running, I'm out.
They no longer have QA. Their 'QA' is the Insiders program
"death by a thousand cuts" is used figuratively in this instance
Imagine a car repairman standing beside your car, smiling: "Your car will no longer spontaneously combust and kill those inside!", as if that were an achievement over the previous incarnation.
If you want your files to be safe, switch to Linux. This, for lack of a better term, company obviously considers deleting them something that can occasionally happen.
And all those additional backups, and the time invested in them, make Windows too expensive as a system, even for free.
Good write-up. So if I didn’t move any special Known Folders to a different drive, I should be ok?
> Adding insult to injury, there are ways in which Windows users could have enabled KFR without really knowing that they did so or meaning to do so. The OneDrive client, for example, can set up KFR for the Documents and Pictures folders, if you choose to enable automatic saving of documents and pictures to OneDrive.
Worse still, the OneDrive client apparently left documents in the original locations which the Windows update would then delete:
> The current OneDrive client will set up KFR and then move any files from their original location to the new OneDrive location. Older versions of the OneDrive client, however, would set up KFR but leave existing files in the old location. The October 2018 Update would then destroy those files.
Microsoft's own software, which they bundled with Windows 10 and nagged Windows 10 users to set up and use, triggered a data loss bug that caused Windows 10 to delete those users' documents, all because Microsoft didn't think it through, didn't test properly, and didn't take any notice of data loss reports from external beta testers.
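For anyone who wants to check whether KFR is in effect on their own profile, the current (possibly redirected) per-user shell folder paths live in the registry; a small read-only Python sketch (the "Personal" value is the Documents folder; the comparison against the stock location is just illustrative):

```python
import os
import winreg

# Where Windows records the current Documents path for this user.
KEY = r"Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders"

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY) as key:
    raw, _ = winreg.QueryValueEx(key, "Personal")  # "Personal" = Documents

documents = os.path.expandvars(raw)                       # usually REG_EXPAND_SZ
default = os.path.expandvars(r"%USERPROFILE%\Documents")  # stock location

if os.path.normcase(documents) == os.path.normcase(default):
    print("Documents is in its default location")
else:
    print(f"Documents is redirected to: {documents}")
```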
I'm seriously considering using a pirated version of LTSB for my next build...
LTSB is just lovely. Maybe the best part of having access to an MSDN subscription these days. It'd be worth paying a premium for as a consumer, if they ever offered it; manufacturers are slightly better than they once were, but it's still a necessary first step to wipe the disks and install a crapware-free Windows immediately on a new machine, and LTSB is far and away the best for that.
WSL would be nice, but all in all it's better to just spin up an Ubuntu VM in Hyper-V.
What a time we're living in
I believe it also broke File History and Windows Backup (if you use those). I had a client whose machine I had to restore from backups. Thankfully they only lost one small file in the mix of it all.
Honestly, I don't care. I've disabled Windows updates for the time being since I no longer trust their update QA. I can deal with a non-functioning system. I can't deal with files being deleted outright that don't belong to the system in any way.
Thanks for doing the bare minimum.
It's bugs, all the way down.
This should be a huge warning.
Speaking as someone whose undergrad degree is in Electrical Engineering from a top-rated school and who spent 4 years designing computer chips, Microsoft is not properly engineering its software. I also have extensive software experience.
Microsoft appears to be relying on users to do its testing instead of making testing part of the overall engineering process. [1] [2]
Can you imagine Boeing asking passengers to test their airframes or GE asking passengers to test their jet engines?
In Office for Mac, there is a bug with the new upgrade to macOS 10.14 (Mojave) where "recent files" local to the Mac are not retained (but those in the cloud are).
Proper engineering means that testing protocols are followed while developing the product. The fact that a bug as simple as the Recent Files one was not caught by testing says a lot about the lack of proper testing of the software, as does this Windows 10 bug.
Neither Microsoft, Facebook, nor Google has leadership with engineering degrees who were mentored in engineering, and I doubt there are many board members with the requisite engineering education and experience.
I was mentored in a particular set of processes for ensuring proper design of computer chips and this process is simply lacking in many software systems.
Not all bugs can be caught by testing, many have to be prevented in the first place through proper processes.
[1] https://www.extremetech.com/computing/246685-microsoft-claim...
[2] https://www.reddit.com/r/Surface/comments/3s14un/just_a_remi...
Yay! Windows won't delete my data anymore!!
I can't believe I can say that.
This is a total disgrace. A company that's worth how many hundreds of billions... and this happens. A complete and utter farce.
explicitly...