I've built a load of utilities that do that just fine. I use vim as an editor.
The Visual Studio toolchain does have LTSC and stable releases - no one seems to know about them though. See: https://learn.microsoft.com/en-gb/visualstudio/releases/2022... - you should use these if you are not a single developer and have to collaborate with people, like in the old days when we had pinned versions of the toolchain across the whole company.
> The Visual Studio toolchain does have LTSC and stable releases - no one seems to know about them though.
You only get access to the LTSC channel if you have a license for at least Visual Studio Professional (Community won't do it); so a lot of hobbyist programmers and students are not aware of it.
On the other hand, its existence is in my experience very well-known among people who use Visual Studio for work at some company.
The Visual Studio Build Tools are installable with winget (`winget search buildtools`).
There are licensing constraints; IANAL, but essentially you need a Pro or higher license on the account if you're going to use it to build commercial software or in a business environment.
I worked with VC++ 6.0 up until Windows 11 when it really, really wouldn't run any more, then switched to VS 2008. The code is portable across multiple systems so it didn't really matter which version of VS it's developed with, and VC++ 6.0 would load, build the project, and have it ready to run while VS 2022 was still struggling through its startup process.
VS 2008 is starting to show the elephantine... no, continental land-mass bloat that VS is currently at, and has a number of annoying bugs, but it's still vastly better than anything after about VS 2012. And the cool thing is that MS can't fuck with it anymore. When I fire up VS tomorrow it'll be the exact same VS I used today, not one with half a dozen features broken, moved around, or gone without a trace.
They've completely reworked release plans. 2026 LTSC will come out a year after the initial VS 2026 release (at the same time as VS 2027) and be supported for 1 more year. You pretty much have to get on the rolling updates train for the IDE, which is why the C++ toolchain now follows a different schedule and you're supposed to be able to install any specific toolchain side by side.
Toolchains on Linux are not free from dependency hell either - ever install an npm package that needs cmake underneath? glibc dependencies that can't be resolved because you need two different versions simultaneously in the same build somehow... Python is in another realm here as well. That shiny C++ project that needs a bleeding-edge Boost version that is about 6 months away from being included in your package manager. Remember patching OpenSSL when Heartbleed came around (libssHELL)?
Visual Studio is a dog, but at least it's one dog - the real hell on Windows is .NET Framework. The sheer incongruity of which version of Windows has which version of .NET Framework installed, and which version of .NET your app will run on when launched... the actual solution at scale for universal Windows compatibility for your .NET app is to build a C++ shim that checks for .NET beforehand and launches it with the correct version in the event of a multiple-version conflict - you can literally have 5 fully unique runtimes sharing the same .NET target.
> glibc dependencies that can't be resolved because you need two different versions simultaneously in the same build somehow...
If you somehow experience an actual dependency issue that involves glibc itself, I'd like to hear about it. Because I don't think you ever will. The glibc people are so serious about backward and forward compatibility, you can in fact easily look up the last time they broke it: https://lwn.net/Articles/605607/
Now, if you're saying it's a dependency issue resulting from people specifying wrong glibc version constraints in their build… yeah, sure. I'm gonna say that happens because people are getting used to pinning dependency versions, which is so much the wrong thing to do with glibc it's not even funny anymore. Just remove the glibc pins if there are any.
As far as the toolchain as a whole is concerned… GCC broke compatibility a few times, mostly in C++ due to having to rework things to support newer C++ standards, but I vaguely remember there was a C ABI break somewhere on some architecture too.
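As a practical aside, you can see exactly which glibc symbol versions a binary actually requires (the thing a version pin would be standing in for) using binutils; `/bin/ls` here is just an example target:

```shell
# List the versioned glibc symbols a dynamically linked binary depends on.
# Every version printed here must be provided by the target system's glibc.
objdump -T /bin/ls | grep -o 'GLIBC_[0-9.]*' | sort -uV
```

Since glibc symbol versioning is strictly additive, a binary runs on any glibc at least as new as the highest version in that list, which is why pinning an exact glibc version is usually counterproductive.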
Glibc has been a source of breakage for proprietary software ever since I started using Linux. How many codebases had to add this line around 2014 (the year I bought my first laptop)?
When was the last time you actually used .NET? Because that's absolutely not how it is. The .NET runtime is shipped by default with Windows and updated via WU. Let alone that you're talking about .NET Framework, which has been outdated for years.
Since .NET 10 still doesn't support Type Libraries quite a few new Windows projects must be written in .NET Framework.
Microsoft sadly doesn't prioritize this so this might still be the case for a couple of years.
One thing I credit MS for is that they make it very easy to use modern C# features in .NET Framework. You can easily write new Framework assemblies with a lot of C# 14 features. You can also add a few interfaces and get most of it working (although not optimized by the CLR, e.g. Span). For an example see this project: https://www.nuget.org/packages/PolySharp/
It's also easy to target multiple frameworks with the same code, so you can write libraries that work in both .NET and .NET Framework programs.
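As a rough sketch of what that looks like in a library's .csproj (the target framework monikers are examples; PolySharp backfills the compiler-only types so that newer C# features compile on net48):

```xml
<PropertyGroup>
  <TargetFrameworks>net48;net8.0</TargetFrameworks>
  <LangVersion>latest</LangVersion>
</PropertyGroup>
<ItemGroup>
  <!-- Only matters for the older target; generates polyfills at compile time. -->
  <PackageReference Include="PolySharp" Version="1.*" PrivateAssets="all" />
</ItemGroup>
```

With that, `dotnet build` produces one assembly per target, and NuGet picks the right one for each consumer.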
This is one of the things that tilts me about C and C++ that has nothing to do with mem safety: The compile/build UX is high friction. It's a mess for embedded (No GPOS) too in comparison to rust + probe-rs.
That hasn't been my experience at all. Cross-compiling anything on Rust was an unimaginable pain (3 years or so ago). While GCC's approach of having different binaries for different targets does have its issues, cross compiling just works.
Well, traditionally, there was no Python/pip, JS/npm in Linux development, and for C/C++ development, the package manager approach worked surprisingly well for a long time.
However, there were version problems: some Linux distributions had only stable packages and therefore lacked the latest updates, and some had problems with multiple versions of the same library. This gave rise to the language-specific package managers. It solved one problem but created a ton of new ones.
Sometimes I wish we could just go back to system package managers, because at times, language-specific package managers do not even solve the version problem, which is their raison d'être.
Nix devShells work quite well for Python development (don't know about JS).
Nixpkgs is also quite up to date.
I haven't looked back since adopting Nix for my dev environments.
I went from Pop!_OS (Ubuntu) to EndeavourOS (Arch) because some random software with an AppImage or whatever refused to run with Ubuntu's “latest” glibc and it ticked me off. I just want to run more modern tooling. I haven't had any software I couldn't just run on Arch, going on over a year now.
Indeed. As late as 2 hours ago I had to change the way I build a private Tauri 2.0 app (bundled as .AppImage) because it wouldn't work on latest Kubuntu, but worked on Fedora and EndeavourOS. So now I have to build it on Ubuntu 22.04 via Docker. Fun fun.
Had fewer issues on EndeavourOS (Arch) compared to Fedora overall though... I will stay on Arch from now on.
.NET does have flags to include the necessary dependencies with the executable these days, so you can just run the .exe and don't need to install .NET on the host machine. Granted, that does increase the size of the app (not to mention adding a shitton of DLLs if you don't build as a single executable), but this at least is a solved problem.
They do now, after .NET Core and several other iterations. You'll also be shipping a huge executable compared to a framework-dependent .NET app (which can be surprisingly small).
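For reference, the knobs being discussed look roughly like this in a .csproj (a sketch; `win-x64` is just an example runtime identifier, and trimming helps with the size complaint at the cost of breaking reflection-heavy code):

```xml
<PropertyGroup>
  <SelfContained>true</SelfContained>          <!-- bundle the runtime with the app -->
  <RuntimeIdentifier>win-x64</RuntimeIdentifier>
  <PublishSingleFile>true</PublishSingleFile>  <!-- one .exe instead of a pile of DLLs -->
  <PublishTrimmed>true</PublishTrimmed>        <!-- drop unused framework code to cut size -->
</PropertyGroup>
```

The same properties can also be passed on the command line, e.g. `dotnet publish -c Release -r win-x64 --self-contained -p:PublishSingleFile=true`.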
>Toolchains on linux are not clear from dependency hell either - ever install an npm package that needs cmake underneath?
That seems more a property of npm dependency management than linux dependency management.
To play devil's advocate, the reason npm dependency management is so much worse than kernel/OS management is that its scope is much bigger: 100x more packages, each package smaller, super-deep dependency chains. OS package managers like apt/yum prioritize stability more and have a different process.
I have never experienced issues with pip, and I’m not sure whether that’s because I’m only doing things pip directly supports and avoiding the things it doesn’t help with.
I’d really love to understand why people get so mad about pip they end up writing a new tool to do more or less the same thing.
The counterpoint of this is Linux distros trying to resolve all global dependencies into a one-size-fits-nothing solution - with every package having several dozen patches trying to make a brand-new application release work with a decade-old release of libfoobar. They are trying to fit a square peg into a round hole and act surprised when it doesn't fit.
And when it inevitably leads to all kinds of weird issues the packagers of course can't be reached for support, so users end up harassing the upstream maintainer about their "shitty broken application" and demanding they fix it.
Sure, the various language toolchains suck, but so do those of Linux distros. There's a reason all-in-one packaging solutions like Docker, AppImage, Flatpak, and Snap have gotten so popular, you know?
The real kicker is when old languages also fall for this trap. The latest I'm aware of is GHC, which decided to invent its own build system and install script. I don't begrudge them moving away from Make, but they could have used something already established.
The purpose isn't information, the purpose is drama.
Er, sorry. I meant: the purpose isn't just drama—it's a declaration of values, a commitment to the cause of a higher purpose, the first strike in a civilizational war of independence standing strong against commercialism, corporatism, and conformity. What starts with a single sentence in an LLM-rewritten blog post ends with changing the world.
See? And I didn't even need an LLM to write that. My own brain can produce slop with an em dash just as well. :)
Humans invented writing, not LLMs. They are copying us not the other way around. You can’t jump on 1 sentence that vaguely sounds like an LLM and say it’s written by AI. It’s so silly. I understand the aversion to AI slop but this is not that.
people run on heuristics and no amount of our righteousness will change that. the entire article absolutely reeks of LLM style, so the original commenter isn't off the mark. to address your point, LLMs are copying that which leads to the most human engagement, so the way you expressed things makes it seem like you are defending junk food as real food. which of course it is, however it is designed to make someone money at the cost of human health. that's not something i'd be defending personally.
Had to do this back in 2018, because I worked with a client with no direct internet access on its DEV/build machines (and even when there was connectivity it was over traditional slow/high-latency satellite connections), so part of the process was also to build an offline install package.
Well - "run as admin" wasn't a problem for that scenario - as I was also configuring the various servers.
(And - it is better on a shared machine to have everything installed "machine-wide" rather than "per-user", same as PowerShell modules - I had another client recently who had a small "C:" drive provisioned on their primary geo-fenced VM used by their "cloud admin" team, and every single user was gobbling too much space with a multitude of "user-profile"-specific PowerShell modules...)
But - yes, even with a highly trimmed workload it resulted in an 80GB+ offline installer. ... and as a server admin, I also had physical data-center access to load that installer package directly onto the VM host server via external drive.
No thanks. I’m not going to install executables downloaded from an unknown GitHub account named marler8997 without even a simple hash check.
As others have explained the Windows situation is not as bad as this blog post suggests, but even if it was this doesn’t look like a solution. It’s just one other installation script that has sketchy sources.
You don't have to install executables downloaded from an unknown GitHub account named marler8997. You can download that script and read it just like any other shell script.
Just like those complaining about curl|sh on Linux, you are confusing install instructions with source code availability. Just download the script and read it if you want. The curl|sh workflow is no more dangerous than downloading an executable off the internet, which is very common (if stupid) and attracts no vitriol. In no way does it imply that you cannot actually download and read the script - something that actually can't be done with downloaded executables.
It is somewhat different when your system forces binaries to be signed... but yeah, largely agreed. The abject refusal of curl|sh is strange to me, unless the refusers are also die-hard GPL adherents. Binaries are significantly more opaque and easier to hide malware in, in almost all cases.
> You don't have to install executables downloaded from an unknown GitHub account named marler8997. You can download that script and read it just like any other shell script.
You do because the downloaded ZIP contains an EXE, not a readable script, that then downloads the compiler. Even if you skip that thinking "I already have VS set up", the actual build line calls `cl` from a subdirectory.
I'm not going to reconstruct someone's build script. And that's just the basic example of a one file hello world, a real project would call `cl` several times, then `link`, etc.
Just supplying a SLN + VCXPROJ is good enough. The blog post's entire problem is also solved by the .vsconfig[1] file that outlines requirements. Or you can opt for CMake. Both of these alternatives use a build system I can trust over randomgithubproject.exe, along with a text-readable build/project file I can parse myself to verify I can trust it.
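For anyone who hasn't seen one, a .vsconfig is just a small JSON file; the component IDs below are plausible examples (the installer's export function generates the authoritative list for your setup):

```json
{
  "version": "1.0",
  "components": [
    "Microsoft.VisualStudio.Component.VC.Tools.x86.x64",
    "Microsoft.VisualStudio.Component.Windows11SDK.22621",
    "Microsoft.VisualStudio.Component.VC.CMake.Project"
  ]
}
```

The installer prompts to install missing components when it finds this file next to the solution, and the same component IDs can be passed to the Build Tools bootstrapper with `--add` if you want the toolchain without the IDE.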
>The curl|sh workflow is no more dangerous that downloading an executable off the internet
It actually is, for a few subtle reasons - at least if you were otherwise going to check the executable's checksum or something, rather than blindly downloading and running either.
The big thing is that the server can serve up different contents if it detects it's being piped into a shell (which is possible, at least in theory), but also, if the download is interrupted, you can end up with half of the script run and a broken install.
If you are going to do this, its much better to do something like:
sh -c "$(curl https://foo.bar/blah.sh)"
Though ideally yes you just download it and read it like a normal person.
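To make the checksum point concrete, here's a minimal sketch of the "verify before you run" pattern (the file name and hash source are stand-ins; in reality the expected hash comes from the project's release page, not from the file itself):

```shell
# Stand-in for the download; in reality: curl -fsSLo install.sh https://example.com/install.sh
printf 'echo hello from installer\n' > install.sh

# In reality, paste this value from the project's release page.
expected="$(sha256sum install.sh | cut -d' ' -f1)"

# Refuse to continue unless the hash matches; catches truncated or swapped downloads.
echo "$expected  install.sh" | sha256sum -c - || exit 1

# Only run the script after it has been verified (and, ideally, read).
sh install.sh
```

This sidesteps both problems above: the file on disk is the file you checked, and a partial download fails the hash instead of half-running.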
I know Jonathan Marler for some of his Zig talks and his work in the win32 api bindings for Zig[0], they are even linked from Microsoft's own repo[1] (not sure why he has 2 github users/orgs but you can see it's the same person in the commits).
Is this post AI-written? The repeated lists with highlighted key points, the "it's not just [x], but [y]" and "no [a] just [b]" scream LLM to me. It would be good to know how much of this post and this project was human-built.
I was on the fence about such an identification. The first "list with highlighted key points" seemed quite awkward to me and definitely raised suspicion (the overall list doesn't have quite the coherence I'd expect from someone who makes the conscious choice; and the formatting exactly matches the stereotype).
But if this is LLM content then it does seem like the LLMs are still improving. (I suppose the AI flavour could be from Grammarly's new features or something.)
It's interesting... Different LLM models each seem to have a few sentence structures that they vastly overprefer. GPT seems to love "It's not just X, it's Y", Claude loves "The key insight is...", and Gemini, for me, in every second response, uses the phrase "X is the smoking gun". I hear the smoking gun phrase around 5 times a day at this point.
It's hated by everyone, so why would people imitate it? You're inventing a rationale that either doesn't exist or would be stupider than the alternative. The obvious answer here is they just used an LLM.
I love the style it was written in. I felt a bit like I was reading a detective novel, exploring all the terrible things that happened and waiting for a plot twist and a hero coming in to save the day.
No, they do it because they're mode-collapsed, use similar training algorithms (or even distillation on each other's outputs) and have a feedback loop based on scraping the web polluted with the outputs of previous gen models. This makes annoying patterns come and go in waves. It's pretty likely that in the next generation of models the "it's not just X, it's Y" pattern will disappear entirely, but another will annoy everyone.
This is purely an artifact of training and has nothing to do with real human writing, which has much better variety.
I came back around 2017*, expecting the same nice experience I had with VB3 to 6.
What a punch in the face it was...
I honestly cannot fathom anyone developing natively for Windows (or even OSX) in this day and age.
Anything will be a webapp, or a Rust+egui multi-platform app developed on Linux, or nothing. The amount of self-hate required for Android/iOS is already enough.
* Not sure of the exact date. It was right in the middle of the WPF crap being forced as "the new default".
What's exhausting is getting through a ten-paragraph article and realising there was only two paragraphs of actual content, then having to wade back through it to figure out which parts came from the prompt, and which parts were entirely made up by the automated sawdust injector.
I analyzed the text using Pangram, which is apparently reliable; it says "Fully human written" without ambiguity.[1]
I personally like the content and the style of the article. I never managed to accept going through the pain of installing and using Visual Studio and all these absurd procedures they impose on their users.
I wish open source projects would support MinGW, or at least not actively block its usage. It's a good compiler that provides excellent compatibility without the need for any extra runtime DLLs.
I don't understand how open source projects can insist on requiring a proprietary compiler.
There are some pretty useful abstractions and libraries that MinGW doesn't work with. Biggest example is the WIL[1], which Windows kernel programmers use and is a massive improvement in ergonomics and safety when writing native Windows platform code.
If you want to link against MSVC-built libraries (that are external/you don't have the source for), MinGW may not be an option. For example, you can get the Steamworks SDK to build with MinGW, but it will crash at runtime.
From the capitalization I can tell you and the parent might not be aware it's "Minimalist GNU for Windows", which I would tend to pronounce "min g w" and capitalize as "MinGW." I used to say ming. Now it's my little friend. Say hello to my little friend, mang.
If you need the Windows(/App) SDK too for the WinRT-features, you can add `winget install --id Microsoft.WindowsSDK.10.0.18362` and/or `winget install --id Microsoft.WindowsAppRuntime.1.8`
Having been the person that used to support those packages, it’s not that simple. You need to pass what workloads you need installed too, and if it’s a project you’re not familiar with god help you.
I used to just install the desktop development one and then work through the build errors until I got it to work, was somewhat painful. (Yes, .vsconfig makes this easier but it still didn’t catch everything when last I was into Windows dev).
I thought for a moment I was missing something here. I always just use winget for this sort of thing as well. It may kickoff a bunch of things, but it’s pretty low effort and reliable.
> What if you have two different project with different requirements at the same time?
Install multiple versions of the Windows SDK. They co-exist just fine; new versions don’t replace old ones. When I was an independent contractor, I had 4 versions of Visual Studio and 10 versions of the Windows SDK installed at once; different projects used different ones.
Windows SDKs, WDKs (driver dev), Visual Studio releases, and .NET SDKs all coexist peacefully on a machine. If a project build breaks due to newer SDKs, it's because it was configured with "use newest version". (Which is usually fine but sometimes requires pinning if you're using more "niche" things like WDK)
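Concretely, pinning away from "use newest version" amounts to a couple of properties in the .vcxproj (the version numbers are examples; both properties are standard MSBuild ones):

```xml
<PropertyGroup>
  <!-- Pin a specific Windows SDK instead of "use latest installed". -->
  <WindowsTargetPlatformVersion>10.0.19041.0</WindowsTargetPlatformVersion>
  <!-- Pin the MSVC toolset (v142 = VS 2019, v143 = VS 2022). -->
  <PlatformToolset>v143</PlatformToolset>
</PropertyGroup>
```

With those set, installing a newer SDK or VS version side by side leaves the project building against exactly what it did before.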
Nearly all of the Windows hate I see comes from 20-year-old takes. (The Bing/Cortana/Copilot/ads slop criticism is warranted, but that stuff is also easily disabled.)
For big C++ projects, the .vsconfig import/export way of handling Visual Studio components has worked well for the large teams I'm on. Tell someone to import a .vsconfig and the Visual Studio Installer does everything. Only times we've had issues is from forgetting to update it with components/SDK changes.
Yeah, seems like this is just ignorance around .vsconfig files. Makes life way easier. You can also just use the VS Build Tools exe to install things instead of the full VS installer, if you plan to use a different IDE.
Nitpick, "Windows Native Development" also refers to the NT native subsystem, which would be basically coding against private APIs instead of Win32. From the title I thought that's what this was. Then I realized it was about avoiding full use of Visual Studio when building C projects (something that a lot of people already do by the way)
It starts by not looking into Windows through UNIX developer glasses.
The only issue currently plaguing Windows development is the mess with WinUI and WinAppSDK since Project Reunion, however they are relatively easy to ignore.
>It starts by not looking into Windows through UNIX developer glasses.
People don't need any UNIX biases to just want multiple versions of MSVS to work the way Microsoft advertises. For example, with every new version of Visual Studio, Microsoft always says you can install it side-by-side with an older version.
But every time, the new version of VS has a bug in the install somewhere that changes something that breaks old projects. It doesn't break for everybody or for all projects but it's always a recurring bug report with new versions. VS2019 broke something in existing VS2017 installs. VS2022 broke something in VS2019. etc.
The "side-by-side-installs-is-supposed-to-work-but-sometimes-doesn't" tradition continues with the latest VS2026 breaking something in VS2022. E.g. https://github.com/dotnet/sdk/issues/51796
I once installed VS2019 side-by-side with VS2017 and when I used VS2017 to re-open a VS2017 WinForms project, it had red squiggly lines in the editor when viewing cs files and the build failed. I now just install different versions of MSVS in totally separate virtual machines to avoid problems.
I predict that a future VS2030 will have install bugs that break VS2026. The underlying issue that causes side-by-side bugs to re-appear is that MSVS installs are integrated very deeply into Windows: they put files in c:\windows\system32, etc. (And sometimes you also get random breakage with mismatched MSVCRT???.DLL files.) To avoid future bugs, Microsoft would have to re-architect how MSVS works -- or "containerize" it to isolate it more.
In contrast, gcc/clang can have more isolation without each version interfering with each other.
I'm not arguing this thread's msvcup.exe tool is necessary but I understand the motivations to make MSVS less fragile and more predictable.
Note that this also doesn't work on Linux - your system's package manager probably has no idea how to install and handle having multiple versions of packages and headers.
That's why docker build environments are a thing - even on Windows.
Build scripts are complex, and even though I'm pretty sure VS offers good support for having multiple SDK versions at the same time (I've used it), it only takes a single script that wasn't written with versioning in mind to break the whole build.
Why? You may end up with something that doesn't get much attention anymore, but none of the official gui approaches have ever been removed as far as I know. Win32, MFC, winforms, wpf, winui, maui are all still available and apps using them are functional. Even winjs still works apparently, even if it was handed over.
I wouldn't start an app in most of them today, but I wouldn't rewrite one either without a good reason.
> It’s so vast that Microsoft distributes it with a sophisticated GUI installer where you navigate a maze of checkboxes, hunting for which “Workloads” or “Individual Components” contain the actual compiler. Select the wrong one and you might lose hours installing something you don’t need.
I have a vague memory of stumbling upon this hell when installing the ldc compiler for dlang [1].
> On Linux, the toolchain is usually just a package manager command away. On the other hand, “Visual Studio” is thousands of components.
That package manager command, at the very least, pulls in 50+ packages of headers, compilers, and their dependencies from tens of independent projects, nearly each of them following its own release schedule. Linux distributions have it much harder orchestrating all of this, and yet it's Microsoft that cannot get its wholly-owned thing together.
Actually not that complicated: You simply check in a global.json [0] where you specify the sdk and workload versions.
Then you also specify target platform sdk versions in the .csproj file and VS will automatically prompt the developer to install the correct toolchain.
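A minimal global.json, for the curious (the SDK version is an example; `rollForward` controls how strictly the pin is honored):

```json
{
  "sdk": {
    "version": "8.0.100",
    "rollForward": "latestFeature"
  }
}
```

Check it in at the repo root and `dotnet` picks it up for every build under that directory.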
At $workplace, we have a script that extracts a toolchain from a GitHub actions windows runner, packages it up, stuffs it into git LFS, which is then pulled by bazel as C++ toolchain.
This is the more scalable way, and I assume it could still somewhat easily be integrated into a bazel build.
Keeping CI entirely out of windows desktop development is the biggest efficiency and cost improvement I've seen in the last 15 years. Our CI toolchain broke so we moved back to a release manager doing it manually. It takes him 20x less time to build it and distribute it (scripted) than it does to maintain the CI pipeline and his desktop machine is several times faster than any cloud CI node we can get hold of.
Edit: Uses a shit load less actual energy than full-building a product thousands of times that never gets run.
One day I decided to port my text editor to Windows. Since it depends on pcre2 and treesitter, these two libraries had to be provided by the system.
In the span of ~2 hrs I didn't manage to find a way to get the Zig compiler to notice "system" libraries to link against.
Perhaps I'm too spoiled by installing a system wide dependency in a single command. Or Windows took a wrong turn a couple of decades ago and is very hostile to both developers and regular users.
I think providing purely functional libraries as system dependencies tied to the toolchain of the time was the wrong decision by the Unix world.
The system libraries should only ship system stuff: interaction with the OS (I/O, graphics basics, process management), accessing network (DNS, IP and TLS). They should have stable APIs and ABIs.
Windows isn't hostile. It has a different paradigm, and Unix (or, more correctly, usually GNU/Linux) people do not want to give up their worldview.
PCRE is basically only your app's dependency. It has nothing to do with the rest of the operating system. So it is your responsibility to know how to build and package it.
If you depend on a library and can't figure out how you would compile against it, it's probably better for the end user that you don't make anything because you'll still need to package it up later unless you link statically.
I suspect the pitfall is in how you or the Zig compiler is linking. Unless you're involving things which vary by OS, like hardware interaction, networking, file systems, etc., you should not, with a new language in 2026, need to do anything special for cross-platform capabilities.
My understanding is that the "linkSystemLibrary" abstraction in build.zig only holds for Unix systems. And this in turn makes it impossible to build my program on Windows without modifying the build script.
I am not too into windows dev but I am currently using msvc at work. We are told to import a config file into the installer and it automatically selects all of the components any of our projects will need. Wouldn't that solve the problem too? Just distribute a project level config file and add documentation for how to import and install the stuff.
I wonder if Microsoft intentionally doesn't provide this first-party, to force everyone to install VS, especially the professional/enterprise versions. One could imagine that we'd have a vsproject.toml file similar to pyproject.toml that just does everything when combined with a minimal command line tool. But that doesn't exist for some reason.
Microsoft doesn't seem to care unless you're a company. That's the reason community edition is free. Individual licenses would be pennies to them, and they gain more than that by having a new person making things in their ecosystem. It's in their interest to make their platform accessible as possible.
I am not exactly brimming with eagerness and time to contribute to an open source project, but the few times I have looked, I stopped at the "how do I configure my dev environment to match" step.
Just give me a VM. Then you will know, and I will know, every facet of the environment the work was done in.
The ironic part is that Visual Studio may be the best product Microsoft has ever made. Compared to the rest of their offerings, it is nothing short of amazing. It boggles the mind to know that this was developed in-house - well, most of it anyway.
Wow, such great work. I myself have been struggling with MinGW just to compile from source. Of course it works much more cleanly than the hated Visual Studio, but then when it comes to CUDA compilation, that's it.
Visual Studio, like the majority out there, is invasive and full of bloatware like you say.
Same struggle with Electron.
How do you match it with CUDA to compile the repos from source?
We manage Visual Studio on our CI machines using Ansible. Chocolatey installs the full Visual Studio and then we use the APIs provided to manage components via Ansible. See our action here: https://galaxy.ansible.com/ui/repo/published/kitware/visuals...
I don't get why people go through all these flaming hoops and hurdles to deal with MSVC when MinGW and MinGW-w64/MSYS2 are options. In the latter case you even still get (mostly complete) MSVC ABI-compatibility if you compile with clang.
MinGW and MinGW-w64/MSYS2 are just as inscrutable, fragile and new-user-hostile. The fact that you have to choose between MinGW (which has a 64-bit version) and MinGW-w64 (a completely separate codebase maintained by different people, as far as I can tell) is just the first in a long obstacle course of decisions, traps, and unexplained acronyms/product names. There are dozens of different versions, pre-built toolchains and packages to throw you off course if you choose the wrong one.
If you're just a guy trying to compile a C application on Windows, and you end up on the mingw-w64 downloads page, it's not exactly smooth sailing: https://www.mingw-w64.org/downloads/
MinGW/MSYS2 are flaming poop hurdles. They're the bending-over-backwards to fake a hacky-ass bad dev environment. Projects that only support MinGW on Windows are projecting “don’t take Windows seriously”.
Supporting Windows without MinGW garbage is really really easy. Only supporting MinGW is saying “I don’t take this platform seriously so you should probably just ignore this project”.
As someone who has been doing Win32 development for literally decades, I'm not particularly convinced this is a problem that needs more code to solve. You don't need VS to get the compiler (which is available as a separate download called something like "build tools", I believe); and merely unpacking the download and setting a few environment variables is enough to get it working. It's easy to create a portable package of it.
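As a hedged sketch of that "unpack and set a few environment variables" approach (the directory layout below is illustrative, not the real SDK tree; cl.exe and link.exe pick up headers and libraries from the INCLUDE, LIB, and PATH variables):

```bat
:: Sketch only -- adjust paths/versions to wherever you unpacked the toolchain.
set "MSVC=C:\toolchains\msvc"
set "SDK=C:\toolchains\winsdk"
set "PATH=%MSVC%\bin\Hostx64\x64;%PATH%"
set "INCLUDE=%MSVC%\include;%SDK%\Include\ucrt;%SDK%\Include\um;%SDK%\Include\shared"
set "LIB=%MSVC%\lib\x64;%SDK%\Lib\ucrt\x64;%SDK%\Lib\um\x64"
cl /nologo hello.c
```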
Last I checked the license for the headless toolchain requires that a full licensed copy of Visual Studio be installed somewhere. So I think this violates the license terms.
A bug got opened against rustup at some point about it installing the headless toolchain by itself. I'll see if I can find it.
I was just setting up a new machine and was setting up the Rust environment. The very first thing rustup-init asked was to install Visual Studio before proceeding. It was like 20-30gb of stuff installed before moving forward.
This tool would have been a great help if I had known about it beforehand.
I don't really use Windows OS much, but why not just use MinGW? Then you have Clang on all platforms you can think of: Android, all the various Darwin flavors and of course Linux and Windows; as well as on platforms you can't think of like FreeBSD or even Haiku maybe? Like honestly what's the point of supporting MSVC at all?? Maybe I'm just not enough of a Windows nerd to understand? (so I'm basically wondering if mingw has any drawbacks)
If you have a self-contained project, where you don't depend on anyone else and others don't depend on you, MinGW works great. Problems arise when you have dependencies that don't work with it. I'd love to see if MinGW could find a way to be binary compatible with MSVC-compiled binaries. Right now it's kind of an all or nothing solution which makes it hard to adopt.
Ah, binary-only dependencies, right… That's very specific though, so unless you need to drive some proprietary hardware, why bother using stuff that forces you into MSVC hell lol? Also wouldn't LLVM based MinGW benefit from Clang's MSVC compat? Not sure about this at all, that's why I'm asking, heh… ^^"
> You spend less time on your project because you’re too busy being a human-powered dependency resolver for a 50GB IDE.
Really? A 50 GB IDE? How the heck does one even know what goes in there?
My beloved FreeBSD 15.0 PLUS its Linux VM PLUS its Docker env PLUS its dependencies and IDE come to close to 26 GB, and I'm pretty sure I'm counting a lot of things I shouldn't, so the actual figure is much lower.
Developing software on a Windows platform is something I haven't been able to understand for many, many years.
I'll just keep using Mārtiņš Možeiko's script, portable-msvc.py, that this tool is based upon. It does everything this does, except a lock file and the autoenv. I'm not particularly interested in the former, and definitely not the latter.
As someone who is out of the loop on Windows development, is this related to the Windows Driver Kit (WDK, I think it used to be DDK)? That's a certain type of hell I don't wish upon most.
No one should use any of these weird Frankenstein monstrosities in 2026. And a batch script? :( PowerShell exists.
Install:
- contrary to the blog post, the entirety of Visual Studio, because the IDE and debugger is *really damn good*.
- LLVM-MinGW[1]
Load the 'VSDevShell' DLL[2] for PowerShell, and you're good to go, with three different toolchains now:
cl.exe from VS
clang-cl.exe - you don't need to install this separately in VS; just use the above-mentioned llvm-mingw clang.exe as `clang.exe --driver-mode=cl /winsysroot <path\to\Windows SDK> /vctoolsdir <path\to\VC>`. Or you can use it in GNU-driver-style mode with -Xmicrosoft-windows-sys-root. Either way it targets the Windows ABI and links against the Windows SDK/VC tools.
`clang.exe` that targets the Itanium ABI and links against the MinGW libraries and LLVM libc++.
Done and dusted. Load these into a CMake toolchain and never look at them again.
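For the "load these into a CMake toolchain" step, a minimal toolchain-file sketch might look like this (all paths are placeholders for your llvm-mingw and portable SDK locations):

```cmake
# toolchain-clang-cl.cmake -- hypothetical paths; adjust to your installs.
set(CMAKE_SYSTEM_NAME Windows)
set(CMAKE_C_COMPILER   "C:/llvm-mingw/bin/clang-cl.exe")
set(CMAKE_CXX_COMPILER "C:/llvm-mingw/bin/clang-cl.exe")
# clang-cl locates headers and libraries via /winsysroot instead of vcvars.
add_compile_options("/winsysroot" "C:/winsdk")
add_link_options("/winsysroot:C:/winsdk")
```

Then configure once with `cmake -B build -DCMAKE_TOOLCHAIN_FILE=toolchain-clang-cl.cmake` and, as above, never look at it again.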
People really like overcomplicating their lives.
At the same time, learn the drawbacks of all toolchains and use what is appropriate for your needs. If you want to write Windows drivers, then forget about anything non-MSVC (unless you really want to do things the hard way for the hell of it). link.exe is slow as molasses, but can do incremental linking natively. cl.exe's code gen is (sometimes) slightly worse than Clang's. The MinGW headers do not understand things like SAL annotations[3], and this breaks very useful libraries like WIL[4] (or libraries built on top of them, like the Azure C++ SDK[5]). The MinGW headers also sometimes straight up miss newer features that the Windows SDK comes with, like cfapi.h[6].
LLVM-MinGW sounds external to Microsoft, though. I think the blog focused on in-Microsoft solutions. And I am not sure the "contrary to the blog post" point holds - compared to Linux, the Microsoft stack is much more annoying to install. I installed it, but it was annoying to no end and took ages.
> compared to Linux, the Microsoft stack is much more annoying to install.
Not really. It's just different. As a cross-platform dev, all desktop OSs have their own idiosyncrasies that add up to a net of 'they are all equally rather bad'.
I dunno, it has its uses when porting software written for UNIX-first. Plus, I pointed out Clang, rather than GCC, because Clang is natively a cross-compiler. I don't like to be dogmatic about stuff; if it's useful then it's useful. If it isn't then I will say why (as I explained why there's no need for MSYS2/Cygwin below).
It's been 14 years since I've used MSVC for anything real. IIRC the philosophy back then was yearly versioned releases with rolling intermediate updates.
This seems to go down the road of attempting deterministic-ish builds, which I think is probably a bad idea, since the whole ecosystem is built on rolling updates, and a partial move towards pinning dependencies (using bespoke tools) could get complicated.
I don't understand - just use the SciTE editor with TCC. It's about a couple of megs to download, no install required, and your apps will run on everything from Win 98 to Linux with Wine. And if the answer is C++ support, then you get all the pain you deserve for using that cursed language.
I literally came to post the exact same line as my indicator that this was AI-generated. I ctrl-f'd first and sure enough I'm not alone in using 'key insight' as the canary.
Another option is to explore winget and Chocolatey. Most build tools and compilers can be installed via the command line on Windows. Ask your favorite LLM to create a PowerShell script to install them all.
To me it seems as if Microsoft deliberately makes it harder to be a software developer on its platform. I installed all the required things and compiled on Windows too, but it is very annoying compared to Linux. Microsoft should simply have ONE default build - e.g. "download this and 80% of developers will be happy". No need for a gazillion checkboxes.
Windows Native is fine. People in that space are comfortable with it.
What needs to be fixed is the valley between unix and windows development for cross-os/many-compiler builds, so one that does both can work seamlessly.
It's not an easy problem and there are lots of faux solutions that seem to fix it all but don't (in builds, the devil is in edge cases).
I seriously doubt that people who get confused by the MSVC++ Installer will be able to handle a CLI app that installs a mystery MSVC++ toolchain version to a versioned directory. They're still going to click the Visual Studio icon on their desktop and scratch their head why your script didn't magically fix their problems.
Say what you want about coding agents, when the cost of writing code goes to near-zero, the cost of wrangling tools becomes a much bigger fraction of development effort. This is an amazing opportunity to address long-standing frictions.
Trollish usernames aren't allowed on HN, so we've banned this account*. If you want to pick a different username that isn't trollish, we can rename the account and unban it. It would be best to email hn@ycombinator.com for this, to make sure we get the message.
(It would have been better for us to catch this sooner, but in this case someone had to explain the name to me. Out of respect for HN's many Francophone readers, I think it's best to apply the rule.)
TLDR: I don't understand my native command line, see how lost I got when I tried to do my thing in a different environment.
- Not a problem unique to Windows or even MSVC; he's gonna hate Xcode,
- Making Python a bootstrap dependency = fail,
- Lacks the self-awareness to recognize aversion vs. avoidance,
My background is distinctly non-Windows, but I survive around Windows so well that people think I'm a Mickeysoft type. And no, I don't use mingw, cygwin, ...
If any of the obstacles this user faced were legitimate, nobody would ever make any money on Windows, including and especially Microsoft - a company whose developers have the same challenges.
I'm being harsh because _mea quondam culpa_ and it's correctable.
Everything this user went thru is the result of aversion instead of avoidance.
To _avoid_ long deep dives into Windows, you need to recognize there is a different vocabulary and a radically different jargon dialect at play.
1. Learn a tiny minimum of PowerShell; it's based on the same POSIX spec as bash and zsh, but like Python, JavaScript, etc., instead of the byte as the fundamental unit, it uses objects. So there's less to learn to reach a greater level of convenience than soiling yourself with DOS/CMD/BAT. On Windows, pwsh has a default set of Linux-like aliases to minimize the learning required for minimal operability. And you never have to type \ instead of / as a directory separator.
2. Microsoft make money from training. To sell their meat-free steak (* ingredient: saw dust), they feed the suits an all-you-can-eat calorie, nutrition, and protein free buffet of documenting everything in great detail and routinely "streamlining" the names and terminology.
Development on Windows is in a different reference frame, but relative to their own reference frames, they're ultimately not all that different.
Approach in your "foreign language" mindset; English alphabet but the words mean different things.
3. What, not how. "How do I grep" means you are trying to random-access bytes out of a random-access character stream. "What's the command to search for text in files?" - well, if you're bloody-mindedly using cmd, then it's "find".
4. Seriously, learn a little Powershell.
I only approached Powershell hoping to gain material for a #SatansSphincter anti-ms rant while using it as a Rosetta Stone for porting shell scripts in our CI for Windows.
I mean, it is based on the same POSIX spec as sh, bash, and zsh, with a little Perl thrown in. That can't not go horribly, insidiously, 30 Rock wrong in the hands of MS, right?
Turned out, it's the same paradigm shift perl/shell users have to make when coming into Python:
from `system("ps | grep hung")` to `"hung" in system("ps")`;
from `system("ifconfig -a | sed 's/\<192\.168\.0\./10.0.0./g'")` to `system("ifconfig -a").replace("192.168.0.", "10.0.0.")`
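The same shift, sketched as runnable Python (process names are made up; `subprocess.run` stands in for the `system(...)` shorthand above):

```python
import subprocess
import sys

# Byte-stream style would spawn an extra filter process:  ps | grep hung
# Object/expression style captures the output once, then filters in-language.
out = subprocess.run(
    [sys.executable, "-c", "print('proc1 running'); print('proc2 hung')"],
    capture_output=True, text=True,
).stdout
hung = [line for line in out.splitlines() if "hung" in line]
print(hung)  # ['proc2 hung']
```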
`grep` is a command that applies an assumption to a byte stream, often the output of a command.
In powershell, executing a command is an expression. In the case of a simple command, like "ps", that expression resolves to a String, just like system(...) does in Python.
Learning even a small amount of Powershell is immensely helpful in better understanding your enemy if you're going to have to deal with Windows. The formal names for official things use "verb-singularnoun".
That last part of the convention is the magic: the naming of things on Windows is madness designed to sell certifications, so crazy even MS ultimately had to provide themselves a guide.
I'm just asking, but is there really a need for native programs anymore? Where I worked a decade ago, we started porting all our native programs over to the browser, and that was when the MVC beta was just being released. At this point, with Electron and Tauri, is there even a need to write a native program?
Now, with AI, I would think that porting a native program to the browser wouldn't be the chore it once was.
Yes, very definitely. There has always been a need for high performance native applications. Even in the beginning of the desktop computing revolution, these questions have been asked .. and yes, there is a balance between native and cloud/browser-based computing - some of it is personal, much of it is industrial and corporate, and yet more of the spectrum where both methods are applicable exists, even still, decades later.
> is there really a need for a native programs anymore
As long as you don't give a shit about the fact that your baseline memory consumption is now 500MB instead of 25MB, and that 80% of your CPU time is wasted on running javascript through a JIT and rendering HTML instead of doing logic, no.
If you don't give a shit about your users or their time, there's indeed no longer a need to write native programs.
I use COM and DLLs to extend software/automate. Using Visual Studio gives me some really nice debugging options.
I did try using python and js but the variable explorer is garbage due to 'late binding'.
I thought this was just my ignorance, but I've asked experts, AI, and google searched and they unfortunately agree. That said, some people have created their own log/prints so they don't need to deal with it.
At the risk of being that guy, I haven't had any issues onboarding people onto native projects written in Rust. rustup does a great job of fetching the required toolchains without issue. I'd imagine the same is also true of Go or Zig.
While Microsoft <3 Rust, there is still a tooling-parity gap to close versus Visual Studio's capabilities for .NET, Python, and C++:
Incremental compilation and linking, parallel builds, hot code reloading, a REPL, graphical debugging of optimised builds, GPU debugging...
Go is better left for devops stuff like Docker and Kubernetes, and it remains to be seen whether Zig becomes industry-relevant beyond HN and Reddit forums.
I'm pretty sure people who write and build C++ on Windows do it for good reasons, often reasons that are out of their control. Your comment is not going to make any difference.
You have to do this for certain rust things too. I can't remember which, but I inevitably run into a need to install the MSVC toolchain to compile rust. I think it might be related to FFI, or libs which use FFI? The same thing comes up in Linux, but the process to install it is different.
I got anxiety reading the article, describing exactly why it sucks. It's nice to know from the article and comments here there are ways around it, but the way I have been doing it was the "hope I check the right checkboxes and wait a few hours" plan. There is usually one "super checkbox" that will do the right things.
I have to do this once per OS [re]install generally.
This is harder than what I do. Just install LTSC Visual Studio build tools from [1], then chuck this in a cmd file:
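The cmd file isn't shown, but a minimal sketch of what such a wrapper typically contains (assuming the default Build Tools 2022 install path) would be:

```bat
@echo off
:: Sketch only: load the MSVC environment, then invoke the compiler.
call "C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Auxiliary\Build\vcvars64.bat"
cl /nologo /W4 main.c
```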
I've built a load of utilities that do that just fine. I use vim as an editor.
The Visual Studio toolchain does have LTSC and stable releases - no one seems to know about them though. see: https://learn.microsoft.com/en-gb/visualstudio/releases/2022... - you should use these if you are not a single developer and have to collaborate with people. Back like in the old days when we had pinned versions of the toolchain across whole company.
[1] https://download.visualstudio.microsoft.com/download/pr/5d23...
> The Visual Studio toolchain does have LTSC and stable releases - no one seems to know about them though.
You only get access to the LTSC channel if you have a license for at least Visual Studio Professional (Community won't do it); so a lot of hobbyist programmers and students are not aware of it.
On the other hand, its existence is in my experience very well-known among people who use Visual Studio for work at some company.
You can install the LTSC toolchain without a license. Just not the IDE.
The Visual Studio Build Tools are installable with winget (`winget search buildtools`).
There are licensing constraints, IANL but essentially you need a pro+ license on the account if you're going to use it to build commercial software or in a business environment.
I worked with VC++ 6.0 up until Windows 11 when it really, really wouldn't run any more, then switched to VS 2008. The code is portable across multiple systems so it didn't really matter which version of VS it's developed with, and VC++ 6.0 would load, build the project, and have it ready to run while VS 2022 was still struggling through its startup process.
VS 2008 is starting to show the elephantine... no, continental land-mass bloat that VS is currently at, and has a number of annoying bugs, but it's still vastly better than anything after about VS 2012. And the cool thing is that MS can't fuck with it any more. When I fire up VS tomorrow it'll be the exact same VS I used today, not with half a dozen features broken, moved around, gone without a trace, ...
Yeah recent VS is awful. I recently tried VS2022. What a mess.
I noticed Visual Studio 2026 doesn't have an LTSC release yet. Any idea when that will come out?
They've completely reworked release plans. 2026 LTSC will come out a year after the initial VS 2026 release (at the same time as VS 2027) and be supported for 1 more year. You pretty much have to get on the rolling updates train for the IDE, which is why the C++ toolchain now follows a different schedule and you're supposed to be able to install any specific toolchain side by side.
Toolchains on Linux are not free from dependency hell either - ever installed an npm package that needs cmake underneath? glibc dependencies that can't be resolved because you need two different versions simultaneously in the same build somehow... Python is in another realm here as well. That shiny C++ project that needs a bleeding-edge Boost version that is about 6 months away from being included in your package manager. Remember patching OpenSSL when Heartbleed came around (libssHELL).
Visual Studio is a dog, but at least it's one dog - the real hell on Windows is .NET Framework. The sheer incongruity of which version of Windows has which version of .NET Framework installed, and which version of .NET your app will run on when launched... The actual solution at scale for universal Windows compatibility for your .NET app is to build a C++ shim that checks for .NET beforehand and executes the app with the correct version in the event of a multiple-version conflict - you can literally have 5 fully unique runtimes sharing the same .NET target.
> glibc dependencies that can't be resolved because you need two different versions simultaneously in the same build somehow...
If you somehow experience an actual dependency issue that involves glibc itself, I'd like to hear about it. Because I don't think you ever will. The glibc people are so serious about backward and forward compatibility, you can in fact easily look up the last time they broke it: https://lwn.net/Articles/605607/
Now, if you're saying it's a dependency issue resulting from people specifying wrong glibc version constraints in their build… yeah, sure. I'm gonna say that happens because people are getting used to pinning dependency versions, which is so much the wrong thing to do with glibc it's not even funny anymore. Just remove the glibc pins if there are any.
As far as the toolchain as a whole is concerned… GCC broke compatibility a few times, mostly in C++ due to having to rework things to support newer C++ standards, but I vaguely remember there was a C ABI break somewhere on some architecture too.
> The glibc people are so serious about backward and forward compatibility, you can in fact easily look up the last time they broke it
What? There was a huge breakage literally last year: https://sourceware.org/bugzilla/show_bug.cgi?id=32653
Glibc has been a source of breakage for proprietary software ever since I started using Linux. How many codebases had to add this line around 2014 (the year I bought my first laptop)?
When was the last time you actually used .NET? Because that's absolutely not how it is. The .NET runtime is shipped by default with Windows and updated via WU. Let alone that you're talking about .NET Framework, which has been outdated for years.
.NET runtime is not shipped with Windows, but once installed can be updated by WU.
Only the latest .NET Framework 4.8 is shipped with Windows at this point.
The issue is in supporting older windows versions - which sadly is still a reality for most large-scale app developers.
.NET versions go out of support faster than .NET Framework 4.8 does.
Which has been fixed on .NET 5 and later.
.NET Framework should only be used for legacy applications.
Unfortunately there are still many around that depend on .NET Framework.
Since .NET 10 still doesn't support Type Libraries quite a few new Windows projects must be written in .NET Framework.
Microsoft sadly doesn't prioritize this so this might still be the case for a couple of years.
One thing I credit MS for is that they make it very easy to use modern C# features in .NET Framework. You can easily write new Framework assemblies with a lot of C# 14 features. You can also add a few interfaces and get most of it working (although not optimized by the CLR, e.g. Span). For an example see this project: https://www.nuget.org/packages/PolySharp/
It's also easy to target multiple framework with the same code, so you can write libraries that work in .NET programs and .NET Framework programs.
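A sketch of the MSBuild fragment that enables this (the framework monikers are examples):

```xml
<!-- Build the same library for .NET Framework 4.8 and modern .NET. -->
<PropertyGroup>
  <TargetFrameworks>net48;net8.0</TargetFrameworks>
  <LangVersion>latest</LangVersion>
</PropertyGroup>
```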
.NET Framework 4.8 has a longer life cycle than the current .NET version.
.NET Framework 5 or .NET Core 5?
This is one of the things that tilts me about C and C++ that has nothing to do with memory safety: the compile/build UX is high-friction. It's a mess for embedded (no GPOS) too, in comparison to Rust + probe-rs.
That hasn't been my experience at all. Cross-compiling anything in Rust was an unimaginable pain (3 years or so ago). While GCC's approach of having different binaries for different targets does have its issues, cross-compiling just works.
Well, traditionally, there was no Python/pip, JS/npm in Linux development, and for C/C++ development, the package manager approach worked surprisingly well for a long time.
However, there were version problems: some Linux distributions had only stable packages and therefore lacked the latest updates, and some had problems with multiple versions of the same library. This gave rise to the language-specific package managers. It solved one problem but created a ton of new ones.
Sometimes I wish we could just go back to system package managers, because at times, language-specific package managers do not even solve the version problem, which is their raison d'être.
Nix devShells work quite well for Python development (I don't know about JS), and Nixpkgs is also quite up to date. I haven't looked back since adopting Nix for my dev environments.
I went from Pop!_OS (Ubuntu) to EndeavourOS (Arch) because some random software with an AppImage or whatever refused to run with Ubuntu's "latest" glibc, and it ticked me off. I just want to run more modern tooling; I haven't had any software I couldn't just run on Arch, going on over a year now.
Indeed. As recently as 2 hours ago I had to change the way I build a private Tauri 2.0 app (bundled as an .AppImage) because it wouldn't work on the latest Kubuntu, but worked on Fedora and EndeavourOS. So now I have to build it on Ubuntu 22.04 via Docker. Fun fun.
Had fewer issues on EndeavourOS (Arch) compared to Fedora overall though... I will stay on Arch from now on.
.NET does have flags to include the necessary dependencies with the executable these days, so you can just run the .exe and don't need to install .NET on the host machine. Granted, that does increase the size of the app (not to mention adding a shit-ton of DLLs if you don't build as a single executable), but this at least is a solved problem.
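For reference, the publish flags being described look roughly like this (the runtime identifier and project are assumptions):

```shell
# Self-contained, single-file publish: no .NET runtime needed on the target.
dotnet publish -c Release -r win-x64 --self-contained true -p:PublishSingleFile=true
```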
They do now, after .NET Core and several other iterations. You'll also be shipping a huge executable compared to a framework-dependent .NET app (which can be surprisingly small).
>Toolchains on linux are not clear from dependency hell either - ever install an npm package that needs cmake underneath?
That seems more a property of npm dependency management than linux dependency management.
To play devil's advocate, the reason npm dependency management is so much worse than kernel/OS package management is that its scope is much bigger: 100x more packages, each package smaller, super-deep dependency chains. OS package managers like apt/yum prioritize stability more and have a different process.
.NET has been able to ship the runtime with your app for years.
> python in another realm here as well
uv has more or less solved this (thank god). Night-and-day difference from pip (or honestly any of the other attempts to fix it).
At this point they should just deprecate pip.
I have never experienced issues with pip, and I'm not sure whether that's because I'm only doing things pip directly supports and avoiding things it doesn't help with.
I’d really love to understand why people get so mad about pip they end up writing a new tool to do more or less the same thing.
Ah yes let's all depend on some startup that will surely change the license at some point.
> Toolchains on linux are not clear from dependency hell either - ever install an npm package.
That's where I stopped.
Toolchains on linux distributions with adults running packaging are just fine.
Toolchains for $hotlanguage where the project leaders insist on reinventing the packaging game, are not fine.
I once again state these languages need to give up the NIH and pay someone mature and responsible to maintain packaging.
The counterpoint of this is Linux distros trying to resolve all global dependencies into a one-size-fits-nothing solution - with every package having several dozen patches trying to make a brand-new application release work with a decade-old release of libfoobar. They are trying to fit a square peg into a round hole and act surprised when it doesn't fit.
And when it inevitably leads to all kinds of weird issues the packagers of course can't be reached for support, so users end up harassing the upstream maintainer about their "shitty broken application" and demanding they fix it.
Sure, the various language toolchains suck, but so do those of Linux distros. There's a reason all-in-one packaging solutions like Docker, AppImage, Flatpak, and Snap have gotten so popular, you know?
The real kicker is when old languages also fall into this trap. The latest I'm aware of is GHC, which decided to invent its own build system and install script. I don't begrudge them moving away from Make, but they could have used something already established.
> The build.bat above isn’t just a helper script; it’s a declaration of independence from the Visual Studio Installer.
I am so fed up with this! Please if you're writing an article using LLMs stop writing like this!
I never understood this sentence structure; it adds zero information. It always goes like:
“This isn’t just [what the thing literally is]; it’s [hyperbole on what the thing isn’t].”
It’s a perfectly fine sentence structure. It’s been around for years and years. That’s why LLMs use it!
In the UK, Marks and Spencer have a long-running ad campaign built around it (“it’s not just food, it’s...”)
Em dashes are fine too.
1 reply →
The purpose isn't information, the purpose is drama.
Er, sorry. I meant: the purpose isn't just drama—it's a declaration of values, a commitment to the cause of a higher purpose, the first strike in a civilizational war of independence standing strong against commercialism, corporatism, and conformity. What starts with a single sentence in an LLM-rewritten blog post ends with changing the world.
See? And I didn't even need an LLM to write that. My own brain can produce slop with an em dash just as well. :)
Humans invented writing, not LLMs. They are copying us, not the other way around. You can't jump on one sentence that vaguely sounds like an LLM and say it was written by AI. It's so silly. I understand the aversion to AI slop, but this is not that.
People run on heuristics, and no amount of our righteousness will change that. The entire article absolutely reeks of LLM style, so the original commenter isn't off the mark. To address your point: LLMs copy whatever leads to the most human engagement, so the way you expressed things makes it seem like you are defending junk food as real food - which of course it is, except it is designed to make someone money at the cost of human health. That's not something I'd defend, personally.
For vibe-writing, the vibes aren't even that good!
It's so common, I wonder why no one has made an extension that filters this AI slop.
I'm really considering it
While this is great, the Visual Studio installer has a set of command-line parameters for unattended installs.
You can then build a script/documentation that isolates your specific requirements and workloads:
https://learn.microsoft.com/en-us/visualstudio/install/use-c...
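A hedged sketch of such an unattended invocation, using the C++ build-tools workload ID from the linked docs (the install path is an example):

```bat
:: Quiet, unattended install of just the C++ build tools workload.
vs_buildtools.exe --quiet --wait --norestart --nocache ^
    --installPath C:\BuildTools ^
    --add Microsoft.VisualStudio.Workload.VCTools --includeRecommended
```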
Had to do this back in 2018, because I worked with a client with no direct internet access on its DEV/build machines (and even when there was connectivity, it was over traditional slow, high-latency satellite connections), so part of the process was also to build an offline install package.
I tried this once. It downloaded way more stuff than needed and still required admin to actually install.
Well - "run as admin" wasn't a problem for that scenario - as I was also configuring the various servers.
(And - it is better on a shared-machine to have everything installed "machine-wide" rather than "per-user", same as PowerShell modules - had another client recently who had a small "C:" drive provisioned on their primary geo-fenced VM used for their "cloud admin" team and every single user was gobbling too much space with a multitude of "user-profile" specific PowerShell modules...)
But - yes, even with a highly trimmed workload it resulted in an 80 GB+ offline installer. ... And as a server admin, I also had physical data-center access to load that installer package directly onto the VM host server via an external drive.
(ugh - "high-latency" connections...)
Looking at the script:
> curl -L -o msvcup.zip https://github.com/marler8997/msvcup/releases/download/v2026...
No thanks. I’m not going to install executables downloaded from an unknown GitHub account named marler8997 without even a simple hash check.
As others have explained the Windows situation is not as bad as this blog post suggests, but even if it was this doesn’t look like a solution. It’s just one other installation script that has sketchy sources.
You don't have to install executables downloaded from an unknown GitHub account named marler8997. You can download that script and read it just like any other shell script.
Just like those complaining about curl|sh on Linux, you are confusing install instructions with source-code availability. Just download the script and read it if you want. The curl|sh workflow is no more dangerous than downloading an executable off the internet, which is very common (if stupid) and attracts no vitriol. In no way does it imply that you cannot actually download and read the script - something that actually can't be done with downloaded executables.
It is somewhat different when your system forces binaries to be signed... but yeah, largely agreed. The abject refusal of curl|sh is strange to me, unless the refusers are also die-hard GPL adherents. Binaries are significantly more opaque and easier to hide malware in, in almost all cases.
> You don't have to install executables downloaded from an unknown GitHub account named marler8997. You can download that script and read it just like any other shell script.
You do because the downloaded ZIP contains an EXE, not a readable script, that then downloads the compiler. Even if you skip that thinking "I already have VS set up", the actual build line calls `cl` from a subdirectory.
I'm not going to reconstruct someone's build script. And that's just the basic example of a one-file hello world; a real project would call `cl` several times, then `link`, etc.
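For the record, a hand-rolled multi-file MSVC build from a developer prompt looks roughly like this (file names are hypothetical):

```shell
:: Compile each translation unit to an object file, then link them.
cl /nologo /c /EHsc main.cpp util.cpp
link /nologo main.obj util.obj /OUT:app.exe
```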
Just supplying a SLN + VCXPROJ is good enough. The blog post's entire problem is also solved by the .vsconfig[1] file that outlines requirements. Or you can opt for CMake. Both of these alternatives use a build system I can trust over randomgithubproject.exe, along with a text-readable build/project file I can parse myself to verify I can trust it.
1: https://learn.microsoft.com/en-us/visualstudio/install/impor...
>The curl|sh workflow is no more dangerous that downloading an executable off the internet
It actually is, for a lot of subtle reasons, assuming you would otherwise have checked the executable's checksum or something rather than blindly downloading and running a script.
The big thing is that the server can serve you up different contents if it detects the script is being piped into a shell, which is possible in practice; and also, if the download is interrupted, you end up with half of the script run and a broken install.
If you are going to do this, it's much better to do something like:
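Something along these lines (the URL is a placeholder):

```shell
# Download to a file first, so an interrupted transfer can't half-execute,
# then read it before running it.
curl -fsSL -o install.sh https://example.com/install.sh
less install.sh
sh ./install.sh
```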
Though ideally yes you just download it and read it like a normal person.
I know Jonathan Marler from some of his Zig talks and his work on the Win32 API bindings for Zig[0]; they are even linked from Microsoft's own repo[1] (not sure why he has 2 GitHub users/orgs but you can see it's the same person in the commits).
[0] https://github.com/marlersoft/zigwin32 [1] https://github.com/microsoft/win32metadata
I would guess one of his accounts is his corporate employee account and his other is personal.
Is this post AI-written? The repeated lists with highlighted key points, the "it's not just [x], but [y]" and "no [a] just [b]" scream LLM to me. It would be good to know how much of this post and this project was human-built.
I was on the fence about such an identification. The first "list with highlighted key points" seemed quite awkward to me and definitely raised suspicion (the overall list doesn't have quite the coherence I'd expect from someone who makes the conscious choice; and the formatting exactly matches the stereotype).
But if this is LLM content then it does seem like the LLMs are still improving. (I suppose the AI flavour could be from Grammarly's new features or something.)
> "The key insight is..."
This was either written by Claude or someone who uses Claude too much.
I wish they could be upfront about it.
It's interesting... Different LLM models each have a few sentence structures that they vastly overprefer. GPT seems to love "It's not just X, it's Y", Claude loves "The key insight is..." and Gemini, for me, in every second response, uses the phrase "X is the smoking gun". I hear the smoking gun phrase around 5 times a day at this point.
Perhaps people have mimicked the style because LLMs have popularized it and clearly it serves some benefit to readers.
Perhaps LLMs have mimicked the style because authors have popularized it and clearly it serves some benefit to readers.
Life imitates art, even when that art is slop
> have popularized it
It's hated by everyone, why would people imitate it? You're inventing a rationale that either doesn't exist or would be stupider than the alternative. The obvious answer here is they just used an LLM.
> and clearly it serves some benefit to readers.
What?
"No Visual Studio installation. No GUI. No prayer. Just a script that does exactly what it says."
https://en.wikipedia.org/wiki/Wikipedia:Signs_of_AI_writing#...
https://en.wikipedia.org/wiki/Wikipedia:Signs_of_AI_writing#...
The shitty AI writing is so distracting I had to stop reading.
Have you noticed that your username is literally "its_notjack"?
I love the style it was written in. I felt a bit like I was reading a detective novel, exploring all the terrible things that happened and waiting for the plot twist and the hero coming in and saving the day.
you know why LLMs repeat those patterns so much? because that's how real humans speak
Real humans don't speak in LinkedIn Standard English
No, they do it because they're mode-collapsed, use similar training algorithms (or even distillation on each other's outputs) and have a feedback loop based on scraping the web polluted with the outputs of previous gen models. This makes annoying patterns come and go in waves. It's pretty likely that in the next generation of models the "it's not just X, it's Y" pattern will disappear entirely, but another will annoy everyone.
This is purely an artifact of training and has nothing to do with real human writing, which has much better variety.
Yes. It appears that way
I'm so fucking tired of this
I last developed for windows in the late 90s.
I came back around 2017*, expecting the same nice experience I had with VB3 to 6.
What a punch in the face it was...
I honestly cannot fathom anyone developing natively for Windows (or even OSX) in this day and age.
Anything will be a webapp, or Rust+egui multi-platform developed on Linux, or nothing. The amount of self-hate required for Android/iOS is already enough.
* not sure the exact date. It was right in the middle of the WPF crap being forced as "the new default".*
> Is this post AI-written?
What if it was?
What if it wasn't?
What if you never find out definitely?
Do you wonder that about all content?
If so, doesn't that get exhausting?
Yeah, it does. Congratulations, you figured out why the future is going to be fucking awful.
What's exhausting is getting through a ten-paragraph article and realising there were only two paragraphs of actual content, then having to wade back through it to figure out which parts came from the prompt and which parts were entirely made up by the automated sawdust injector.
I analyzed the text using Pangram, which is apparently reliable; it says "Fully human written" without ambiguity.[1]
I personally like the content and the style of the article. I never managed to accept going through the pain to install and use Visual Studio and all these absurd procedures they impose on their users.
[1] https://www.pangram.com/history/300b4af2-cd58-4767-aced-c4d2...
This honestly just tells me that Pangram is hot garbage
Fantastic work! It's a long-needed breath of fresh air for the Visual Studio DX. I wish msvcup had existed when they made us use Visual Studio back at uni.
Alternatively, there's this:
Install Visual Studio Build Tools into a container to support a consistent build system | Microsoft Learn
https://learn.microsoft.com/en-us/visualstudio/install/build...
I wish open source projects would support MingW, or at least not actively block its usage. It's a good compiler that provides excellent compatibility without the need for any extra runtime DLLs.
I don't understand how open source projects can insist on requiring a proprietary compiler.
There are some pretty useful abstractions and libraries that MinGW doesn't work with. Biggest example is the WIL[1], which Windows kernel programmers use and is a massive improvement in ergonomics and safety when writing native Windows platform code.
[1]: https://github.com/microsoft/wil
I fail to see why this would not work with gcc if it works with clang. The runtime?
if you want to link msvc built libraries (that are external/you dont have source), mingw may not be an option. for an example you can get steamworks sdk to build with mingw but it will crash at runtime
Agreed, I'd also like to see more support for MingW, especially from open source projects. It doesn't even get a passing mention in this blog post.
Eww no. MingW is evil and no project should ever use it.
Just use Clang + MSVC STL + WinSDK. Very simple.
From the capitalization I can tell you and the parent might not be aware it's "minimal GNU for Windows" which I would tend to pronounce "min g w" and capitalize as "MinGW." I used to say ming. Now it's my little friend. Say hello to my little friend, mang.
>MingW is evil
Care to elaborate?
Why not use winget to do it?
`winget install --id Microsoft.VisualStudio.2022.BuildTools`.
If you need the Windows(/App) SDK too for the WinRT-features, you can add `winget install --id Microsoft.WindowsSDK.10.0.18362` and/or `winget install --id Microsoft.WindowsAppRuntime.1.8`
Having been the person that used to support those packages, it’s not that simple. You need to pass what workloads you need installed too, and if it’s a project you’re not familiar with god help you.
I used to just install the desktop development one and then work through the build errors until I got it to work, was somewhat painful. (Yes, .vsconfig makes this easier but it still didn’t catch everything when last I was into Windows dev).
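If you do know the workloads you need, they can be passed through winget to the underlying installer; a sketch (the workload ID shown is the C++ desktop build tools one, check Microsoft's component ID list for others):

```shell
winget install --id Microsoft.VisualStudio.2022.BuildTools --override "--quiet --wait --add Microsoft.VisualStudio.Workload.VCTools --includeRecommended"
```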
You can do a lot of "native" windows development from modern C#/.NET via win32 interop.
Newer C# features like ref returns, structs, spans, et. al., make the overhead undetectable in many cases.
https://github.com/prasannavl/WinApi
https://github.com/microsoft/CsWin32
Exactly; the major pain point remains the .NET allergy from the Windows team, but it is workable.
Or... you can
"winget install Microsoft.VisualStudio.BuildTools"
"winget install Microsoft.WindowsSDK.10.0.26100"
I thought for a moment I was missing something here. I always just use winget for this sort of thing as well. It may kickoff a bunch of things, but it’s pretty low effort and reliable.
But those are installed system-wide. What if you have two different projects with different requirements at the same time?
Every language should have a tool like Python's uv.
> What if you have two different project with different requirements at the same time?
Install multiple versions of Windows SDK. They co-exist just fine; new versions don’t replace old ones. When I was an independent contractor, I had 4 versions of visual studio and 10 versions of windows SDK all installed at once, different projects used different ones.
Windows SDKs, WDKs (driver dev), Visual Studio releases, and .NET SDKs all coexist peacefully on a machine. If a project build breaks due to newer SDKs, it's because it was configured with "use newest version". (Which is usually fine but sometimes requires pinning if you're using more "niche" things like WDK)
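For example, pinning in a .vcxproj looks something like this (the version shown is just an example):

```xml
<!-- Pin to a specific installed SDK instead of "use newest" -->
<PropertyGroup>
  <WindowsTargetPlatformVersion>10.0.19041.0</WindowsTargetPlatformVersion>
</PropertyGroup>
```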
Nearly all of the Windows hate I see comes from 20-year-old takes. (The Bing/Cortana/Copilot/ads slop criticism is warranted, but is also easily disabled.)
it's not ideal, but much much much better!
For big C++ projects, the .vsconfig import/export way of handling Visual Studio components has worked well for the large teams I'm on. Tell someone to import a .vsconfig and the Visual Studio Installer does everything. Only times we've had issues is from forgetting to update it with components/SDK changes.
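For reference, a .vsconfig is plain JSON listing component IDs, roughly like this (the IDs shown are common C++ ones, not a recommendation for any particular project):

```json
{
  "version": "1.0",
  "components": [
    "Microsoft.VisualStudio.Component.VC.Tools.x86.x64",
    "Microsoft.VisualStudio.Component.Windows11SDK.22621",
    "Microsoft.VisualStudio.Component.VC.CMake.Project"
  ]
}
```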
Yeah, seems like this is just ignorance around .vsconfig files. Makes life way easier. You can also just use the VS Build Tools exe to install things instead of the full VS installer, if you plan to use a different IDE.
Can you use .vsconfig to tell Build Tools what your project needs?
Can you generate .vsconfig with Build Tools?
Nitpick, "Windows Native Development" also refers to the NT native subsystem, which would be basically coding against private APIs instead of Win32. From the title I thought that's what this was. Then I realized it was about avoiding full use of Visual Studio when building C projects (something that a lot of people already do by the way)
I would also read "Windows Native Development" as driver development or compiling directly with `nmake` (neither of which are described there).
What we did for our build agents was to just install the required version of build tools via Chocolatey. But cool approach!
Nowadays you can also use winget for it.
Same. Choco solves this with a one-liner for me.
It starts by not looking into Windows through UNIX developer glasses.
The only issue currently plaguing Windows development is the mess with WinUI and WinAppSDK since Project Reunion, however they are relatively easy to ignore.
>It starts by not looking into Windows through UNIX developer glasses.
People don't need any UNIX biases to just want multiple versions of MSVS to work the way Microsoft advertises. For example, with every new version of Visual Studio, Microsoft always says you can install it side-by-side with an older version.
But every time, the new version of VS has a bug in the install somewhere that changes something that breaks old projects. It doesn't break for everybody or for all projects but it's always a recurring bug report with new versions. VS2019 broke something in existing VS2017 installs. VS2022 broke something in VS2019. etc.
The "side-by-side-installs-is-supposed-to-work-but-sometimes-doesn't" tradition continues with the latest VS2026 breaking something in VS2022. E.g. https://github.com/dotnet/sdk/issues/51796
I once installed VS2019 side-by-side with VS2017 and when I used VS2017 to re-open a VS2017 WinForms project, it had red squiggly lines in the editor when viewing cs files and the build failed. I now just install different versions of MSVS in totally separate virtual machines to avoid problems.
I predict that a future version VS2030 will have install bugs that breaks VS2026. The underlying issue that causes side-by-side bugs to re-appear is that MSVS installs are integrated very deeply into Windows. Puts files in c:\windows\system32, etc. (And sometimes you also get the random breakage with mismatched MSVCRT???.DLL files) To avoid future bugs, Microsoft would have to re-architect how MSVS works -- or "containerize" it to isolate it more.
In contrast, gcc/clang can have more isolation without each version interfering with each other.
I'm not arguing this thread's msvcup.exe tool is necessary but I understand the motivations to make MSVS less fragile and more predictable.
Note that this also doesn't work on Linux - your system's package manager probably has no idea how to install and handle having multiple versions of packages and headers.
That's why docker build environments are a thing - even on Windows.
Build scripts are complex, and even though I'm pretty sure VS offers pretty good support for having multiple SDK versions at the same time (that I've used), it only takes a single script that wasn't written with versioning in mind, to break the whole build.
Yes. Any user interface toolkit that isn't at least 10 years old should be ignored on windows unless you want to rewrite everything one day.
Why? You may end up with something that doesn't get much attention anymore, but none of the official gui approaches have ever been removed as far as I know. Win32, MFC, winforms, wpf, winui, maui are all still available and apps using them are functional. Even winjs still works apparently, even if it was handed over.
I wouldn't start an app in most of them today, but I wouldn't rewrite one either without a good reason.
I wonder if people still use WinForms, MFC and WPF...
> It’s so vast that Microsoft distributes it with a sophisticated GUI installer where you navigate a maze of checkboxes, hunting for which “Workloads” or “Individual Components” contain the actual compiler. Select the wrong one and you might lose hours installing something you don’t need.
I have a vague memory of stumbling upon this hell when installing the ldc compiler for dlang [1].
1. https://wiki.dlang.org/Building_and_hacking_LDC_on_Windows_u...
> No Visual Studio installation. No GUI. No prayer. Just a script that does exactly what it says.
Yeah, it's obviously clanker writing. I don't even mind using LLMs for code, but this rubs me the wrong way.
> On Linux, the toolchain is usually just a package manager command away. On the other hand, “Visual Studio” is thousands of components.
That package manager command, at the very least, pulls in 50+ packages of headers, compilers, and their dependencies from tens of independent projects, nearly each of them following its own release schedule. Linux distributions have it much harder orchestrating all of this, and yet it's Microsoft that cannot get its wholly-owned thing together.
Actually not that complicated: You simply check in a global.json [0] where you specify the sdk and workload versions.
Then you also specify target platform sdk versions in the .csproj file and VS will automatically prompt the developer to install the correct toolchain.
[0] https://learn.microsoft.com/en-us/dotnet/core/tools/global-j...
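For reference, a global.json pinning the SDK looks something like this (version number is just an example):

```json
{
  "sdk": {
    "version": "8.0.401",
    "rollForward": "latestPatch"
  }
}
```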
global.json is only for .NET toolchains.
What you’re actually wanting here is .vsconfig https://learn.microsoft.com/en-us/visualstudio/install/impor...
This is amazing.
At $workplace, we have a script that extracts a toolchain from a GitHub actions windows runner, packages it up, stuffs it into git LFS, which is then pulled by bazel as C++ toolchain.
This is the more scalable way, and I assume it could still somewhat easily be integrated into a bazel build.
Keeping CI entirely out of windows desktop development is the biggest efficiency and cost improvement I've seen in the last 15 years. Our CI toolchain broke so we moved back to a release manager doing it manually. It takes him 20x less time to build it and distribute it (scripted) than it does to maintain the CI pipeline and his desktop machine is several times faster than any cloud CI node we can get hold of.
Edit: Uses a shit load less actual energy than full-building a product thousands of times that never gets run.
This is really interesting. Do you think it’s possible server-deployed software could also get similar efficiencies with adequate testing?
One day I decided to port my text editor to Windows. Since it depends on pcre2 and treesitter, these two libraries had to be provided by the system.
In the span of ~2 hrs I didn't manage to find a way to get the Zig compiler to notice "system" libraries to link against.
Perhaps I'm too spoiled by installing a system wide dependency in a single command. Or Windows took a wrong turn a couple of decades ago and is very hostile to both developers and regular users.
I think providing purely-functional libraries as system dependencies that's tied to the whole tool chain at the time was the wrong decision by the Unix world.
The system libraries should only ship system stuff: interaction with the OS (I/O, graphics basics, process management), accessing network (DNS, IP and TLS). They should have stable APIs and ABIs.
Windows isn't hostile. It has a different paradigm, and Unix (or more correctly, usually GNU/Linux) people do not want to give up their worldview.
PCRE is basically only your app's dependency. It has nothing to do with the rest of the operating system, so it is your responsibility to know how to build and package it.
If you depend on a library and can't figure out how you would compile against it, it's probably better for the end user that you don't make anything because you'll still need to package it up later unless you link statically.
I suspect the pitfall is how you or the zig compiler is linking. Unless you're involving things which vary by OS like hardware interaction, networking, file systems etc, you should not, with a new Lang in 2026, need to do anything special for cross-platform capabilities.
My understanding is that the "linkSystemLibrary" abstraction in build.zig only holds for Unix systems. And this in turn makes it impossible to build my program on Windows without modifying the build script.
System wide dependencies is fundamentally an awful idea that is wrong and you should never ever do it.
All dependencies should be vendored into your project.
This is the answer. I don't know what the best practice is, but for Windows the easiest solution is to put the DLL in the same directory as the exe.
I am not too into windows dev but I am currently using msvc at work. We are told to import a config file into the installer and it automatically selects all of the components any of our projects will need. Wouldn't that solve the problem too? Just distribute a project level config file and add documentation for how to import and install the stuff.
* Is this allowed per VS' ToS?
* I wonder if Microsoft intentionally doesn't provide this first party to force everyone to install VS, especially the professional/enterprise versions. One could imagine that we'd have a vsproject.toml file similar to pyproject.toml that just does everything when combined with a minimal command line tool. But that doesn't exist for some reason.
Microsoft doesn't seem to care unless you're a company. That's the reason community edition is free. Individual licenses would be pennies to them, and they gain more than that by having a new person making things in their ecosystem. It's in their interest to make their platform accessible as possible.
Visual Studio does have that functionality, via vsconfig files: https://learn.microsoft.com/en-us/visualstudio/install/impor...
Doesn't look like it's versioned, or installs Visual Studio itself.
I hope it would work with wine. Then cross compiling Win64 binaries from Linux would become convenient without requiring spinning up a Windows VM.
Yeah I noticed wine wasn't able to execute the MSI files. It also had a problem with the lock files. Both problems should be fixable though.
Just use Clang. Cross-compiling Linux->Windows is super duper easy.
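As a sketch, assuming the mingw-w64 headers and libraries are installed, cross-compiling from Linux can be a one-liner (the target triple may differ per setup):

```shell
# Produce a Windows PE executable on a Linux host via clang's cross-compilation.
clang --target=x86_64-w64-mingw32 hello.c -o hello.exe
# Smoke-test it without a Windows machine:
wine ./hello.exe
```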
Came in getting ready to hate on this article, but was actually pleasantly surprised, this is great. Good work OP.
I am not exactly bounding with eagerness and time to contribute to an open source project but the few times I have looked I stop at the "how do I configure my dev environment to match" step.
Just give me a VM. Then you will know, and I will know, every facet of the environment the work was done in.
The ironic part is that Visual Studio may be the best product Microsoft has ever made. Compared to the rest of their offerings, it is nothing short of amazing. It boggles the mind to know that this was developed in-house - well most of it anyways.
WOW, such great work. I myself have been struggling with MinGW just to compile from source. Of course it works much more cleanly than the hated Visual Studio, but then when it comes to CUDA compilation, that's it: it's Visual Studio, like for the majority out there. It is invasive and full of bloatware, like you say. Same struggle with Electron.
How do you pair it with CUDA to compile the repos from source?
We manage Visual Studio on our CI machines using Ansible. Chocolatey installs the full Visual Studio and then we use the APIs provided to manage components via Ansible. See our action here: https://galaxy.ansible.com/ui/repo/published/kitware/visuals...
And here I was messing with MingW64…
This is fantastic and someone at Microslop should take notes.
Exactly... I avoid Visual Studio; I try to build everything using MinGW.
Clang is the better alternative to MinGW because it can use standard Windows libraries and avoids the need for additional runtime.
Please also add support for clang-cl[1][2].
[1] https://clang.llvm.org/docs/MSVCCompatibility.html
[2] https://clang.llvm.org/docs/UsersManual.html#clang-cl
I don't get why people go through all these flaming hoops and hurdles to deal with MSVC when MinGW and MinGW-w64/MSYS2 are options. In the latter case you even still get (mostly complete) MSVC ABI-compatibility if you compile with clang.
MinGW and MinGW-w64/MSYS2 are just as inscrutable, fragile and new-user-hostile. The fact that you have to choose between MinGW (which has a 64-bit version) or MinGW-w64 (completely separate codebases maintained by different people as far as I can tell) is just the first in a long obstacle course of decisions, traps, and unexplained acronyms/product names. There are dozens of different versions, pre-built toolchains and packages to throw you off-course if you choose the wrong one.
If you're just a guy trying to compile a C application on Windows, and you end up on the mingw-w64 downloads page, it's not exactly smooth sailing: https://www.mingw-w64.org/downloads/
Because it's fewer hoops and hurdles than using MinGW, in my experience.
MinGW/MSYS2 are flaming poop hurdles. That's bending over backwards to fake a hacky-ass, bad dev environment. Projects that only support MinGW on Windows are projecting "don't take Windows seriously".
Supporting Windows without MinGW garbage is really really easy. Only supporting MinGW is saying “I don’t take this platform seriously so you should probably just ignore this project”.
As someone who has been doing Win32 development for literally decades, I'm not particularly convinced this is a problem that needs more code to solve. You don't need VS to get the compiler (which is available as a separate download called something like "build tools", I believe); and merely unpacking the download and setting a few environment variables is enough to get it working. It's easy to create a portable package of it.
Last I checked the license for the headless toolchain requires that a full licensed copy of Visual Studio be installed somewhere. So I think this violates the license terms.
A bug got opened against the rustup installing the headless toolchain by itself at some point. I'll see if I can find it
edit: VSCode bug states this more clearly https://github.com/microsoft/vscode/issues/95745
I was just setting up a new machine and was setting up the Rust environment. The very first thing rustup-init asked was to install Visual Studio before proceeding. It was like 20-30gb of stuff installed before moving forward.
This tool would be a great help if I knew beforehand.
I don't really use Windows OS much, but why not just use MinGW? Then you have Clang on all platforms you can think of: Android, all the various Darwin flavors and of course Linux and Windows; as well as on platforms you can't think of like FreeBSD or even Haiku maybe? Like honestly what's the point of supporting MSVC at all?? Maybe I'm just not enough of a Windows nerd to understand? (so I'm basically wondering if mingw has any drawbacks)
If you have a self-contained project, where you don't depend on anyone else and others don't depend on you, MinGW works great. Problems arise when you have dependencies that don't work with it. I'd love to see if MinGW could find a way to be binary compatible with MSVC-compiled binaries. Right now it's kind of an all or nothing solution which makes it hard to adopt.
Ah, binary-only dependencies, right… That's very specific though, so unless you need to drive some proprietary hardware, why bother using stuff that forces you into MSVC hell lol? Also wouldn't LLVM based MinGW benefit from Clang's MSVC compat? Not sure about this at all, that's why I'm asking, heh… ^^"
> You spend less time on your project because you’re too busy being a human-powered dependency resolver for a 50GB IDE.
Really? A 50GB IDE? How the heck does one know what goes in there?
My beloved FreeBSD 15.0 PLUS its Linux VM PLUS its docker env PLUS its dependencies and IDE are close to 26GB, and I'm pretty sure I'm counting a lot of things I shouldn't, so the actual total is much less than that.
Developing software under a Windows platform is something that I cannot understand, since many many many years ago.
I’ve found that just installing LLVM, CMake and Ninja is enough to get started developing on Windows for most things C/C++.
I'll just keep using Mārtiņš Možeiko's script, portable-msvc.py, that this tool is based upon. It does everything this does, except a lock file and the autoenv. I'm not particularly interested in the former, and definitely not the latter.
https://gist.github.com/mmozeiko/7f3162ec2988e81e56d5c4e22cd...
Perhaps winget is enough?
winget install Microsoft.VisualStudio.2022.BuildTools
The Build Tools installer first installs the Visual Studio tool to select the workloads you want as well.
Let's say I want to compile a helloworld.cpp with no build tools installed yet.
What is the minimal winget command to get everything installed, ready for : cl main.cpp ?
Ps: I mean a winget command which does not ask anything, neither in command line, nor GUI ? Totally unattenfed.
“Build Requirements: Install Visual Studio”.
You’ve never experienced genuine pain in your life. Have you tried to change the GCC compiler version in Linux?
?
If it’s not packaged and you’ve got to build it yourself, Godspeed. And if you’ve got to change libc versions…
GCC is surprisingly simple to build, fortunately.
Wondering....
Has anyone tried doing this on ReactOS? I know this is a touch DIY, but it would be interesting to know if Win software could be built on ReactOS...
I will never cease to be amused by these 'Unixhead has to do windev. Reinvents the wheel' blog posts.
As someone who is out of the loop on Windows development, is this related to the Windows Driver Kit (WDK, I think it used to be DDK)? That's a certain type of hell I don't wish upon most.
You can also install visual studio build tools via the built in winget package manager.
Were you around before the new installer came out? It was light speed compared to what was before!
No one should use any of these weird Frankenstein monstrosities in 2026. And a batch script? :( PowerShell exists.
Install:
Load the 'VSDevShell' DLL[2] for PowerShell, and you're good to go, with three different toolchains now:
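A sketch of what loading the DevShell looks like (the install path is an assumption; adjust for your edition):

```powershell
# Enter a Developer PowerShell so cl, link, nmake etc. are on PATH.
$vs = "C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools"
Import-Module "$vs\Common7\Tools\Microsoft.VisualStudio.DevShell.dll"
Enter-VsDevShell -VsInstallPath $vs -DevCmdArguments "-arch=x64"
```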
Done and dusted. Load these into a CMake toolchain and never look at them again.
People really like overcomplicating their lives.
At the same time, learn the drawbacks of all toolchains and use what is appropriate for your needs. If you want to write Windows drivers, then forget about anything non-MSVC (unless you really want to do things the hard way for the hell of it). link.exe is slow as molasses, but can do incremental linking natively. cl.exe's code gen is (sometimes) slightly worse than Clang's. The MinGW ABI does not understand things like SAL annotations[3], and this breaks very useful libraries like WIL[4] (or libraries built on top of them, like the Azure C++ SDK[5]). The MinGW headers sometimes straight up miss newer features that the Windows SDK comes with, like cfapi.h[6].
[1]: https://github.com/mstorsjo/llvm-mingw
[2]: https://learn.microsoft.com/en-gb/visualstudio/ide/reference...
[3]: https://learn.microsoft.com/en-gb/cpp/c-runtime-library/sal-...
[4]: https://github.com/microsoft/wil
[5]: https://github.com/Azure/azure-sdk-for-cpp
[6]: https://learn.microsoft.com/en-gb/windows/win32/cfapi/build-...
LLVM-MinGW sounds external to Microsoft though. I think the blog focused on in-Microsoft solutions. And I am not sure the "contrary to the blog content" is valid - compared to Linux, the Microsoft stack is much more annoying to install. I installed it, but it was annoying to no ends and took ages.
Good to know LLVM works on windows too though.
> compared to Linux, the Microsoft stack is much more annoying to install.
Not really. It's just different. As a cross-platform dev, all desktop OSs have their own idiosyncrasies that add up to a net of 'they are all equally rather bad'.
CMD.EXE is fine. I'd rather use bash than the abomination that is PowerShell.
MinGW is the most monstrous of monstrosity. Never in a million years touch that garbage.
I dunno, it has its uses when porting software written for UNIX-first. Plus, I pointed out Clang, rather than GCC, because Clang is natively a cross-compiler. I don't like to be dogmatic about stuff; if it's useful then it's useful. If it isn't then I will say why (as I explained why there's no need for MSYS2/Cygwin below).
“Don’t do it”
I fixed windows native development. Band together friends, force WSL3 as the backbone of Windows.
next, wrap it with wine and eventually share a bottle/winetrick
it's been 14 years since i've used msvc for anything real. iirc the philosophy back then was yearly versioned releases with rolling intermediate updates.
this seems to go down the road towards attempts at deterministic-ish builds, which i think is probably a bad idea since the whole ecosystem is built on rolling updates, and a partial move towards pinning dependencies (using bespoke tools) could get complicated.
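For what it's worth, the "pinning" idea the comment alludes to can be sketched in a few lines. This is a hypothetical illustration (the lockfile name and layout are my own invention, not any bespoke tool's format): record a hash of the toolchain once, then fail the build if the installed toolchain has drifted.

```python
import hashlib
import json
import pathlib


def sha256_of(path):
    """Hash a file's bytes so any toolchain drift is detectable."""
    return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()


def write_pin(lockfile, tool_path):
    """Record the current toolchain hash in a tiny JSON lockfile."""
    pin = {"sha256": sha256_of(tool_path)}
    pathlib.Path(lockfile).write_text(json.dumps(pin))


def check_pin(lockfile, tool_path):
    """Raise if the toolchain no longer matches the pinned hash."""
    pinned = json.loads(pathlib.Path(lockfile).read_text())["sha256"]
    actual = sha256_of(tool_path)
    if actual != pinned:
        raise RuntimeError(f"toolchain drifted: {actual} != {pinned}")
```

The complication the comment predicts shows up exactly here: with rolling updates, the pinned hash goes stale on every toolchain update, so someone has to own re-pinning.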
this seems overly dramatic... I just set up a Windows 11 box and installed the needed tools quite quickly via winget, and I was up and running
I don't understand; just use the SciTE editor with TCC. About a couple of megs to download, no install required, and your apps will run on everything from Win 98 to Linux with Wine. And if the answer is C++ support, then you get all the pain you deserve using that cursed language.
> On Linux, the toolchain is usually just a package manager command away.
If you are compiling for your native system, yes.
But as soon as you try cross-compiling, you are in for a lot of pain.
mmozeiko "fixed" windows native development, just use their script. Also PortableBuildTools already exists https://github.com/Data-Oriented-House/PortableBuildTools
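For the curious, a rough sketch of what such portable-toolchain scripts do. Everything below is an illustrative assumption rather than the actual script's code (the manifest's internal layout and package/payload field names are simplified): fetch Microsoft's Visual Studio channel manifest, pick the newest compiler and SDK packages, and extract their payloads into a plain versioned directory, with no installer and no elevation.

```python
import json
import pathlib
import urllib.request
import zipfile

# Well-known redirect to the VS 2022 release channel manifest.
CHANNEL_MANIFEST = "https://aka.ms/vs/17/release/channel"


def newest(versions):
    """Pick the highest dotted version string: '14.40' beats '14.9'."""
    return max(versions, key=lambda v: [int(p) for p in v.split(".")])


def fetch_channel():
    """Fetch and parse the channel manifest (network; not run here)."""
    with urllib.request.urlopen(CHANNEL_MANIFEST) as resp:
        return json.load(resp)


def download_payloads(packages, dest):
    """Download each package's zip payloads and extract them into dest.
    Illustrative only: real manifests nest packages/payloads differently,
    and some payloads are .msi/.vsix rather than plain zips."""
    dest = pathlib.Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    for pkg in packages:
        for payload in pkg.get("payloads", []):
            local, _ = urllib.request.urlretrieve(payload["url"])
            with zipfile.ZipFile(local) as z:
                z.extractall(dest / pkg["id"])
```

A real script would walk the parsed channel manifest to locate the VC tools and Windows SDK packages, pick versions with something like `newest(...)`, and then call `download_payloads(...)`.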
I thought the title was clickbait, but no, he really did fix it! Nice
If you are looking to rapidly build windows native apps just use Delphi. Superlative tool for this. Been using since ‘95
So this fixes the problem when msvc is the required compiler. Does the zig C++ compiler bring anything to the table when clang is an option?
You still need headers and libraries that ship with MSVC.
>The key insight
are we doomed to only read AI slop from now on? to get a couple paragraphs in and suddenly be hit with the realization that it is AI?
it's all so tiresome
I literally came to post the exact same line as my indicator that this was AI-generated. I ctrl-f'd first and sure enough I'm not alone in using 'key insight' as the canary.
c3 does this automatically, I implemented the same thing :)
https://github.com/c3lang/c3c/pull/2854
Thank you, this might be a great way to improve the developer experience in the conda/conda-forge ecosystem.
Another option is to explore winget and Chocolatey. Most build tools and compilers can be installed via the command line on Windows. Ask your favorite LLM to create a PowerShell script to install them all.
I like the tool, I like the article, but I'd prefer it if it were half as long and without the AI touch.
> Hours-long waits: You spend an afternoon watching a progress bar download 15GB just to get a 50MB compiler.
What year is it?! Also, haven't heard any complaints regarding VS on macOS, how ironic...
Nix on Windows when...
Since roughly September 2022 with the release of WSL 0.67.6!
Have you actually attempted to use it recently? Are you familiar with the WSL1 bugs that surface when running random Linux distros?
(To be clear, I haven't tried this with Nix, but I have with other distros.)
2 replies →
Let me paraphrase: nix FOR windows
Please people, stop trying to fix windows and just let it die.
Why not just use Linux?
Because some developers would like to make money at some point.
Then why would they make applications for a dying platform? Is there some budding market for native win32 apps that I'm not aware of?
To play devil's advocate, Linux does pose some issues as far as a stable platform base. They don't even guarantee glibc compatibility, afaik.
To me it seems as if Microsoft deliberately wants to make things harder for software developers. Now - I installed all the required things and compiled on Windows too, but it is very annoying compared to Linux. Microsoft should simply have ONE default build, e.g. "download this and 80% of developers will be happy". No need for a gazillion checkboxes.
rm -fr /
I'm not trying to diminish or take away from this post but Visual Studio is an IDE and is not necessary to build an App.
You just need the required build tools.
If you've ever had to setup a CI/CD pipeline for a Visual Studio project then you've had to do this.
just use w64devkit, it's nice
> msvcup is inspired by a small Python script written by Mārtiņš Možeiko.
This script is great. Just use it. The title saying “I fixed” is moderately offensive glory stealing.
Windows Native is fine. People in that space are comfortable with it.
What needs to be fixed is the valley between unix and windows development for cross-os/many-compiler builds, so one that does both can work seamlessly.
It's not an easy problem and there are lots of faux solutions that seem to fix it all but don't (in builds, the devil is in edge cases).
I seriously doubt that people who get confused by the MSVC++ Installer will be able to handle a CLI app that installs a mystery MSVC++ toolchain version to a versioned directory. They're still going to click the Visual Studio icon on their desktop and scratch their head why your script didn't magically fix their problems.
Say what you want about coding agents, when the cost of writing code goes to near-zero, the cost of wrangling tools becomes a much bigger fraction of development effort. This is an amazing opportunity to address long-standing frictions.
I haven't run into this problem yet... but my oldest .NET software is only 1 year old... Is this something that happens over the course of a few years?
This is about native development (C++), not .NET.
Thank you
This is a serious quality of life improvement for people forced to deal with Windows! Great job.
> I fixed
> msvcup is inspired by a small Python script written by Mārtiņš Možeiko.
No. Martins fixed. OP made a worse layer on top of Martins great script.
[dead]
[dead]
[dead]
[dead]
Trollish usernames aren't allowed on HN, so we've banned this account*. If you want to pick a different username that isn't trollish, we can rename the account and unban it. It would be best to email hn@ycombinator.com for this, to make sure we get the message.
(It would have been better for us to catch this sooner, but in this case someone had to explain the name to me. Out of respect for HN's many Francophone readers, I think it's best to apply the rule.)
* https://hn.algolia.com/?sort=byDate&dateRange=all&type=comme...
Gross ignorance and incompetence.
TLDR: I don't understand my native command line, see how lost I got when I tried to do my thing in a different environment.
- Not a unique problem to Windows or even MSVC; he's gonna hate Xcode
- Making Python a bootstrap dependency = fail
- Lacks self-awareness to recognize aversion vs avoidance
My background is distinctly non-Windows, but I survive around Windows so well that people think I'm a Mickeysoft type. And no, I don't use mingw, cygwin, ...
If any of the obstacles this user faced were legitimate, nobody would ever make any money on Windows, including and especially Microsoft - a company whose developers have the same challenges.
I'm being harsh because _mea quondam culpa_ and it's correctable.
Everything this user went thru is the result of aversion instead of avoidance.
To _avoid_ long deep dives into Windows, you need to recognize there is a different vocabulary and a radically different jargon dialect at play.
1. Learn a tiny minimum of PowerShell; it's based on the same POSIX spec as bash and zsh, but like Python, JavaScript, etc., instead of the byte as the fundamental unit, it uses objects. So there's less to learn to reach a greater level of convenience than soiling yourself with DOS/CMD/BAT. On Windows, pwsh has a default set of Linux-like aliases to minimize the learning required for minimal operability. And you never have to type \ instead of / as a directory separator.
2. Microsoft makes money from training. To sell their meat-free steak (* ingredient: sawdust), they feed the suits an all-you-can-eat, calorie-, nutrition-, and protein-free buffet of documenting everything in great detail and routinely "streamlining" the names and terminology.
Development on Windows is in a different reference frame, but relative to their own reference frames, they're ultimately not all that different.
Approach in your "foreign language" mindset; English alphabet but the words mean different things.
3. What not how. "How do I grep" means you are trying to random access bytes out of a random access character stream. "What's the command to search for text in files?" well, if you're bloody mindedly using cmd, then it's "find".
4. Seriously, learn a little Powershell.
I only approached Powershell hoping to gain material for a #SatansSphincter anti-ms rant while using it as a Rosetta Stone for porting shell scripts in our CI for Windows.
I mean, it is based on the same POSIX spec as sh, bash, and zsh, with a little Perl thrown in. That can't not go horribly, insidiously, 30-rock wrong in the hands of MS, right?
Turned out, it's the same paradigm shift perl/shell users have to make when coming into Python:
from `system("ps | grep hung")` to `"hung" in system("ps")`; from `system("ifconfig -a | sed 's/\<192\.168\.0\./10.0.0./g'")` to `system("ifconfig -a").replace("192.168.0.", "10.0.0.")`
`grep` is a command that applies an assumption to a byte stream, often the output of a command.
In powershell, executing a command is an expression. In the case of a simple command, like "ps", that expression resolves to a String, just like system(...) does in Python.
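The shift the comment describes can be sketched in plain Python as an analogy (not PowerShell itself; the command names assume a Unix-like box, and `run` is a hypothetical helper): capture a command's output as a value, then filter or transform it in the language instead of piping bytes through grep/sed.

```python
import subprocess


def run(*argv):
    """Run a command without a shell and return its stdout as text."""
    result = subprocess.run(argv, capture_output=True, text=True, check=True)
    return result.stdout


# shell:  ps | grep hung
hung = [line for line in run("ps").splitlines() if "hung" in line]

# shell:  ifconfig -a | sed 's/192\.168\.0\./10.0.0./g'
# here:   run("ifconfig", "-a").replace("192.168.0.", "10.0.0.")
```

The pipe-and-regex step becomes an ordinary expression over a value, which is the same mental move PowerShell asks for with objects in its pipeline.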
Learning even a small amount of PowerShell is immensely helpful in better understanding your enemy if you're going to have to deal with Windows. The formal names for official things use the `Verb-SingularNoun` convention (e.g. `Get-Process`).
That last part of the convention is the magic: the naming of things on Windows is madness designed to sell certifications, so crazy even MS ultimately had to provide themselves a guide.
Is this even legal?
I'm just asking, but is there really a need for native programs anymore? Where I worked a decade ago, we started porting all our native programs over to the browser, and this was when MVC beta was just being released. At this point, with Electron and Tauri, is there even a need to write a native program?
Now with AI, I would think that porting a native program to the browser wouldn't be the chore it once was.
Yes, very definitely. There has always been a need for high performance native applications. Even at the beginning of the desktop computing revolution, these questions were asked... and yes, there is a balance between native and cloud/browser-based computing - some of it is personal, much of it is industrial and corporate, and a further part of the spectrum where both methods are applicable still exists, even decades later.
> is there really a need for a native programs anymore
As long as you don't give a shit about the fact that your baseline memory consumption is now 500MB instead of 25MB, and that 80% of your CPU time is wasted on running javascript through a JIT and rendering HTML instead of doing logic, no.
If you don't give a shit about your users or their time, there's indeed no longer a need to write native programs.
what if caring about users means giving them features instead of fighting with obsolete, unproductive native GUI frameworks?
funny how Electron apps tend to have many more users than their native "performant" counterparts, isn't it?
Where do you think Linux gamers get their Proton powered games from?
I use COM and DLLs to extend software/automate. Using Visual Studio gives me some really nice debugging options.
I did try using python and js but the variable explorer is garbage due to 'late binding'.
I thought this was just my ignorance, but I've asked experts, AI, and google searched and they unfortunately agree. That said, some people have created their own log/prints so they don't need to deal with it.
I just avoid Windows and Windows development. If I get paid to do it I don't mind the shittyness.
At the risk of being that guy, I haven't had any issues onboarding people onto native projects written in Rust. rustup does a great job of fetching the required toolchains without issue. I'd imagine the same is also true of Go or Zig.
While Microsoft <3 Rust, there is still some tooling parity to reach versus Visual Studio's abilities for .NET, Python and C++:
incremental compilation and linking, parallel builds, hot code reloading, a REPL, graphical debugging of optimised builds, GPU debugging...
Go is better left for devops stuff like Docker and Kubernetes, and Zig remains to be seen when it becomes industry relevant beyond HN and Reddit forums.
I'm pretty sure people who write and build C++ on Windows do it for good reasons, often reasons that are out of their control. Your comment is not going to make any difference.
Before rustup can run, the very first message rustup-init spits out is asking to install the visual studio tool chain.
You have to do this for certain rust things too. I can't remember which, but I inevitably run into a need to install the MSVC toolchain to compile rust. I think it might be related to FFI, or libs which use FFI? The same thing comes up in Linux, but the process to install it is different.
I got anxiety reading the article, describing exactly why it sucks. It's nice to know from the article and comments here there are ways around it, but the way I have been doing it was the "hope I check the right checkboxes and wait a few hours" plan. There is usually one "super checkbox" that will do the right things.
I have to do this once per OS [re]install generally.
It makes use of MSVC linking infrastructure, and import libraries.
You can't really use Rust in the real world without interfacing a lot of C/C++ libraries, so yes this is still relevant.