Ask HN: What less-popular systems programming language are you using?
1 month ago
Less popular or less commonly used ones.
By that I mean not including the usual suspects, such as C, C++, Rust, and Go (I know about the controversy over whether the last one counts as a systems programming language or not).
I'm asking this because I used C for both application programming and systems programming, early in my career, before I moved to using other languages such as Java and Python.
And of late, I've been wanting to get back to doing some systems programming, but preferably in a more modern language (than C) which is meant for that.
I've pretty much settled on Zig at this point, if only for how dead-simple it is to cross-compile for other hardware platforms. The process of compiling working code for oddball platforms (in my case the Nintendo 64) was way easier than I expected it to be.
The only downside is the stdlib being as fast-moving of a target as it is. Right now I've had to put a pin on getting panic stack traces to work on my N64 code because apparently the upcoming release changes a bunch of stuff around panic/stacktrace handling (and it's already changed quite a bit over the years even before these new changes).
> how dead-simple it is to cross-compile for other hardware platforms
The fact that zig can compile C code makes it useful for other languages too. I recently started using `zig cc` to cross-compile Nim for lots of different platforms within the same environment.
It takes no time to set up and, honestly, works like magic.
> The fact that zig can compile C code makes it useful for other languages too
Agree, C interop is IMHO the big feature of Zig. There are plenty of systems programming languages in 2025, but where Zig shines is its pragmatism: a single standalone binary containing compiler, libc, build system, code formatter and test runner for C and Zig.
As of late though, I've been concerned by some "holy wars"/"ideological postures" that the dev team has started, which IMHO depart from the original "let's be pragmatic" mantra.
- There's a bunch of places where the stdlib just crashes on unreachable assertions, and that won't be fixed "because the kernel should have better error reporting".
- There are a bunch of kernel syscalls which are just not possible to call "because C enums should not allow aliases"
- etc
I hope this trend fades away and the project gets back to a more pragmatic stance on these issues; nobody wants a systems programming language that plays programming police.
Otherwise, C3 looks promising as well (though not as nice as Zig IMHO), but currently it's a bit too barebones for my taste: there's no stable LSP, no nvim plugin, etc.
I like Perl mostly because it's poetic (the code is super nice to read, with variable types standing out thanks to sigils), but another core strength is how very fast and light it is.
Instead of "cross-compiling" or just running a native perl interpreter (there's one for about every platform!), I prefer how Actually Portable Executables make Perl multiplatform with just 1 binary asset running everywhere!
I wanted to write a webserver processing CGI to learn more about the "old school web", so I wrote https://github.com/csdvrx/PerlPleBean and the simplicity of just downloading and running the .com on anything is very nice
I'm now trying to do the same in Python 3, but it's not as fun, and I haven't yet gotten to the part where I'll try to safely run Python code within the Python webserver, whether through restrictedpython or ast.parse(), ast.walk(), eval(compile()) ...
40 replies →
> The only downside is the stdlib being as fast-moving of a target as it is.
Ah, that's an interesting take; my opinion is that the stdlib doesn't move fast enough.
In its current state it's pretty broken, most of the "process", "os" and "posix" modules are either straight up raising unreachable in normal scenarios, or simply badly designed. I would like the stdlib to be much more fast moving and fix all these issues, but I had the impression most work on it is frozen until 0.15 or 0.16, after incremental compilation is done.
You are right, the stdlib is not the highest priority right now. There are major improvements coming in 0.14 though. The new default allocator for example. I think the problem you describe can be solved by having more contributors focussing on the standard library. With the compiler, there are bottlenecks which make onboarding new people hard. This is a smaller problem in stdlib.
1 reply →
I'm picking up zig as my first system programming language myself and I love it.
Sadly the job market looks dead
It's such a new language, not even at 1.0.0 yet. You won't really find companies willing to bet their livelihoods on it at such an early stage.
You can make your own though :)
What N64 code are you working on? I am intrigued.
Current progress is here: https://fsl.yellowapple.us/zig64/dir?ci=trunk
Right now it's just a bunch of WIP Zig interfaces for the N64's hardware, but the end-goal is to get it developed enough for homebrew gamedev.
1 reply →
What’re you doing with Zig and N64? Sounds awesome.
Eventually, I hope to use Zig for N64 homebrew.
To get there, though, I need to implement Zig-ish interfaces to the N64's hardware, which is slowly-but-surely happening at https://fsl.yellowapple.us/zig64/dir?ci=trunk
I've been using Odin [1] for my hobby game development and I've been liking it a lot. Feels like a more ergonomic C.
Things I like:
- Vendor libraries like Raylib and MicroUI make it easy to get started
- I can pass around memory allocators and loggers implicitly using context, or explicitly if I need to.
- natively supports vector math and swizzling
- error handling with `or_else` and `or_return`
Things I don't like:
- Namespacing is a bit annoying. The convention is to prefix procedure names, but I don't like how they look. It really isn't a big issue, though.
Have a quick read of the overview and, if you are still interested, I highly recommend the book 'Understanding the Odin Programming Language' by Karl Zylinski [2]
[1] https://odin-lang.org/docs/overview/
[2] https://odinbook.com/
I like Odin, but the creator is not too motivational (or rather, actively un-motivational)[1]. I still use it nonetheless for some of my own stuff, for now.
Regardless, I do recommend people to try it out. I use Linux and OpenBSD, too, despite Linus and Theo. :)
[1] The reason for why I think this can be found in their pull requests, but it's been some time I think.
What do you mean by "motivational?" Are you talking about how the creator is against adding new features to the language? I actually think that's perfectly fine. One of my favorite things about Odin is the simplicity; the entire language and all of its rules can be understood by reading the Odin overview document. I'm actually thrilled to have a creator that doesn't want to bloat the language.
6 replies →
I remember being turned off from it for the same reasons; in particular there were some fairly harsh comments directed at V's developer, which felt a bit dog-piley to me. The drama's long dead now. It is a great language though; the closest so far, for me, to the language that would kill C, but not quite =D
1 reply →
OCaml
The compiler is very fast, even over large codebases.
Mostly trying to bring AWS tooling to the platform[1], or experimenting with cross-compilation[2] using another less well known systems language, zig.
[1] https://github.com/chris-armstrong/smaws/ [2] https://github.com/chris-armstrong/opam-cross-lambda
I've used a lot of programming languages and the kind of groove you can get into with OCaml is hard to match. You can just dive into an enormous, unfamiliar codebase and make changes to it with so much more confidence. But while it's reasonably fast, it's also higher level than Rust so you don't have to struggle quite so much with forms like `Arc<Mutex<HashMap<String, Box<dyn Processor + Send + Sync>>>>` everywhere.
Re: AWS tooling, have you seen https://github.com/solvuu/awsm ?
It generates code for all 300+ AWS services and produces both Async and Lwt forms. Should be fairly extensible to Eio.
I worked on this. Let me know if you want to tag team.
I want to like OCaml but OPAM is just so bad... and tooling is super important (it's one of the reasons Go is popular at all). Windows support is also an afterthought. There's no native debugger as far as I can tell. This is before you even get to the language itself, which definitely has its own big flaws (e.g. the lack of native 64-bit integers that MrMacCall mentioned).
The syntax is also not very friendly IMO. It's a shame because it has a lot of great ideas and a nice type system without getting all monad in your face. I think with better tooling and friendlier syntax it could have been a lot more popular. Too late for that though; it's going to stay consigned to Jane Street and maybe some compilers. Everyone else will use Rust and deal with the much worse compile time.
> The syntax is also not very friendly IMO.
Very true. There's an alternate syntax for OCaml called "ReasonML" that looks much more, uh, reasonable: https://reasonml.github.io/
1 reply →
> (e.g. the lack of native 64-bit integers that MrMacCall mentioned)
They exist; I think you just mean that `int` is 63-bit and you need the specialized `Int64.t` operators for full precision.
17 replies →
Why is opam bad? Compared to what? Could you elaborate?
3 replies →
>The syntax is also not very friendly IMO.
Why do you think that the syntax is not very friendly?
Not saying you are wrong, just interested to know.
Have you tried esy?
I've read some part of the book Real World OCaml, by Yaron Minsky and Anil Madhavapeddy.
https://dev.realworldocaml.org/
I also saw this book OCaml from the Very Beginning by John Whitington.
https://ocaml-book.com/
I have not read that one yet. But I know about the author, from having come across his PDF tools written in OCaml, called CamlPDF, earlier.
https://github.com/johnwhitington/camlpdf
>CamlPDF is an OCaml library for reading, writing and modifying PDF files. It is the basis of the "CPDF" command line tool and C/C++/Java/Python/.NET/JavaScript API, which is available at http://www.coherentpdf.com/.
My problem with OCaml is just that there is no stepping debugger for VScode. I'd use it except for that.
Yes
Symbolic debuggers seem to be going out of fashion.
It's my understanding that OCaml does not allow its programs to specify the size and signedness of its ints, so no 16-bit unsigned, 32-bit signed, etc...
Being a huge fan of F# v2 who has ditched all MS products, I didn't think OCaml was able to be systems-level because its integer vars can't be precisely specified.
I'd love to know if I'm wrong about this. Anyone?
You’re wrong, not sure where you got that conception but the int32/64 distinction is in the core language, with numerous libraries (eg stdint, integers) providing the full spectrum.
7 replies →
The modules Int64 and Int32 are part of the OCaml standard library. You mentioned in your comments that Dune or Jane Street libraries are needed for this functionality; they are in fact part of the standard library, part of core OCaml development. You can, for example, even use Bigarrays with these types, and with int8, int16, signed, unsigned... you even have platform-native signed integers (32 bits on 32-bit architectures, 64 bits on 64-bit architectures) via Bigarray.nativeint_elt, as part of the standard library. So all these types are there.
You also mention that Int32 and Int64 are recent; however, these modules were already part of OCaml in the 4.x versions of the compiler and standard library (we are now at 5.3).
Note that in OCaml you can also use C libraries, and it is quite common to manage Int32, Int64, signed, etc. that way.
> F# v2
What does that mean?
1 reply →
What is the ML programming language? They say OCaml is the same thing with a different name; is that true?
https://en.m.wikipedia.org/wiki/Standard_ML
Can a systems programming language use garbage collection? I don't think so.
You'd be surprised.
In the 1980s, complete workstations were written in Lisp down to the lowest level code. With garbage collection of course. Operating system written in Lisp, application software written in Lisp, etc.
Symbolics Lisp Machine
https://www.chai.uni-hamburg.de/~moeller/symbolics-info/fami...
LMI Lambda http://images.computerhistory.org/revonline/images/500004885...
We're talking about commercial, production-quality, expensive machines. These machines had important software like 3D design software, CAD/CAM software, etc. And very, very advanced OS. You could inspect (step into) a function, then into the standard library, and then you could keep stepping into and into until you ended up looking at the operating system code.
The OS code, being dynamically linked, could be changed at runtime.
My two recommendations are easily Nim and Zig.
If you want something that is essentially just a modernized C, go with Zig. The concept of compile-time programming having the same appearance as runtime programming is very cool in my opinion. My only major complaint at the moment is that duck typing is fairly prevalent. Sometimes function arguments are declared `anytype` and you occasionally have to dive down multiple function calls to figure out what's going on, though that's not too much of a hindrance in practice, in my experience.
My personal favorite language is Nim. Efficient, but simple, memory management (drawing from C++/Rust). You rarely have to think too hard about it, yet making fast programs is not complicated. You can stick to the stack when you want to. The flexibility at compile-time gives you great power (but it requires great responsibility -- easy to abuse in a bad way). The type system is awesome. The only downside for me is the tooling. The LSP needs much optimization, for example.
My issue with Nim is its import system. If you have a function "foo", it's hard to tell where it's imported from. I'm not sure why this bothers me when C is the same... probably because by now I'm familiar with which header defines any given C function.
Also, I believe high-level compiled languages suffer from the fact that it is very hard to tell which construct is expensive and which is a zero-cost abstraction. Rust has the same issue, but "zero-cost" is a major feature of the language so you don't feel bad using an Iterator, for example, in kernel code. With Nim it is hard to tell.
It makes logical sense to do imports that way when operator overloading exists. Otherwise your custom operators would have to be written with explicit module qualification, which is very ugly. At that point, we might as well just go with the plain function-name approach that languages like Go take.
I suppose you could change it so operators are imported into the same namespace, and non-operators still require a separate namespace when referred to. But that makes it even more complicated in my opinion. I agree it's less obvious what's coming from where, but I think when your libraries have distinct responsibilities, it usually ends up being pretty straight-forward what function comes from where based on how it's named (if it's written well).
I find the type system in Nim to be pretty poor. It's difficult to reason about what is on the stack vs heap by looking at the business logic and not the types themselves, and also hard to reason about when you do copies vs pointers, since everything is defined on the type itself. I find it to be a bad design decision, I wouldn't build anything large with it.
> The concept of compile-time programming having the same appearance as runtime programming is very cool in my opinion
https://tour.dlang.org/tour/en/gems/compile-time-function-ev...
>The concept of compile-time programming having the same appearance as runtime programming is very cool in my opinion.
You mean, something that Lisp does since the early 1980s?
I didn't say it was novel. It's just not something you see in modern languages.
Should that make it uncool?
> You mean, something that Lisp does since the early 1980s?
Um, no. Debugging a macro in Lisp is a terrible experience while debugging a comptime function in Zig is brain dead simple.
Zig is the first "macro" system I've used that doesn't want to make me blow my brains out when I need to debug it.
Yes, I think that's what they mean.
Loving Ada without using exceptions or inheritance, on embedded and desktop. Some love Ada's full OOP tagged types; I love Ada's procedural style with privacy and abstract data types. I wish Flutter were written in Ada, but at least Dart is better than JavaScript, at least for procedural code without its OOP boilerplate. You don't actually need OOP for widgets.
I'm a big fan of Ada. I first encountered exceptions in Ada. When I first saw Python, way back in version 1.5, I was happy to see exceptions.
But is Dart better than Typescript? I prefer Typescript for multiple reasons but one of them is that you don't have to use classes to use the advanced typing system. Without a typing system I like Ruby the most, but sometimes we just need a typing system.
Dart is better in some ways and worse in others.
1. It has an actually sound type system.
2. The language and standard library are waaaaaaaay ahead of Javascript.
3. The tooling is top notch. Better than JS/TS.
But on the other hand:
4. Way smaller ecosystem.
5. Debugging is worse if you're compiling to JS. The fact that the code you run is basically identical to the code you write in TS can be a big advantage. Only really applies for web pages though.
6. Type unions are way nicer in TS.
7. Non-nullable types interact badly with classes. It can make writing methods correctly really awkward - you have to explicitly copy member variables to locals, modify them and then write them back.
8. Way smaller community.
3 replies →
> I wish Flutter were written in Ada, but at least Dart is better than JavaScript, at least for procedural code without its OOP boilerplate. You don't actually need OOP for widgets.
You can use other libraries for this like Riverpod with flutter_hooks and functional_widget which essentially removes the OOP structure of widgets and turns them more into functions, in a way.
I second that. From 8-bit to 64-bit systems, embedded or systems programming, Ada is the best choice. I've saved tons of hours, headaches, etc. with this gem. Search GitHub for Sowebio Adel for a good setup manual and, in the same repo, v22 for a good general-purpose KISS framework...
What are you using Ada for?
From embedded 8- and 32-bit microcontrollers to web-based Linux ERP/CRM software. Ada can be used for anything, with the speed of C/C++ but in a far more readable and safer way... Ada is a secret weapon. Don't spread this info ;)
Free Pascal, but I am interested in Ada and will be learning it more this year. I love the readability of the syntax, and on the outside looking in, the community seems good.
I have also moved back hard to using TCL as my scripting language. I like it too much, and bouncing between Python, Go, and such for DevOps glue tires me out.
For systems, I love using plan9 (9front) to solve problems, which grounds me to C, awk, sed, and the rc shell.
That would be a mix of D, Object Pascal, Swift, Ada, C#, and Java.
A few decades ago plenty of Oberon dialects.
As a language geek, I randomly select languages when doing hobby coding.
Regarding the remark about Go: even if I dislike Go's authors' decisions, back in my day writing compilers, linkers, firmware, networking stacks, and OS services was considered systems programming.
Likewise, the .NET team has been working wonders catching up to what C# 1.0 should have been for low-level code, given its Delphi lineage.
Java, in the context of Android, is the whole userspace, including drivers; there is very little of the system exposed in the NDK. Vulkan is one of the few things not exposed to the Java side, and that is being fixed with a WebGPU-like API in an upcoming version.
What are your thoughts on D? My experience is limited but seems like a very underrated language.
I started using it recently for a prototype of something I'll eventually rewrite in C++ at work. I really like it.
Discarding the preprocessor and replacing it with a proper module system is huge. I got burnt by templates and horrifying compile times in C++, but haven't had any problems with D templates. The module system makes templates feel much more natural to use. The syntax for templates is a huge improvement, and throwing `static if` into the mix results in concise and easy-to-read code.
I also quickly realized (with the help of some people on the D discord) that the garbage collector is fine for my needs. So I don't have to spend any time thinking about memory management... put stuff on the stack when I can for speed, otherwise just GC and don't think about it. I think there may be some issue with multithreading and the GC, but this is supposed to get fixed with the new GC that's on the way.
There are a few other nice QOL improvements. Getting rid of `->` is honestly worth its weight in gold. There's nothing difficult about forgetting to change a `.` to a `->` or vice versa in C++, but not having to trip over it periodically when you're compiling makes the language that much smoother. I was also initially confused by the `inout` keyword but have come to really like that, as well. Little niceties like `const(T[])` are small but, again, reducing just a little bit of friction like this across the language makes D much, much more pleasant to deal with than C++.
I think the main challenge the language is facing right now is that it's huge and a lot of it is still getting worked out. I never thought I'd pine for C++'s "rule of 3/5/0", but it's a lot tighter and more logically consistent than the equivalent in D. But part of that is there being a huge community of C++ developers who have taken the time to promulgate rules of thumb in the community. I'd kill for an "Effective D" book to short-circuit some of this process... after all, I'm trying to write code, not play at the margins, tinkering with D's idiosyncrasies.
4 replies →
I've been playing with it hacking a compiler written in C++ to be sort of transliterated to D. Just to see if it then makes the compiler easier to read, while not worrying about the performance yet.
So far in converting the lexer it does make it more comprehensible, it will probably do the same for the parser and AST. The real interesting bit will be once I tackle the later stages.
C99 ;) ...compared to 'popular C' (which is essentially C89 plus some common extensions taken from early C++) C99's main improvements (designated initialization and compound literals) haven't really caught on yet even among many C programmers, but those features (IMHO) completely revolutionize the language, and especially library API design.
Also on a more serious note: I started some projects in Zig and even though most of my future projects will be built on a bedrock of C code, more and more of the top-level layers will happen in Zig.
There it is again, the urge to port my Lisp back to C.
https://github.com/codr7/eli
What I love most about C is the fact that it doesn't talk down to me no matter what crazy ideas I come up with. It's therapeutic for me, reminds me why I started writing code in the first place.
I realize that's also what many hate about it, the fact that it gives other people freedoms they would never trust themselves with.
Designated initialisers and compound literals have certainly caught on; one just has to know where to look. The code I have in mind is around four years old.
For the latter, one could theoretically avoid declaring the variables 'event' and 'rg_match' by including the compound literals directly in the respective function calls. However, it is a question of taste and of what is more readable.
(The above use designated initialisers; I can't remember whether there are any compound literal examples there. There is however one where the BSTR_K macro is expanded, and also the earlier BSTR_INIT.)
I remember reading this some time ago : https://floooh.github.io/2019/09/27/modern-c-for-cpp-peeps.h...
I do use those so thank you :)
C#. While a popular language, it is criminally overlooked for high-performance programming. Obviously, you can't use it for embedded or kernel development. For other use cases though, it can almost reach the performance of C/C++/Rust when written with proper care.
> Obviously, you can't use it for embedded
Embedded is diverse. I would not use .NET for small embedded, i.e. stuff running on Arduino or ESP32.
However, I have successfully used .NET runtime in production for embedded software running on top of more performant SoCs, like 4 ARMv7 cores, couple GB RAM, Linux kernel. The software still has large pieces written in C and C++ (e.g. NanoVG rendering library) but all higher-level stuff like networking, file handling, and GUI are in memory-safe C#.
You surely can use it for embedded,
https://learn.microsoft.com/en-us/archive/msdn-magazine/2015...
https://www.ghielectronics.com/netmf/
https://www.ghielectronics.com/sitcore/
https://www.wildernesslabs.co/
I sometimes write C# in my day job. But I think I don't know much about how to write really fast C#. Do you have any recommendations for learning resources on that topic?
Sure. Here are some resources:
* Span<T>: https://learn.microsoft.com/en-us/archive/msdn-magazine/2018...
* C# now has a limited borrow checker-like mechanism to safely handle local references: https://em-tg.github.io/csborrow/
* Here is a series of articles on the topic: https://www.stevejgordon.co.uk/writing-high-performance-csha...
* In general, avoid enterprise-style C# (i.e., lots of classes and design patterns) and features like LINQ that allocate a lot of temporaries.
21 replies →
Span<T>, ReadOnlySpan<T>, Memory<T>, CollectionsMarshal, CollectionsExtensions, ref struct, ref return, ArrayPool, ArraySegment, ValueTuple, and using interfaces/structs/generics carefully.
That is if you don't want to get into unsafe code.
A few important ones:
- Avoid memory allocations as much as you can; that's the primary thing. For example, case-insensitive string comparisons using "a.ToUpper() == b.ToUpper()" in a tight loop are a performance disaster, when "string.Equals(a, b, StringComparison.CurrentCultureIgnoreCase)" is readily available.
- Do not use string concatenation (which allocates); prefer StringBuilder instead.
- Generally, remember that any string operation (such as extracting a substring) means allocating a new string. Instead, use methods that return a Span over the original string; in the case of mystr.Substring(4, 6), that would be mystr.AsSpan(4, 6).
- Beware of some combinations of LINQ methods; for example, "collection.Where(condition).First()" is faster than "collection.First(condition)", etc.
Apart from that (which concerns strings specifically, as they're a great source of performance issues), all generic best practices applicable to any language should be followed.
There are plenty resources on the net, just search for it.
You actually can use it for embedded and kernel development! See .NET Nano Framework [1] for embedded - works on microcontrollers like ESP32. For kernel development there's nothing really built in to support it but people have built tools [2] to do it.
[1] https://nanoframework.net/ [2] https://gocosmos.org/
Pour one out for Midori, which would have replaced Windows with a capability-based OS completely written from kernel to shell in a C# dialect. Async/await, spans, and immutable support came from it, along with an (opt-in) Rust-like borrow checker. Satya canceled it, and all the work was lost to history. Singularity was the early public prototype.
4 replies →
And arguably it beats the performance of C/C++/Rust when written without proper care: https://blog.codinghorror.com/on-managed-code-performance-ag...
The big take-away I got from this (admittedly quite old now) experiment is that getting advertised performance out of unmanaged languages for typical real-world (i.e., non-benchmark) tasks often requires a lot more care than people really account for. Nowadays memory dominates performance more so than CPU, and the combination of a JIT compiler and a good generational, compacting garbage collector - like C# and Java developers typically enjoy - often does a better job of turning idiomatic, non-hand-optimized code into something that minimizes walks of shame to the RAM chips.
Well in that case, Java :)
I've been having a lot of fun with Java lately, the maturity of the language/implementation and libraries allows me to focus on the actual problem I'm solving in ways no other language can currently match.
https://github.com/codr7/tyred-java https://github.com/codr7/eli-java
[dead]
The only true "system programming" I've done was in Microsoft Macro Assembler, a product I grew to hate with a passion.
A non-answer, but tangentially relevant:
I once fiddled with Forth, but never actually accomplished anything with it.
Several OSs are written in Lisp; in some of them the difference between OS and application is a bit vague. At the time none of them were available to me to play with.
I discovered Oberon and fell in love. My first real programming language was Pascal, and Oberon is part of the same family. Oberon consisted of a compiler, operating system, user interface, application software, and tools, all self-hosted on Oberon. There was even an Oberon CPU at one time. But Oberon turned out to be just an academic curiosity, and wasn't available for any hardware I had access to anyway.
Have a look at https://github.com/rochus-keller/Oberon which runs on different operating systems and architectures. You can even generate platform-independent C which I e.g. used to port the Oberon System 3 (https://github.com/rochus-keller/OberonSystem3).
"Microsoft Macro Assembler, a product I grew to hate with a passion."
Turbo Assembler FTW :)
MASM was always horrible.
nasm has been lovely, but I haven't used it in 10+ years. https://github.com/netwide-assembler/nasm
2 replies →
We have had a great experience using Common Lisp [1] for our causal space-time systems digital twin [2]
[1] http://sbcl.org/
[2] https://graphmetrix.com/trinpod-server
I so envy people who manage to find interesting Common Lisp work, it's like we live in different dimensions.
There are many independent consultants working in Lisp.
Yes, it is rare.
Requires open minded middle management and that is rare.
2 replies →
I started using Idris a few years ago because the idea is fascinating: state machines in your type system, the size of a list tracked in the static type system even as the list size changes over time (pretty mind-blowing), etc.
But ultimately I realized that I’m not writing the type of software which requires such strict verification. If I was writing an internet protocol or something like that, I may reach for it again.
Similar boat. I've read about Idris (and been 'shown the door' enough times) and I love the idea of it, but sadly I haven't yet had any reason to use it.
I am currently contracted 3 days a week writing Zig. I can't say much because NDA, but I just love working with Zig almost every day. I think for the right projects, it is such a great choice for mission critical software.
You get the added benefit of being able to easily consume C libraries without much fuss. The fuss is in navigating the C APIs of decades old libraries that we all still depend on every day.
Do tell us sometime when you can in the future. It's always interesting to hear what Zig people are doing because they do some very weird stuff.
They wouldn't be using Zig otherwise. :)
In LuaJIT and Odin it is also easy to do FFI.
I write a fair bit of rust/c for my day job. Do you find zig easier than the ffi interface in Rust?
I maintain auto-generated Rust and Zig bindings for my C libraries (along with Odin-, Nim-, C3-, D- and Jai-bindings), and it's a difference like night and day (with Zig being near-perfect and Rust being near-worst-case - at least among the listed languages).
> Do you find zig easier than the ffi interface in Rust?
Yes, but it's mostly cultural.
Rust folks have a nasty habit of trying to "Rust-ify" bindings. And then proceed to only do the easy 80% of the job. So now you wind up debugging an incomplete set of bindings with strange abstractions and the wrapped library.
Zig folks suck in the header file and deal with the library as-is. That's less pretty, but it's also less complicated.
I've somehow avoided Rust, so I can only comment on what I see in the documentation.
In Zig, you can just import a C header. And as long as you have configured the source location in your `build.zig` file, off you go. Zig automatically generates bindings for you. Import the header and start coding.
This is all thanks to Zig's `translate-c` utility that is used under the hood.
Rust by contrast has a lot more steps required, including hand writing the function bindings.
F#! I’m in love with the language. It is my de facto pick for most things these days. Very expressive AND strongly typed. Being a part of the .NET ecosystem is also a plus.
I wouldn’t call F# a systems programming language, but it’s definitely on my list of things to properly try out at some point
Can you create desktop GUI apps with it?
Yes!
https://funcui.avaloniaui.net
https://github.com/fabulous-dev/Fabulous
(I'm sure there are more; these two are the ones I could recall off the top of my head)
Every now and then, Free Pascal with Lazarus, but the same bug being in the IDE for ten-plus years kind of annoys me: if I save a new project and then move any files around, or rename a module, it does weird stuff.
There's also D, but finding libraries for whatever I want to work on proves problematic at times as well.
On the other hand, the Ultibo OS for the Raspberry Pi is written in FreePascal.
Using Elixir and Elm at my day job.
Coming from a more Python/Java/PHP/JS background, Elixir was a lot easier to pick up and doesn't frustrate me as much. Most of the remaining scary bits involve concurrency and process supervision trees.
Macros are powerful, but also easy to use in a way that makes everything hard to debug. For those unfamiliar with them, it's a bit like a function except any expressions you call it with are not evaluated first, but arrive as metadata that can be used to assemble and run new code.
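A rough analogue in Python (not Elixir, and Python has no real macros; this just illustrates the "arguments arrive as data" idea) using the `ast` module:

```python
import ast

# Instead of evaluating "2 + 3", receive it as a syntax tree,
# inspect it, and rewrite it before any code runs -- the same
# "code as data" idea behind Elixir macros.
tree = ast.parse("2 + 3", mode="eval")
assert isinstance(tree.body, ast.BinOp)
assert isinstance(tree.body.op, ast.Add)

# Transform the expression: swap Add for Mult, then compile and run it.
tree.body.op = ast.Mult()
result = eval(compile(ast.fix_missing_locations(tree), "<ast>", "eval"))
print(result)  # 6
```

In Elixir the compiler hands your macro this kind of tree automatically, which is also why heavy macro use can make code hard to debug: what runs is not what you see in the source.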
The question is about systems programming.
Why elm over LiveView?
I know “why” elm, I liked everything I saw about it, but how do you combine the two, if you do?
There's a bit of a struggle between sections that use just one or the other, but Elm has the managerial blessing right now.
While I think Elm is neat, it suffers from ecosystem issues. It drives a large amount of Not Invented Here, because JS invented somewhere else is hard to incorporate. Also, good luck rendering arbitrary HTML that comes in as data from somewhere else.
Does the lack of movement on Elm even in terms of bugfixes cause any issues? Maybe you use elm-janitor?
My major system programming languages are C and C++, but I did some projects in Oberon (which turned out to be not particularly suited for systems programming), and then created a modified, better-suited version of it called Oberon+ (https://github.com/rochus-keller/Oberon), which I e.g. used to create platform-independent versions of different Oberon System generations.
But Oberon+ is still too high-level for many system programming tasks. So I'm designing a new system programming language called Micron (for Micro Oberon, see https://github.com/micron-language/specification) which has the full power of C without its disadvantages. You can even use it for the OS boot sequence when there is no stack and no heap, but also for higher-level application development, due to its selectable language levels.
Ada
The open-source tooling has improved significantly in the five years since I started using it.
Not presently, but not long ago, Fortran and Ada. I still like Ada better than the alternatives, especially as it's changed this past couple decades. I find it hard to miss Fortran, though. I'd consider it for scientific computing and that's about it, which isn't my present domain.
Interesting, thanks.
Did you ever check out Eiffel for systems programming work?
I had been checking it out some years ago, and apart from the general points about it, one use of it that I found interesting was in an article about using it for creating HP printer drivers. The author had mentioned some concrete benefits that they found from using it for that purpose.
Edit: I searched for that article, and found it:
Eiffel for embedded systems at Hewlett-Packard:
https://archive.eiffel.com/eiffel/projects/hp/creel.html
I learned it once long ago, but never used it for anything other than that learning experience. I did like its concepts, though the language itself didn't quite stick with me.
How would Fortran be used other than numerics/scientific computing?
This was in an embedded systems context, I came on later but it was what most of the core system was written in. It's been used in a lot of avionics systems over the years.
Modern Fortran has ISO C bindings in its standard library. You can call any C library from Fortran and wrap it in a Fortran module if you want to make it easier to use.
Despite its history, it is a pretty modern language if you enable all warnings, set `implicit none`, and ignore the old style of coding (à la FORTRAN 77 or older).
These days, we have many better options, but back in the day, Fortran was also used for compilers (e.g., IBM's Fortran H), operating systems (such as PRIMOS[0] and LTSS[1]), symbolic computation (e.g., early Prolog implementations), and real-time control systems[2].
[0] https://en.wikipedia.org/wiki/PRIMOS
[1] https://en.wikipedia.org/wiki/Livermore_Time_Sharing_System
[2] https://webhome.weizmann.ac.il/home/fhlevins/RTF/RTF-TOC.htm...
Not a direct answer to your question, but its use in the domain you mentioned is itself huge.
From the Wikipedia article about Fortran, under the Science and Engineering section:
https://en.m.wikipedia.org/wiki/Fortran
Although a 1968 journal article by the authors of BASIC already described FORTRAN as "old-fashioned",[58] programs have been written in Fortran for many decades and there is a vast body of Fortran software in daily use throughout the scientific and engineering communities.[59] Jay Pasachoff wrote in 1984 that "physics and astronomy students simply have to learn FORTRAN. So much exists in FORTRAN that it seems unlikely that scientists will change to Pascal, Modula-2, or whatever."[60] In 1993, Cecil E. Leith called FORTRAN the "mother tongue of scientific computing", adding that its replacement by any other possible language "may remain a forlorn hope".[61]
It is the primary language for some of the most intensive super-computing tasks, such as in astronomy, climate modeling, computational chemistry, computational economics, computational fluid dynamics, computational physics, data analysis,[62] hydrological modeling, numerical linear algebra and numerical libraries (LAPACK, IMSL and NAG), optimization, satellite simulation, structural engineering, and weather prediction.[63] Many of the floating-point benchmarks to gauge the performance of new computer processors, such as the floating-point components of the SPEC benchmarks (e.g., CFP2006, CFP2017) are written in Fortran. Math algorithms are well documented in Numerical Recipes.
My first Fortran program was a tool I wrote to read 8" SS/SD CP/M floppies on a minicomputer. That was very easy to do, as the dialect had a couple of useful string extensions and the operating system had efficient ways of reading a floppy.
I recently dabbled in Hare, which was quite a nice experience.
I liked how the language stays pretty simple compared to other C replacements. The standard library is also pretty nice. It is, however, an extremely niche language, but still quite capable.
I really like the design choices they've made. Namely:
- Once you cut the legacy nonsense out of C, you can add a few nice modern features to your language and still end up with something that's smaller and simpler than C.
- Performance optimizations are possible. But by default, simplicity is always picked over performance. (i.e. most UB is eliminated, even if it hurts performance)
- A few basic pointer features go a long way toward eliminating most memory safety bugs. There are non-nullable pointers, ranges with automatic bounds checks, and no C strings.
- They get a lot of mileage out of their tagged union type. It allows for elegant implementations of algebraic types, polymorphism, and error handling.
- The error handling!
I was pretty excited about Hare until DeVault said that Hare wouldn't be doing multithreading, as he preferred multiprocessing. That was a pretty big dealbreaker for me. The rest of the language looks quite clean though!
hare-ev [0] is using epoll under the covers, which means multithreading is there, already. Especially as ev may be merged into the stdlib at some point.
[0] https://git.sr.ht/~sircmpwn/hare-ev
You could always link to pthread and use that in your Hare code, no?
Hare will not support Windows.
https://harelang.org/documentation/install/#supported-platfo...
Interesting reasons.
Makes sense
But why 8-character indents as the standard formatting for Hare programs? I notice that Odin seems to prefer 8-character indents as well. It seems like a real blow to readability for deeply nested code. Maybe you aren't supposed to write deeply nested code?
Squeak[1], Cuis. Metacircular Smalltalk VM[2] written in itself. We sometimes call it SqueakNOS for 'Squeak no operating system needed'.
[1] https://ftp.squeak.org/docs/OOPSLA.Squeak.html
[2] https://tinlizzie.org/VPRIPapers/
An FTP link for documents? That’s a real throwback
It is an HTML link and a subdomain name that is the same as a protocol.
Here's a PDF link:
https://dl.acm.org/doi/pdf/10.1145/263698.263754
Pascal.
Sure these days not many folks write OS kernel in Pascal, but there are some, e.g: https://github.com/torokernel/torokernel
I once wanted to try Forth (perhaps there's a Unix clone in Forth?), but it seems like most folks using it are embedded/hardware devs.
I had read somewhere that some of the early Apple (not Mac) software, i.e., systems, application or both, was written in some Pascal variant.
https://folklore.org/Hungarian.html:
“The Macintosh used the same Motorola 68000 microprocessor as its predecessor, the Lisa, and we wanted to leverage as much code written for Lisa as we could. But most of the Lisa code was written in the Pascal programming language. Since the Macintosh had much tighter memory constraints, we needed to write most of our system-oriented code in the most efficient way possible, using the native language of the processor, 68000 assembly language. Even so, we could still use Lisa code by hand translating the Pascal into assembly language.”
MacOS was clearly Pascal-oriented, with its ‘Str255’, ‘Str63’, etc. data types.
These days I write nearly all my code in Virgil (https://github.com/titzer/virgil).
It has features like classes, first-class functions, tuples, ADTs, unboxing, and a little data layout language, some unsafe features, like support for generating and integrating new machine code, and can talk directly to kernels.
Nim, I love its “make simple things simple and complex things possible” philosophy.
I absolutely adore Nim.
That said, the edges are still (very) rough when it comes to tooling (generics and macros absolutely murder Nimsuggest/lsp) and also "invisible" things impacting performance such as defect handling (--panics:on) and the way the different memory management schemes introduce different types of overhead even when working with purely stack allocated data.
But even with all that it's still an extremely pleasant and performant language to work with (when writing single threaded programs at least)
Definitely agree that there are rough edges, but Nim is in a better state than ever. The LSP isn't great yet, I'll agree with that. There are great optional type libraries for working around exceptions if you don't want them, and the new memory management system (ARC/ORC) is very efficient compared to the old refc implementation (now much more like the C++/Rust approach).
For parallel programming, there are also handy libraries. The best of which is Weave[1], but Malebolgia[2] is authored by the creator of Nim and works well in its own way too.
There is also active work being done on a new implementation of Nim which intends to clean up some of the long-term spaghetti that the current implementation has turned into (like most long-term projects do), called Nimony[3], and it is also led by the original creator of Nim. It is years away from production according to him, but is at least in the works.
I'd have to say Nim is by far my favorite programming language. The terseness, flexibility, and high performance, make it feel almost sci-fi to me. My only major complaint currently is the tooling, but even the tooling is still adequate. I'm glad it exists. Highly recommend.
[1] https://github.com/mratsim/weave
[2] https://github.com/Araq/malebolgia
[3] https://github.com/nim-lang/nimony
I really should continue my Nim series :( https://youtube.com/@nimward
Yes please!
Perl had somewhat the same, IIRC.
https://www.google.com/search?q=perl+simple+things+easy+and+...
Zig is coming along quite nicely. If you've not heard about Zig, take a look at https://ghostty.org/ (a terminal for Linux/Mac, with Windows in future), https://tigerbeetle.com (a database for financial accounting) and http://bun.sh (a modern, faster alternative to Node.js).
I’ve replaced Ruby as the “glue” language on my machine with Crystal. Being able to plop out a binary and not worry about the myriad things that can go wrong with needing the entire environment to be perfect, including reinstalling gems for every version, is such a relief. Bundler is just a frustrating sticky plaster over that.
I’d like to give Zig and Nim a go, but Go and Elixir are probably next on the list, simply because I have unread books for them staring at me.
I stopped buying tech books and started a Safari online subscription. Much better. Now I have just one thing staring at me (virtually) from the internet, instead of a dozen things staring at me from the bookshelf. It's less intrusive.
I almost exclusively work in Ada for my hobby projects these days; It's great for doing both high level and low level programming.
Where does tooling and platform support stand for Ada? Could one develop desktop, mobile, web apps, too using Ada? Thanks.
Tooling is pretty good; Ada has a package manager similar to Cargo (Alire: https://alire.ada.dev ), and you can install it pretty easily via GetAda ( https://getada.dev ), which brings in the compilers and any libraries you'd need. (Disclaimer: I'm the author of GetAda.)
Desktop apps: definitely. There's bindings for various UI toolkits like GTK and I know of a few people working on games in Ada, usually thick bindings like SDL: https://github.com/ada-game-framework and https://www.youtube.com/playlist?list=PLn3eTxaOtL2Oxl9HbNOhI... There's also Gnoga, which is similar to Electron for writing UI apps: https://github.com/Blady-Com/gnoga
A bunch of libraries for various drivers or other useful things on https://alire.ada.dev/crates.html (to include something like "ada_gui" in your ada project, you would just use alire, e.g. `alr with ada_gui`).
Much of Ada's webapp functionality is either interfacing with the Ada Web Server or gnoga (I've written a few servers using Ada Web Server, including one for an ".io" game).
There's an LLVM compiler which in theory can produce wasm but I've not messed with it: https://github.com/AdaCore/gnat-llvm
Mobile platforms can be targeted and cross-compiled in Alire, but I'm not sure who's doing it right now.
For anyone interested, I definitely recommend checking out some of the presentations of Ada's recent FOSDEM dev room https://fosdem.org/2025/schedule/track/ada/
Are there Ada jobs?
Your favourite job board will have a tag for Ada. There are jobs out there. Some in lower level things like finance, reliability testing, embedded software. Some in higher level things like gaming, AI, web services.
There are fewer, and they do tend to be more demanding, but they certainly exist.
Huntsville Alabama has some
Ada for bigger projects, D for quick one-offs and more “scripty” work.
I had played around with D some time ago, and wrote some small programs in it for fun and learning. I both liked and disliked things about the language.
There was some Russian dev running a systems tech company (I forget his name), living in Thailand, in Koh Samui or a similar place. He used D for his work, which was software products. I came across him on the net and saw a couple of his posts about D.
One was titled "Why D", and the other "D as a scripting language".
I thought both were good.
It’s a little like Go in that it compiles quickly enough to replace scripts while still yielding good enough performance for a lot of systems tasks. It predates Go, and I wish Google had just supported D; it’s a much nicer language IMO.
What are you using Ada for?
Fun side projects mostly, my GH username is the same as here if you’re (morbidly) curious.
Julia https://julialang.org/
I don't know if Julia is a system programming language
It's quite funny to classify it as such, given you need to run your programs like a script as it's nearly impossible to compile a binary you can distribute (though I am aware they're working on this as a priority task, currently).
It's certainly not a traditional one, but it is increasingly used as one https://arxiv.org/abs/2502.01128.
I don't work in Seed7 by Thomas Mertes but it deserves to be better known.
https://en.wikipedia.org/wiki/Seed7
It has a SourceForge page that actually doesn't suck and that you won't hate landing on, unlike almost anything else on SourceForge:
https://seed7.sourceforge.net/
Though there is an old school SourceForge file area with tarballs, the project page also links to a GitHub repo.
Nim. Fantastic choice for modern headless software. Simple obvious type system, preference for immutability and referential transparency. Dynamic collections are by default managed by hidden unique pointers on the stack. So the default RC isn't necessary unless explicitly invoked for a ref type.
Currently solo managing a 30k line data analysis application I built for my company. Easily fits in my head given the obvious pyramidal functional-like structure. Maybe two lines of memory semantics anywhere in the entire thing, and only one module that's OO with a constrained scope. Lots of static data files (style sheets, fonts) slurped up as const strings at compile time. Incredible performance. Invoked by our PHP server backend, so instead of doing parallel or async in the analysis, the server gets that through batch invocation.
Working stupid well for our product, plus I can easily compile binaries that run on ARM and RISC-V chips for our embedded team just by invoking the proper gcc backend.
Replaced an ailing and deliberately obfuscated 20 year old jumble of C and PHP designed to extort an IP settlement from my company. Did it in a year.
Do you have any recommendations for well-designed open-source Nim projects someone could study to get a feel for the language?
Anything written by Treeform[1] is a good place to start; their libraries make up a big chunk of the Nim ecosystem.
1 - https://github.com/treeform/hobby
Honestly hard to say. There are a number of styles of architecting Nim libraries and programs, and almost none match my own. My most particular criticism of the Nim ecosystem is the abuse of macros: There are a number of libraries implementing huge chunks of functionality behind macros such that code paths only appear at compile time and are not reflected in sources. Some libraries constrain macro use, but many are built entirely out of macros. I'd say to avoid looking to those examples.
OK, here's a pretty niche blast from the past: the Boo programming language. It ran on the CLR (.NET) and had syntax similar to Python. I recall using it back around 2006–2008 because it offered scripting features for .NET on Windows.
https://boo-language.github.io/ "A scarily powerful language for .Net". I didn't use it for too long before switching to IronPython.
I remember reading about the Boo language and IronPython some years ago. Do you still use IronPython?
I do not.
These days I would reach for a shell script for general scripting, filling in the gaps with maybe a C# console app or something in Common Lisp if I want/need some interactivity.
Something that happens pretty frequently is I'll take information I've written into an emacs org doc and run it through a CL function, whose output could be an org mode table which I can from there export to a different document format if necessary.
I do systems programming in i386 (32-bit) assembly language with NASM.
For me it doesn't scale beyond a few dozen kilobytes (executable program file size) per program. For others (such as Chris Sawyer) assembly scales much better.
Did you take a look at fasm [0]? It has nice capabilities.
[0] : https://flatassembler.net/
fasm is indeed great. It has many features, it can do all the code size optimizations, it even knows the sizes of variables (e.g. `mov myvar, 5` depends on `myvar db 0` vs `myvar dw 0`). NASM and fasm syntax are quite similar.
NASM supports more output file formats (i.e. .o files for many systems), and it can receive macro definitions from the command line (e.g. `nasm -DDEBUG`).
If the support was still there I'd still be using VB.NET.
I've coded professionally in a dozen languages, including a lot of time in x86 assembler, C++ etc.
Still like VB.NET better than any other. To me, it was the most readable code.
You may like Python! At least the syntax.
Is it not still supported?
It is, but it's really on life support. It's supported for legacy development for the most part. There are so few people coding in it now that you'll never see any example or tutorial .NET code in VB.NET.
It is still supported and developed.
I think that it depends on the system.
Firmware is probably still best done in C (sometimes, C++), mostly because so many SDKs, libraries, and toolkits are done in those languages. Also, there's decades of "prior art" in C. Lots of places to look for solutions.
I worked on a project, where we tried using a very "less-popular" language for firmware.
It didn't end well.
I'd say that being a "Systems Programmer" means that you are operating at a fairly "advanced" level, where the benefits of "safer" languages may be less of a factor, and the power of more "dangerous" languages is more attractive.
Of course, on HN, suggesting C or C++ is suggesting "less popular" languages...
Though we see the "fairly advanced level" of C and C++ leading to 70 percent of the vulnerabilities in the Chromium project. So I wouldn't bet on anyone's advanced level.
Good, safe code tends to come from Discipline, Humility, and Thoroughness.
I've known many highly experienced and intelligent software devs that are terrible at that stuff, and are like coding time bombs.
I highly doubt they're less popular, it's just that most people who use them aren't as vocal as their detractors.
This is my thought, also (it was basically a joke).
Another "unpopular" language, is PHP: https://w3techs.com/technologies/history_overview/programmin...
Tried Zig, but was baffled by all the allocator dance you need to do, and by having to ask nicely to access a list (catching potential errors?). Tried Odin, but the tooling is very raw. Tried Rust, didn't want to try to please a borrow checker that distracts me from my thoughts.
Idk, if someone just reinvented a clean C, without the nonsense garbage, with some modules and a package manager, that would be a huge win. Let me access my null pointers, let me leak memory, just get the hell out of my way and let me program, and hold my hand only where I want it held: sane types that give me refactoring, code completion, and code understanding; modules with imports. Let the compiler give sane error messages instead of this cryptic C++ garbage. Is this too much to ask?
D's "Better C"[1] mode looks like what you describe: syntax similar to C with a real module system, metaprogramming, slice types, etc.
1 - https://dlang.org/spec/betterc.html
I also had a brief look at Zig for writing a WASM module, but settled for Rust. I had no real gripes with the language, but the spartan documentation made making progress into a slog.
I wouldn't mind a "better C" that could use an LLM for static code analysis while I was coding. I.e. be more strict about typing, perhaps. Get out of my way, but please inform me if I need more coffee.
Allocation in Zig takes some getting used to but it's actually really nice. It took me a few weeks but I honestly believe you should give it another chance and more time
I personally find it much more ergonomic to have the allocator attached to the type (as in Ada). Aside from the obvious benefit of not needing to explicitly pass around your allocator everywhere, it also comes with a few other benefits:
- It becomes impossible to call the wrong deallocation procedure.
- Deallocation can happen when the type (or allocator) goes out of scope, preventing dangling pointers as you can't have a pointer type in scope when the original type is out of scope.
This probably goes against Zig's design goal of making everything explicit, but I think that they take that too far in many ways.
The same can be said about the borrow checker.
Looks like Zig is exactly what you want. The difference is only in the std: C prefers a global allocator, while Zig asks for it explicitly.
So, if only there were a std with implicit allocators?
C's compilation unit model, lack of a formal module system and lack of language-level package management are the best things about it.
Separating interface and implementation is a good thing, but often you just want to split things into separate files without separate compilation. C supports #include and so it is maximally flexible.
Common Lisp. It offers powerful abstractions and high speed. I’m happy with it.
I do as well. I've replaced a lot of my production bash scripts with Lisp.
Pretty much all SBCL.
Seconding CL. For my personality, purposes, and preferences it's the closest thing to a perfect language.
+1, it is my go-to language whenever I have no idea how complex the task will get
Commenting after seeing multiple comments here, after about a day.
First of all, thanks to all who replied. That's a wealth of info to follow up on.
Referring to the comments seen so far:
I considered mentioning (Free) Pascal, but decided not to, because I thought it is nowadays too niche, even though it is one of my early programming-language loves (forgetting that the title of my post says "less popular languages" :)
And I also didn't think of Ada at all, somehow, although I have been interested in it too lately, and have been checking out websites and blogs about it, and also searching hn.algolia.com for posts about it.
so it was cool to see multiple mentions of Ada here, by people who like and use it.
> so it was cool to see multiple mentions of Ada here, by people who like and use it.
I'm surprised how popular Ada is here in these comments. I like some of the ideas (ranged types, for example) in Ada, I'm inspired to give it a try after seeing all the comments here.
heh. same.
D and Crystal always fascinate me. And if Go is a system language, Erlang and Common Lisp are even more so.
I wish Crystal had better IDE support, otherwise it’s just about perfect.
I wish we lived in a world where Crystal dominated over Go, but we're far from being there.
Admittedly, the slowness of the compiler (due to the nature of the language) and the lack of better tooling are not helping, but 9 times out of 10 I enjoy writing Crystal way more than Go.
I think Crystal is going to need much more community support (articles, tutorials, blogs, community) and corporate sponsorship for it to even thrive in today's environment where we have an abundance of choices.
IIRC, I read somewhere, several months ago, that its type inference made it slow to compile anything but small programs?
I like Erlang a lot.
Me too buddy, super powerful, syntax is a little weird but once you get used to it..
gen servers, everywhere.
Using the `mcl` DSL language in https://github.com/purpleidea/mgmt/
It's awesome. But I'm biased because I designed it.
You can't build everything, but you can build many things much more easily. Particularly distributed systems.
I will put in a plug for Mercury: https://mercurylang.org/
I read a while ago, when checking out Prince XML (a high-end HTML-to-PDF conversion tool), that it is written using Mercury.
https://www.princexml.com/
Wow, I haven’t heard about that language in a long time. What do you use it for?
Looks like Prolog.
I wonder what the major differences are.
I like nim so far, but I have to admit I haven't done all that much with it yet.
Nim is great, I wrote a crypto trading engine with it. The performance is excellent, memory safety works well, and it was much easier to write compared to Rust.
What kind of profits are you seeing with it?
Perl is kinda less popular now. I use it at work. I used to write Perl 6/Raku in my previous job; I loved how the grammars made a nice way to try and write an nginx configuration manager.
Perl here, too.
We still use it for all kinds of web services development work, mainly because there's years of in-house modules for everything and the malleability Perl has.
As the founder of r/altprog on Reddit (been following random languages for 12 years now), my favorite "alt" language is Nim. It feels like Python & JavaScript had a baby that was C++. I wish it had lambda operators like C# and JS, but it does have the cool feature of letting you define your own language constructs.
Also, shoutouts to Zig, Crystal, and Ballerina: those are other interesting ones off the top of my head, that folks should look into.
I've been using Zig for nearly 4 years now. A lot of changes in that period were not great, but I haven't really wanted to use anything else.
I have been watching with interest. I can't help but think Rust will easily win. Zig isn't different enough, and it's somewhat opinionated (in good ways but not always clearly better)
I just looked into Zig and it looks great on first glance. What recent changes were not great in your opinion?
I don't know if Pike counts as a systems language, but I consider it an alternative to C, if only because it has good C integration, so you can easily include a module written in C. Pike's syntax is also very close to C, which may be appealing to some (ironically, that's an aspect I don't really care about myself).
If the question of Go being a systems language is controversial, then Pike is even more so. I would situate Pike somewhere between Python and Go. Pike's major drawback is that it doesn't produce standalone executables.
The real question I'd like to ask is: what actually is a systems language?
I'm not very good at using it, but every now and then I try to do a small project in Chicken Scheme. Mostly I'm unsuccessful, but I enjoy the language a lot and have a great time using it.
I've been using Zig for years, and for the last year I've been using it at work. I've coded professionally in all the usual languages, but Zig does what I want much more easily.
Have a look at Hare. It's got some interesting bits [1]
Also C3 [2]
[1] https://harelang.org
[2] https://c3-lang.org
I'm a fan of C3. I like that it's not trying to be _too_ far removed, but adds enough to rid you of some of the tedious chores of C. The dev and their community are also really nice.
Nice to hear! I will definitely give it a go now. It was on my list.
Fuzztester here is asking about system languages. I see a lot of people suggesting things I'd consider non systems languages.
Yeah, but unfortunately 'systems programming language' is a bit vaguely defined. I'd call any language which can deliver a binary executable and which offers some degree of lower-level control (like getting the disassembly of a procedure or deliberately being able to stack-allocate things) systems languages, but others may have different ideas.
I'm looking forward to getting back to Zig soon, especially now that there is support for Asahi linux.
I like that for low level SoC stuff, there is now the packed struct, which makes register representation very nice to deal with, especially with the definable int types, although I'm often torn between a u1, bool and sometimes even a enum(u1) for certain flags. I tend to let the SoC documentation (naming convention) drive that decision.
Otherwise there is a lot of nice and simply designed language stuff in Zig that also takes me back to my C/asm days. My least favorite part is maybe multi-line string literals that look like comments; I prefer the Kotlin approach there.
I'd like to find a non-walled-garden Zig community if there are other Zig fans here, i.e. just a forum. Any tips on which editor to use? I'm tired of electrons being burned needlessly, and almost feel like I need to VM these modern dev tools.
Does https://ziggit.dev/ not cut it for you in terms of non-walled garden?
It seems to be good enough that I basically don't interact with the Zig Discord anymore.
I'm lurking so far, but looks like just what I was searching for. Thanks!
I’m doing all my Zig editing in Zed and it works great.
For version management I use mise (or scoop on Windows).
Thanks. I managed to find a build of Sublime that kinda works (menu bar issues), but I haven't figured out code completion; it just seems to spam recent tokens at me. I had a look at Zed, thanks, but didn't fancy the tight AI integration stuff. I'm a bit of a grumpy old man and like my tools totally offline.
1 reply →
Take a look at Pony https://www.ponylang.io/
Pony is fun and I love the actor paradigm but it definitely feels like the community lost a lot of energy when Sylvan Clebsch stopped working on it (to work on a similar project for MS).
I will, thanks.
MoonBit [0]
It's still being developed, but oh man, the language is good.
You read its documentation and pretty much every single thing is the right decision (from my PoV).
Beautiful language if you like OCaml, Rust. Primary target is wasm, but compiles to native as well.
[0] https://www.moonbitlang.com/
Odin. It's just too easy and fun.
Why is Odin easy for you? Because it is non-OOP (I think, have not confirmed that) or some other reason?
Build system, module system, simplicity of C but much nicer, clearer syntax, lots of batteries included, it just does a lot of stuff to make life easier versus Zig or C/C++.
I personally don't think programming paradigms like OOP, procedural or functional make anything easier/harder necessarily, just talking QoL stuff.
And obviously "easy" is relative: Odin is still a low level language where you need to manage your own memory.
1 reply →
Does anyone remember BlitzBasic / BlitzPlus / Blitz3D? They were my first programming languages. I loved how dead simple it was to spin up a DirectX-based 3D game or a GUI application. There was something very nice about a simple, performant, batteries-included programming environment.
I started with DarkBasic but moved to BlitzBasic after a few months. Wrote a couple of small games and really enjoyed just messing around with it - it felt just like when I started programming on my first computer.
I still use BlitzMax for game development (when I get time) - there's an updated version with some nice language additions and support for more architectures, including the Raspberry Pi: https://blitzmax.org/
I am using Haxe, which compiles to other languages (C++, JavaScript, PHP...). It is a nice language when you want to create a web application or even a CLI.
If you have played video games by Shiro Games (Evoland, Dune: Spice Wars) or Motion Twin (Dead Cells), or even Papers, Please, then you have been exposed to this language.
Raku, for scripting jobs. It is a huge language, done right IMO. It induces more of the "flow" phenomenon than any other language I've encountered (like playing a great video game).
Not write-only like its ancestor. So many language criticisms solved. A true pleasure.
Still in its function-first development phase but apparently near the end. AST rewrite is still underway, then the team will address performance.
Cython. Writes like Python, runs like C. Strangely underappreciated.
It certainly doesn't run like C. I once thought to port my JSM machine learning engine to Python, and it felt like Cython might be just what I needed. Simply put, it's tight loops doing bitwise ops on bit-vectors. In reality, no amount of adding type annotations would help; the thing was slower than C++ by an order of magnitude.
I've generally found it to be within a factor of 2 of hand-tuned C. (It's literally autogenerated C.) But implementation matters, and I doubt we're going to check your work here in the comments.
I've been using ReScript and ReasonML (derived from OCaml, originating from Jordan Walke) professionally for the better part of 7 years now instead of TypeScript/JavaScript, and couldn't be happier. I'm blissfully enjoying a fully sound type system with stronger type inference than TypeScript, without all the complex type juggling and null hell, plus an insanely good compiler with helpful errors, enormous speed and optimized JS output. It does have a learning curve (these days far less steep than it used to be), but the benefits are just so many in my eyes. TypeScript has come a long way, but it still struggles in comparison, and I never feel as safe writing it.
One downside is, of course, far less adoption: libs usually need ReScript bindings written for them, but that's fairly straightforward and not something I have to do very often.
Haskell + Copilot (from NASA) and/or MicroHs when targeting embedded (like the RPi Pico)
Very cool, was not aware of either of them. Thanks for sharing
https://github.com/copilot-language/copilot
https://github.com/augustss/MicroHs
Any good examples for these in systems projects?
Trying to make a game with https://c3-lang.org/, quite happy so far.
F# and Haxe. Love both of those languages
I've written a non-trivial (5K SLOC) app in Zig and it's very nice.
Chicken Scheme for personal projects.
I’m considering Mojo.
Oh, yeah! Haven't heard anything about it in the last 6 months. Any interesting developments?
Lots! [1]
The biggest thing to be added recently is GPU programming, which given Mojo's focus on ML/AI makes a lot of sense.
It's probably not the best language to look into for general purpose systems programming, but if you are going to be interacting with GPUs or other hardware then maybe it's good to give it a look.
It is still changing a lot, so no real stability yet, but to be expected for such a young language.
[1] https://docs.modular.com/mojo/changelog/
Me too, for projects and tools. Would like to use Ada for IoT projects and tools.
Lately I've been using: https://janet-lang.org/ It's not a systems programming language, but it can be embedded in C.
https://jank-lang.org/ looks interesting to me --I have not tried it yet. I'm not sure if this language could qualify as a systems programming language. What do you think?
Forth. Old but very versatile. Wrote the runtime myself years ago in portable C.
D language (Dlang). It is especially good if you are porting from C, as the semantics are similar enough to run a lot of code via copy and paste; if not, it will fail to compile.
I'm converting an old C codebase to Swift, though given Swift's non-support for mix-language targets I'm considering switching to Zig.
Unfortunately the Zig compiler crashes when building my project and I haven't looked into debugging the compiler to find out why. There's a lot of gnarly code in this project (based on dikumud from ~1989?) with many inexperienced hands touching it over the decades.
Ada and Scheme.
I’ve been using Odin and really enjoying it lately. In my free time I’ve been using it for gamedev and for some Python interop at work
I've been writing some Scala Native recently. See https://github.com/creativescala/terminus/. It's a high-level language but you can still reach down and grub about in memory if necessary. I'm having fun.
I still absolutely love my Elm. Never a programming language has made me as confident and joyful when writing code <3.
Are people using Elm for systems level programming? I have only used it on the front end.
Roc was inspired by Elm, and has CLI as one of its "platforms", which is systems in a loose sense. Early days for Roc, though there may be orgs using it in production.
An interesting takeaway from this is that it looks like Rust has really fallen off, in terms of popularity. There was a time when it would've topped these lists (and yes I know you mentioned it - I mean people would've mentioned it anyway). It seems like Nim has claimed 100% of its mindshare.
Rust hasn't fallen off, it's just largely considered mainstream now.
Some time ago, Rust had no viable replacement at all. If somebody came asking "hey, how can I replace Rust on this system level software" the only possible answer was "you don't".
Nowadays, alternatives exist, and so people can answer with one.
None of that has any meaning for the popularity of Rust or lack thereof.
The prompt explicitly says “not Rust”. So the answers don’t say Rust.
Lua: Picked it up when I was dabbling in building Games relying on Love2D which uses Lua as the underlying language.
CoffeeScript: Fell in love with CS as I wanted to rapidly prototype with the (now defunct) Framer Classic.
Smalltalk/Squeak/Vala are something I have wanted to dabble with for a while but haven't gotten around to.
I am using Ada atm. Not a "modern" language but I believe it might have a great future :)
The IEC 61131-3 languages, though 95% of my work is Structured Text. Anyone need a PLC programmed?
Does Cython count? I’ve been trying to learn more advanced usage. It’s pretty small and reasonably familiar.
I also have messed around with nim a little bit. I like it, I’m just not sure it’s worth putting a lot of effort into.
I don't think I would recommend using Cython outside of writing Python bindings. In my experience, the community is too small and the documentation is too lacking. Even writing bindings, I spent an inordinate amount of time debugging inscrutable compilation errors.
Playing with D, while reading up on Odin and the various Cyclone papers.
Elm. Gonna hold on to it for as long as possible because it’s fantastic for personal projects - drama free, functional, simple, typed and comes with batteries and great errors and tooling.
It hasn't been released yet, but I'm very excited for Carbon :)
I've used c and java, and have recently been thinking about go. It's interesting that the comments here only mention go in the negative. Can someone give me the back story about go?
I think Go is fine for application development (anything that runs on top of an OS).
But for systems programming, which is generally understood as developing an OS kernel (filesystem, memory management, device drivers, etc.) or embedded (where you build a mini OS), Go is not the proper choice (features like goroutines, AFAIK, need an OS). You'd want C/Pascal/Rust/Zig/<what else?> ...
I don't know if go counts as "systems programming" like the other commenter mentions.
But I have been recently using it for some tooling and small servers on personal projects where I'd have used python before.
Frankly it's been a joy and I wish I'd started earlier. The concurrency primitives are great, and the static binaries make deployment easy (raspberry pi in this case).
Struggle to use anything other than python professionally, the need to settle on a common denominator trumps pretty much everything else.
I personally mix languages, using higher level languages for the structural work and calling C for the code or data structures that require it.
So a good FFI to C has always been an important requirement for me.
Ada and Odin that I would consider less popular, rarely Forth.
OCaml and Factor, too, but I am not sure OCaml is not popular. Factor rarely, but I love it, too, just do not use it as much. I actually write more Factor than Forth.
Perl 6 / Raku. The swiss army chainsaw (Perl) raised to the power of the swiss army chainsaw. For its expressiveness in spite of some learning curve.
I had to take a look at some examples, because my last Perl work was over 15 years ago. This is neat:
The same thing in other languages would require a lot more code, without a parser module. An LLM tells me that functional languages can handle this stuff well too, but that Raku code is just extremely simple to grasp
So, to be fair, it's "extremely simple to grasp" when looked at superficially. Once you dig into it, there is a lot happening in there, both in what can go into an actual production grammar and in what is present in the parsing output, which is why I mentioned the learning curve.
But yeah, most stuff is easy, there is more than one way to do it, and the impossible isn't. Or something.
Lua in my 2D engine https://github.com/willtobyte/carimbo
Haxe and Dart (without Flutter) are quite nice.
D. It's quite C-like, but more concise, has a richer standard library, garbage collection, threading, etc. etc.
Ada (safety critical stuff) in work. Not a great fan, but it has its passionate defenders.
Cobol, Vimscript/VimL and Ada
I had a college prof who in his real job wrote -all the things- in Modula-2.
Rebol
Fortran. Had to use it for some obscure projects a while back.
It’s still kicking.
Micropython of course
I'll link to it because many people don't know a version of Python runs on microcontrollers:
https://micropython.org/
Visual Basic. I learned a lot from this language because I was able to create a system from scratch without importing a library.
My team hates it when I write POSIX shell.
D. Such a fantastic language.
J
By what metric is that a systems language?
Swift
I've been learning Elixir just for fun. I wish I was using it in my day job.
Elixir is not a systems programming language.
what do you understand as 'systems programming'?
there are people making operating systems for AMD64 in Pascal etc.... so there's plenty of choices, odd and even ones.
some examples of different interpretations of 'systems programming'.
low level systems code - like interacting with devices directly on bare metal. (mmio/io etc.)
kernel code - like writing a new subsystem in linux, which uses other drivers.
high-level systems - like game engines, automation frameworks, other high performance oriented systems-with-lot-of-subsystems?
These different domains, on different targets, might have more or less plausible options for you to try.
(kebab-use-elisp)
tcl
I agree, tcl is pretty sweet (if totally weird). I use it because it's the mother tongue of sqlite.
RPG
Dude, you're taking the easy way out. Please go purist and pull the wiring boards out of the closet.
[dead]
Python.
It's not a systems programming language, but I actually wrote a userland "device driver" in Python, for startup MVP pragmatic firefighting reasons.
It was somehow rock-solid in over a year of factory production overseas. Which might not have been the case, if I'd written it in C and put in the kernel, like might normally be good practice.
(It hooked into some kernel interfaces, did a little multiple USB device management, and low-level keyboard-like decoding with efficient I/O, and buffered and parsed and did something with the output of that.)
I have mixed feelings about Python: it often hurts more than it helps (if you know better ways to do things), but the ecosystem has some nice off-the-shelf components, and it's popular/employable. However, due to the popularity, the average quality of any article you might find through the Web is unfortunately low.
(For an unpopular language, you'll get a few people writing articles from a junior/neophyte knowledge level, as part of their learning, or because someone said it was good for resume-boosting. That can be good. But no one is going to waste time pounding SEO low-quality filler for a language that doesn't make money. Well, at least they wouldn't before LLMs, but who knows how the economics have changed, now. :)
C#, to match the performance of reference implementations in C and Rust, and completely crush the performance of those in Go :)
Why do you need to put down / compare Go in every one of your posts? Do you have insecurities about Go?
I was reading another post about someone showing some AI stuff and then: https://news.ycombinator.com/item?id=43246127
what do you mean by reference implementations, in this context?
All sorts of algorithms for processing data useful in high-load scenarios: checksum calculation, text searching/analysis/transformation, data compression, networking (request analysis, routing, filtering), interop with other C ABI dependencies, etc.
1 reply →
> Less popular
Bash.
> I used C for both application programming and systems programming
Gross. Learn C++, it's better than C in every way! shotsfired.jpg
> I've been wanting to get back to doing some systems programming, but preferably in a more modern language (than C) which is meant for that.
Use C++ then. Or if you're a hater and/or don't know how to keep footguns pointed away from your legs, use Rust.
> less commonly used ones
but tbqh why not Xojo?