
Comment by dzonga

12 hours ago

Good to see incredible stuff being shipped in Swift. Haven't used it since v3 though.

around 2015-17 - Swift could have easily dethroned Python.

it was simple enough - very fast - could plug into the C/C++ ecosystem. Hence all the numeric stuff people were doing in Python powered by C++ libraries could've been done with Swift.
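For context, the C interop really was (and is) nearly frictionless: Swift imports C declarations directly, with no binding generator. A minimal sketch, using only the system C library:

```swift
// Swift calls C functions directly via module imports --
// Darwin on Apple platforms, Glibc on Linux.
#if canImport(Darwin)
import Darwin
#else
import Glibc
#endif

// hypot and cbrt are plain C functions from libm,
// callable from Swift with no wrapper code at all.
let diagonal = hypot(3.0, 4.0)  // C's hypot(3, 4)
let root = cbrt(27.0)           // C's cbrt(27)

print(diagonal, root)
```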

the server ecosystem was starting to come to life, even supported by IBM.

I think the letdown was on the Apple side - they didn't bring in the community fast enough, whether on marketing or messaging. Unfortunately, Swift has remained largely an Apple ecosystem thing - with complexity now chasing C++.

> the server ecosystem was starting to come to life, even supported by IBM.

I was in college at the time and doing some odd freelance jobs to make some money. Unbeknownst to my clients, I was writing their website backends in Swift, using buildpacks on Heroku to get them hosted.

It was a fun time for me and I love Swift, but I will admit that last year I went ahead and rewrote one of those sites in good ol' TypeScript. I love Swift, but anything outside the Apple ecosystem just seems like it hasn't hit critical mass yet.

> Swift has remained largely an Apple ecosystem

Even today, with the fancy Swift 6.3, the experience of using Swift for anything other than apps for Apple platforms is very painful. There is also the question of trust - I don't think anyone would voluntarily introduce Apple "The Gatekeeper" in parts of their stack unless they're forced to do it.

  • You can use Swift on the server, but what for? You have gigantic ecosystems in languages X, Y, Z.

    Even Apple does not use Swift on the server (AFAIK) so why would you?

True. Google was even thinking of switching TensorFlow from Python to Swift.

https://github.com/tensorflow/swift

  • That’s really because Chris Lattner was at Google Brain at the time. Don’t think it ever took off in meaningful ways

  • I was enthusiastic about the early TensorFlow-in-Swift efforts, and sorry when the effort ended. My interest then flowed into early Mojo development for a while.

    I wrote an eBook on Swift several years ago but rarely update that book anymore. Count me as one of the many developers who for a while thought Swift would take over the world. At least Swift is a fun language to use, and now with LLM coding tools, writing macOS/iOS/iPadOS apps is fairly easy.

    • Funnily enough, I recently talked to someone working on the Swift compiler (not an Apple employee) to make Swift functions differentiable. So it's not all dead yet.

Python 3 barely managed to dethrone Python.

  • I'm sorry, that's absolutely bullshit. In fact, I wish we had left everyone who complained behind—the python community would have been happier and healthier for it. Absolute crybabies who wanted to be catered to without caring for how intractable the problems with python2 were—e.g. dealing with unicode was a royal pain in the ass, and the bytes/string divide completely fixed it. IMO, it was the best-executed breaking change I've ever witnessed in a language.

    In comparison, e.g. Scala 2 -> Scala 3 was an absolute nightmare—it just didn't have the same vocal wailing from maintainers in the community (or, I suppose, a fraction of Python's popularity to begin with).

    • Being too aggressive in breaking stuff gets you a shitshow like Node.js or Ruby. Long-term source-code compatibility is a very useful feature for open source and a sign of a mature ecosystem. Feel free to add stuff, but once it's part of a stable release it has to be maintained long after a "better" way to do it comes along.

      2 replies →

Python's interactive interpreter makes it pretty useful as a shell, for iterative development, and crucially useful in a Jupyter notebook. I've also found CircuitPython's interpreter to be bonkers useful in prototyping embedded projects. (This, on top of the nice datascience, ML, and NN libraries).

Swift just wasn't doing the same things. And even if it did, Swift would compete with other languages that were understood as "a better Python", like Julia. Even then, Swift only came to Linux in 2016, Windows in 2020, and FreeBSD less than a year ago with WWDC 2025.

I think it doesn't help that the mid 2010s saw a burst of Cool and New languages announced or go mainstream. Go, Julia, Rust, TypeScript, Solidity, etc. along with Swift. I think most of us only have space to pick up one or two of these cool-and-new languages every few years.

I had a similar journey with F# - the language looked excellent and I really wanted to make it one of my go-to languages, but every time I tried to use it I found bits that would only work on windows (especially around desktop apps). I finally just gave up, though I hear it has gotten better at being truly cross-platform these days.

  • Same, and not to mention reams of weird .NET errors that were never fixed and were hard to understand or speculate about.

> could plug into the C/C++ ecosystem. Hence all the numeric stuff people were doing in Python powered by C++ libraries could've been done with Swift.

In 2015-2017 you could interop with C; C++ support wasn't added until very recently.

I do agree with you though and I am not sure what the exact reasoning is, but Swift is definitely an Apple ecosystem language despite the random efforts to gain traction elsewhere.

> around 2015-17 - Swift could have easily dethroned Python.

Why could it?

> it was simple enough - very fast - could plug into the C/C++ ecosystem. Hence all the numeric stuff people were doing in Python powered by C++ libraries could've been done with Swift.

Half a dozen languages fit this description.

> the server ecosystem was starting to come to life, even supported by IBM.

No, not at all. Kitura and Vapor (a fitting name) were just toys that no serious player ever touched.

  • After that, and IBM losing interest, Apple did hire a few competent people (including contributors to Netty and Akka) to build the Swift Server Workgroup.

    But I don't know why I'd pick Swift on the server when Rust is better in almost every dimension, with a thriving and more community-driven ecosystem.

    • I think it's not about that but about dogfooding Swift on the server. Apple uses Go, Java etc for a lot of its server components and refused to invest in hiring people that would extend the ecosystem for server Swift.

      That's the problem.

      1 reply →

Maybe Chris Lattner leaving and creating Mojo also didn’t help in that regard.

Swift for TensorFlow was a cool idea in that time …

  • It remains to be seen how much Mojo has learnt from that effort.

    NVIDIA, AMD and Intel have now doubled down on giving Python and Julia GPU JITs with the same capabilities as their C++ CUDA, ROCm, and SYCL offerings.

    With Julia and Python having their 1.0 long behind them.

  • Lattner probably left because Apple didn't give the team any breathing room to properly implement the language. It was "we must have this feature yesterday". A lot of Swift is the equivalent of JavaScript's "we have 10 days to implement and ship it":

    https://youtu.be/ovYbgbrQ-v8?si=tAko6n88PmpWrzvO&t=1400

    --- start quote ---

    Swift has turned into a gigantic super complicated bag of special cases, special syntax, special stuff...

    We had a ton of users, it had a ton of internal technical debt... the whole team was behind, and instead of fixing the core, what the team did is they started adding all these special cases.

    --- end quote ---

    • For this language to become the default at Apple, they had to be doing a massive amount of internal promotion - in other words, they knew where it was going.

      And then if that's the case, how were they not ready to solve the many problems that a big organization would run into? And all the schedule constraints that come with it?

    • > Swift has turned into a gigantic super complicated bag of special cases, special syntax, special stuff...

      That's true, but only partly true. It already was a gigantic super complicated bag of special cases right from the start.

      Rob Rix noted the following 10 years ago:

      Swift is a crescendo of special cases stopping just short of the general; the result is complexity in the semantics, complexity in the behaviour (i.e. bugs), and complexity in use (i.e. workarounds).

      https://www.quora.com/Which-features-overcomplicate-Swift-Wh...

      Me, 2014:

      Apple's new Swift language has taken a page from the C++ and Java playbooks and made initialization a special case. Well, lots of special cases actually. The Swift book has 30 pages on initialization, and they aren't just illustration and explanation, they are dense with rules and special cases

      https://blog.metaobject.com/2014/06/remove-features-for-grea...

      Of course, that doesn't mean that it didn't get worse. It got a lot worse. For example (me again, 2020):

      I was really surprised to learn that Swift recently adopted Smalltalk keyword syntax ... Of course, Swift wouldn't be Swift if this weren't a special case of a special case, specifically the case of multiple trailing closures, which is a special case of trailing closures, which are weird and special-casey enough by themselves.

      https://blog.metaobject.com/2020/06/the-curious-case-of-swif...

      Oh, and Function Builders (2020, also me):

      A prediction I made was that these rules, despite or more likely because of their complexity, would not be sufficient. And that turned out to be correct, as predicted, people turned to workarounds, just like they did with C++ and Java constructors.

      https://blog.metaobject.com/2020/04/swift-initialization-swi...

      So it is true that it is now bad and that it has gotten worse. It's just not the case that it was ever simple to start with. And the further explosion of complexity was not some accidental thing that happened to what was otherwise a good beginning. That very explosion was already pretty much predetermined in the language as it existed from inception and in the values that were visible.

      From my exchange with Chris regarding initializers:

      "Chris Lattner said...

      Marcel, I totally agree with your simplicity goal, but this isn't practical unless you are willing to sacrifice non-default initializable types (e.g. non-nullable pointers) or memory safety."

      Part of my response:

      "Let me turn it around: Chris, I totally agree with your goal of initializable types, but it is just not practical unless you are willing to sacrifice simplicity, parsimony and power (and ignore the fact that it doesn't actually work)."

      Simplicity is not the easy option. Simplicity is hard. Swift took the easy route.

      [...] when you first attack a problem it seems really simple because you don't understand it. Then when you start to really understand it, you come up with these very complicated solutions because it's really hairy. Most people stop there. But a few people keep burning the midnight oil and finally understand the underlying principles of the problem and come up with an elegantly simple solution for it. But very few people go the distance to get there.

      -- Steve Jobs (borrowed and adapted from Heinlein)

      https://blog.metaobject.com/2014/04/sophisticated-simplicity...
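
      To make the "special case of a special case" concrete for readers who don't write Swift, here is a hedged sketch (hypothetical function, not an Apple API) of three call syntaxes that all mean the same thing:

      ```swift
      // Three equivalent ways to pass closures to the same function --
      // the layered special-casing the quote above describes.
      func animate(duration: Double, animations: () -> Void, completion: () -> Void) {
          animations()
          completion()
      }

      // 1. Plain call: closures as ordinary labeled arguments.
      animate(duration: 0.3, animations: { print("animating") },
              completion: { print("done") })

      // 2. Trailing closure: the final closure moves outside the
      //    parentheses and loses its label.
      animate(duration: 0.3, animations: { print("animating") }) {
          print("done")
      }

      // 3. Multiple trailing closures (Swift 5.3): the first trailing
      //    closure is unlabeled, the rest keep their labels.
      animate(duration: 0.3) {
          print("animating")
      } completion: {
          print("done")
      }
      ```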

    • To be fair, I think such a fate is inevitable for most languages after many years of changes and development.

> Swift could have easily dethroned Python.

Just IMO, but... no. To me a "could have easily" requires n-1 things to have happened, and 1 thing not happening. Like, we "could have easily" had a nuclear exchange with the USSR, were it not for the ONE Russian guy who decided to wait for more evidence. https://en.wikipedia.org/wiki/1983_Soviet_nuclear_false_alar...

But even in '15-'17, there were already too many people doing too many things with Python (the big shift to data orientation started in the mid/late 90s, which paved the way to ML and massive Python usage).

The 'n' was large, and not nearly all of the 'n' things were in Swift's favor then.

Again, IMO.

The thing people don't get about C++'s complexity is that complexity is unavoidable.

It is also there in Ada, C#, Java, Python, Common Lisp,....

Even if the languages started tiny, complexity eventually grows on them.

C23 + compiler extensions is quite far from where K&R C was.

Scheme R7 is quite far from where Scheme started.

Go's warts are directly related to ignoring history of growing pains from other ecosystems.

  • > Even if the languages started tiny, complexity eventually grows on them.

    And then of course there's the case that proves the opposite: Clojure. Sure, new ideas appear, but the core language is more or less unchanged since it was introduced, rock solid, and decades-old projects still run just fine, although usually a bit faster.

    • That is because Clojure is done; hardly anything is being worked on beyond what matters to NuBank and Datomic.

      Also, its market-share adoption kind of shows it.

      8 replies →

That's my read too.

Swift was feeling pretty exciting around ~v3. It was small and easy to learn, felt modern, and had solid interop with ObjC/C++.

...but then it absolutely exploded in complexity. New features and syntax thrown in make it feel like C++: 10 ways of doing the same thing. I wish they'd kept the language simple and lean, and wrapped additional complexity in optional packages. It just feels like only a small amount of what the Swift language does actually needs to be part of the language.

  • I get this feeling with C#. I have been here since its release. I looked at Swift, but the language moved so quickly at the beginning that the book I had to teach me was out of date moments after it was printed. With all the complexity being thrown in, I stuck with C++ because at least it was only one language I had to keep track of (barely)!

    • C# is the other direction, IMO.

      I've been using C# since the first release in 2003/4 timeline?

      Aside from a few high profile language features like LINQ, generics, `async/await`, the syntax has grown, but the key additions have made the language simpler to use and more terse. Tuples and destructuring for example. Spread operators for collections. Switch expressions and pattern matching. These are mostly syntactic affordances.

      You don't have to use any of them; you can write C# exactly as you wrote it in 2003...if you want to. But I'm not sure why one would forgo the improved terseness of modern C#.

      Next big language addition will be discriminated unions and even that is really "opt-in" if you want to use it.

      5 replies →

  • Which keywords would you get rid of and why? You don't have to use all of them!

    • I would remove result builders and all other uses of @attributes that change the semantics of the code (e.g property wrappers).

      I would remove the distinction between value types and reference types at the type level. This has caused so many bugs in my code. This distinction should be made where the types are used not where they are defined.

      I would remove everything related to concurrency from the language itself. The idea to let code execute on random threads without any explicit hint at the call site is ridiculous. It's far too complicated and error prone, which is why Swift designers had to radically change the defaults between Swift 6.0 and 6.2 and it's still a mess.

      I would remove properties that are really functions (and of course property wrappers). I want to see at the call site whether I'm calling a function or accessing a variable.

      I would probably remove async/await as well, but this is a broader debate beyond Swift.

      And yes you absolutely do have to know and use all features that a language has, especially if it's a corporate language where features are introduced in order to support platform APIs.

      1 reply →
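
      On the properties-that-are-really-functions point: a minimal sketch (illustrative type, not a real API) of why a Swift call site can't distinguish stored data from computation:

      ```swift
      // A computed property and a method can hide the same work;
      // only the call-site syntax differs.
      struct Thermometer {
          var samples: [Double] = [20.0, 22.0, 24.0]

          // Reads like stored data at the call site,
          // but runs code on every access.
          var average: Double {
              samples.reduce(0, +) / Double(samples.count)
          }

          // The same computation as an explicit function call.
          func computeAverage() -> Double {
              samples.reduce(0, +) / Double(samples.count)
          }
      }

      let t = Thermometer()
      print(t.average)          // looks like a field access
      print(t.computeAverage()) // visibly a function call
      ```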

    • I'm not a Swift user, but I can tell you from C++ experience that this logic doesn't mitigate a complex programming language.

      * If you're in a team (or reading code in a third-party repo) then you need to know whatever features are used in that code, even if they're not in "your" subset of the language.

      * Different codebases using different subsets of the language can feel quite different, which is annoying even if you know all the features used in them.

      * Even if you're writing code entirely on your own, you still end up needing to learn about more language features than you need to for your code in order that you can make an informed decision about what goes in "your" subset.

    • But you have to know all of them to read other people's code.

      To answer your question: I would immediately get rid of guard.

      Also, I think the complexity and interplay of structs, classes, enums, protocols and now actors is staggering.

      3 replies →

    • i would get rid of associatedtype, borrowing, consuming, deinit, extension, fileprivate, init, inout, internal, nonisolated, open, operator, precedencegroup, protocol, rethrows, subscript, typealias, #available, #colorLiteral, #else, #elseif, #endif, #fileLiteral, #if, #imageLiteral, #keyPath, #selector, #sourceLocation, #unavailable, associativity, convenience, didSet, dynamic, indirect, infix, lazy, left, mutating, nonmutating, postfix, precedence, prefix, right, unowned, weak, and willSet

      1 reply →

    • You can take this approach in personal projects; with teams, you need to decide on a subset and then onboard people into your use of the language. This does not work.

      1 reply →

  • I felt that too many smart people were getting involved in the evolution of the language. There should have been a benevolent dictator to say NO.

>"around 2015-17 - Swift could have easily dethroned Python."

NumPy, SciPy, Pandas, and Pytorch are what drove the mass adoption of Python over the last few years. No language feature could touch those libraries. I now know how the C++/Java people felt when JS started taking over. It's a nightmare to watch a joke language (literally; Python being named for Monty Python) become the default simply because of platform limitations.

> Haven't used it since v3 though.

Since 5.10 it's been worth picking back up if you're on macOS.

> Swift could have easily dethroned Python

No way something that compiles as slowly as Swift dethrones Python.

Edit: Plus Swift goes directly against the Zen of Python

> Explicit is better than implicit.

> Namespaces are one honking great idea -- let's do more of those!

coupled with shitty LSP support (even to this day) makes code even harder to understand than when you `import *` in Python.

Edit 2: To expand a little on how shitty the LSP support is, for those who don't work with Swift: any trivial iOS or macOS project that builds fine in Xcode can have a bunch of SourceKit-LSP (the official Swift LSP) errors because it fails to resolve frameworks/libraries. The only sane way I've found to work with Swift in VS Code or its derivatives is to turn off SourceKit diagnostics altogether and keep only swiftc diagnostics. Even with the swift-lsp plugin in Claude Code, there's a routine baseline of SourceKit errors being ignored. So you have symbols without explicit namespaces, and the LSP simply can't resolve lots of them, so no lookup for you. Good luck.

  • >No way something that compiles as slowly as Swift dethrones Python.

    This must have pushed Chris Lattner towards making Mojo both interpreted and compiled at the same time.

  • > Explicit is better than implicit.

    That's funny. To me magic is implicit by definition and Python strikes me as a very magical language compared to something like Java that is way more explicit.

    • Until you start using frameworks like Spring and then everything is so painfully magic that no one knows how the program actually runs.

    • Magical language how? And you should see what reflection-based Java monstrosities do in the background.

  • > Plus Swift goes directly against the Zen of Python

    The Zen of Python is how we got crap like argparse where arguments are placed in the namespace instead of a dict.

Eh, I don't think Swift would ever have dethroned Python. What pain point would it practically solve? I don't use Python often but I don't hear folks complaining about it much.

I do, though, think Swift had/has(?) a chance to dethrone Rust in the non-garbage collected space. Rust is incredibly powerful but sometimes you don't really need that complexity, you just need something that can compile cross-platform and maintain great performance. Before now I've written Rust projects that heavily use Rc<> just so I don't have to spend forever thinking about lifetimes, when I do that I think "I wish I could just use Swift for this" sometimes.

You're right, though, that Swift remains Apple's language and they don't have a lot of interest in non-Apple uses of it (e.g. Swift SDK for Android was only released late last year). They're much happier to bend the language in weird ways to create things like SwiftUI.

  • > just need something that can compile cross-platform and maintain great performance.

    I think Go has already taken that part of the cake.

Dethroned Python? The Apple language, seriously? Where is NumPy for Swift?