Well it is not quite a mutable vs immutable strings war, nor Ruby being late to the party or something like that.
The move is so we can avoid allocating a new string each time we declare and use one, since it will be frozen by default. It is mainly a big optimization for the GC. Before, we had to do this optimization by hand if we intended not to modify the string:
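With `do_stuff_with` as a stand-in for any method that only reads its argument:

```ruby
def do_stuff_with(str)
  str.length # stand-in for any read-only use of the string
end

# before: 1 allocation at each call
def my_method
  do_stuff_with("My String")
end

# before, hand-optimized: 2 allocations, with the unfrozen one
# at init being GC'ed quite early
MY_STRING = "My String".freeze
def my_method_optimized
  do_stuff_with(MY_STRING)
end

# after (frozen string literals): 1 allocation, the first time only
def my_method_frozen
  do_stuff_with("My String")
end
```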
But this move also complicates string manipulation, in the sense that it will lean users toward immutable operations, which tend to allocate a lot of strings.
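For instance, the chained immutable style versus explicit in-place mutation:

```ruby
foo = "immutable ops"

# immutable operations: each step allocates a new string
result = foo.upcase.reverse

# mutating style: one explicit copy, then in-place updates
bar = foo.dup
bar.upcase!
bar.reverse!
```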
So now we have to be deliberate about it:
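The unary `+` (String#+@) returns the string itself if it is already mutable, or an unfrozen copy if it is frozen:

```ruby
my_string = +"My String" # String#+@ returns an unfrozen string
my_string << "!"         # mutation is allowed
```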
We have had frozen string literals for quite a while now, enabled file by file with the `# frozen_string_literal: true` comment. I've seen it recommended by the community and it's the de facto standard in most codebases I've seen. It is generally enforced by code quality tools like RuboCop.
So mutable vs immutable is well known, and since it is part of the language, well, people should know the ins and outs.
I'm just a bit surprised that they devised this long path toward real frozen string literals, because the transition has already been ongoing for years with the `# frozen_string_literal: true` comment. Maybe it's to add proper warnings etc. in a way that does not touch code? I prefer the explicit file-by-file comment. And for dependencies, well, the Ruby version bump that adds frozen string literals by default is quite a filter already.
Well, Ruby is alive and well, and that is what matters :)
> I'm just a bit surprised that they devised this long path
The original plan was to make the breaking change in 3.0, but that plan was canceled because it broke too much code all at once.
Hence why I proposed this multi-step plan to ease the transition.
See the discussion on the tracker if you are curious: https://bugs.ruby-lang.org/issues/20205
It is sorta late to the party. Common Lisp has something similar with regard to how lists are handled. Specifically, it is not uncommon to make a static list like `'(1 2 3)`. Doing this, however, has implications for what operations you can do on the data elsewhere.
I say sorta late to the party, as I think it is more than fair to say there was not much of a party that folks were interested in in the lisp world. :D
It's not a party unless someone talks about Lisp. Or maybe Rust.
Oh, I think I see some nameless person I know over there. Well-met Lisper, but goodbye!
Is it the future path of any successful JIT / dynamically typed / scripting language to realize it needs all the optimizations from compiled / statically typed / lower-level languages?
Would Ruby be as successful if it had had all those complicated features right from the start?
Or do all languages start from a nice, simple clean slate to get developers hooked, until the language is famous enough to become well developed and starts to resemble all the other big programming languages?
I would actually say it's the opposite in this case: it's extremely common in scripting languages for strings to be immutable; mutable strings are usually only available in lower-level languages. I'm very surprised Ruby had this feature at all.
Even in C, string literals are immutable, and mutating them is undefined behavior.
But in the syntax of a scripting language it's very easy to create a new string from the old string, destroy the old string, and replace the variable with the new string, which appears mutable from the developer's point of view.
Common Lisp and Smalltalk have mutable strings, I think. So it’s not too surprising that ruby does too, since it was pretty heavily influenced by these languages.
All strings in python are immutable, as an example.
It’s true that many languages have immutable strings, but for a web-focused scripting language it makes sense to default to mutable strings. I think the comment is about the fact that you now need to choose between mutable and immutable, and that this is framed as a consequence of broader adoption, which is a development I have also seen before.
Mutable strings were part of the Perl heritage, Perl being one of the direct ancestors of Ruby.
> all optimizations from compiled / statically typed / lower level languages
Mutable strings are totally possible (and not even especially hard) in compiled, statically typed, and lower-level languages. They're just not especially performant, and are sometimes a footgun.
> all those complicated features right from the start
Arguably, mutable strings are the more complicated feature. Removing them by default simplifies the language, or at least forces you to go out of your way to find the complexity.
It's not about possible vs. not possible; it's about what the language does by default, and how the language syntax changes to switch between the different variants of strings.
> They're just not especially performant
What? Mutable strings are generally more performant. Sometimes immutability allows you to use higher-level algorithms that provide better performance, but most code doesn't take advantage of that.
Erlang is JITted and dynamic, and all terms are immutable.
A lot of functional programming languages are like this, and that can make them less efficient.
TIL that Ruby has mutable strings, and (until the announced change) even had them mutable by default (and the change only affects literal strings; non-literal strings stay mutable). Python has only ever had immutable strings.
In Ruby you tend to use :symbol for small immutable strings
`<<` is the in-place append operator for strings/arrays, while `+` makes a copy. So `+=` will create a new string and rebind the variable.
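A quick illustration of the difference; aliasing a second variable makes the rebinding visible:

```ruby
s = +"ab"
alias_s = s
s << "c"      # in-place append: alias_s sees the change too

t = +"ab"
alias_t = t
t += "c"      # allocates a new "abc" and rebinds t; alias_t still "ab"
```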
I’ll just kill the comment. It said Symbol isn’t garbage collected. It has been since 2.2 and I wasn’t aware. Sorry.
Good reminder that anyone can go on the internet, just say stuff, and be wrong.
We have mutable default arguments in Python (https://docs.python-guide.org/writing/gotchas/#mutable-defau...), by default too, though.
Not if they are strings, which is what this article is about.
Strings are going to keep being mutable by default. Only strings created by string literals won't be.
Thanks for the clarification! I have adjusted my wording.
Strings will still be mutable by default after the change which only makes string literals always frozen (which has been a file-level opt-in for a while.)
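A quick way to see the distinction (freezing explicitly here, since no magic comment is in effect; frozen literals will behave like the first string):

```ruby
literal_like = "abc".freeze      # what a string literal will behave like
built        = String.new("abc") # runtime-built strings stay mutable

begin
  literal_like << "d"
rescue FrozenError
  # can't modify a frozen String
end

built << "d" # fine
```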
just don't ask about Unicode
Unicode support in Ruby has been great since the beginning.
Cool, good for you! I learned this in 2005.
Cool, good for you!
This looks like a really thoughtful and well-planned transition, with years of opt-in warnings followed by years of opt-out warnings before the actual cutover. I’m constantly impressed with the methodical approach of the Ruby team to constantly improving the language in ways that disrupt their users as little as possible.
We implemented this recently on a Rails project as part of a Rubocop integration. It actually uncovered a lot of subtle bugs, though I will say that the Ruby language lends itself to buggy code. Thankfully we have sophisticated tooling these days that (mostly) mitigates this.
Btw, OCaml also transitioned from read-write Strings to read-only Strings, and Buffer to be that read-write string. It was introduced in 4.02 released September 2014.
I recall it was a bit bumpy, but not all that rough in the end. I suppose static type checking helps here, to find all the ways strings could be used. There was a switch to allow running old code (making strings and buffers interchangeable).
> Btw, OCaml also transitioned from read-write Strings to read-only Strings
Ruby is not doing that; it's transitioning from mutable strings that can be frozen, with no special treatment of literals (unless you opt in to literals being frozen on a per-file basis), to mutable strings with all string literals frozen.
Nitpick: `bytes` is the read-write string, `Buffer` is the StringBuilder-like.
Oops, that's not a nitpick but a good correction :).
How does this work under the hood? Does Ruby keep a giant map of all strings in the application to check new strings against to see if it can dedupe? Does it keep a reference count to each unique string that requires a set lookup to update on each string instance’s deallocation? Set lookups in a giant set can be pretty expensive!
Even if it didn't dedupe strings, mutable string literals mean that it has to create a new string every time it encounters a literal at runtime. If you have a literal string in a method, a new string is created every time you call the method. If you have one inside a loop, a new string is created on every iteration. You get the idea.
With immutable string literals, string literals can be reused.
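A sketch of the reuse, using the literal-plus-`.freeze` form that CRuby has long special-cased to return the same interned object on every call:

```ruby
def fresh
  "hello"          # mutable literal: new object allocated per call
end

def reused
  "hello".freeze   # frozen at compile time: same object every call
end
```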
Here’s a more concrete example:
You make an arrow function that takes an object as input, and calls another with a string and a field from the object, for instance to populate a lookup table. You probably don’t want someone changing map keys out from under you, because you’ll break resize. So copies are being made to ensure this?
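As an aside, Ruby's Hash already guards against exactly this hazard, independently of the literal change: an unfrozen string key is duplicated and frozen on insertion.

```ruby
key = +"name"            # explicitly mutable string
h = {}
h[key] = 1

stored = h.keys.first    # the Hash stored a frozen copy, not `key` itself
key << "!"               # mutating the original cannot corrupt the table
```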
The literals would be identified at parse time.
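The example code appears to have been lost in formatting; based on the description that follows, it was presumably along these lines:

```ruby
# frozen_string_literal: true
fooLit = "foo"            # a literal: frozen at parse time
fooVar = "f" + "o" + "o"  # built at runtime: a distinct, mutable string
```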
This would have fooLit be frozen at parse time. In this situation there would be "foo", "f", and "o" as frozen strings; and fooLit and fooVar would be two different strings since fooVar was created at runtime.
Creating a string that happens to be present in the frozen strings wouldn't create a new one.
Got it, so this could not be extended to non-literal strings
> How does this work under the hood? Does Ruby keep a giant map of all strings in the application to check new strings against to see if it can dedupe?
1. Strings have a flag (FL_FREEZE) that is set when the string is frozen. This flag is checked whenever a string would be mutated, to prevent it.
2. There is an interned string table for frozen strings.
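You can poke at the interned table from Ruby itself via `String#-@`, which (since Ruby 2.5) returns the deduplicated frozen string:

```ruby
a = -("ab" + "c")  # runtime-built string, then interned
b = -("a" + "bc")  # different expression, same contents
# both resolve to the same entry in the interned (fstring) table
```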
> Does it keep a reference count to each unique string that requires a set lookup to update on each string instance’s deallocation?
This I am less sure about. I poked around in the implementation for a bit, but I am not sure of the answer. It appears to me that it just deletes it, but that cannot be right; I suspect I'm missing something. I only dig around in Ruby internals once or twice a year :)
There's no need for ref counting, since Ruby has a mark & sweep GC.
The interned string table uses weak references. Any string added to the interned string table has the `FL_FSTR` flag set on it, and when a string is freed, if it has that flag the GC knows to remove it from the interned string table.
The keyword to know to search for this in the VM is `fstring`, that's what interned strings are called internally:
- https://github.com/ruby/ruby/blob/b146eae3b5e9154d3fb692e8fe...
- https://github.com/ruby/ruby/blob/b146eae3b5e9154d3fb692e8fe...
The way it works in Python is that string literals are stored in a constant slot of their parent object, so at runtime the VM just returns the value at that index.
Though since Ruby already has symbols which act as immutable interned strings, frozen literals might just piggyback on that, with frozen strings being symbols under the hood.
The phrase “Frozen String Literals” is kind of weird to me. When I assign a string literal to a variable, I do not think of the variable itself as a “string literal.” That phrase is for the literal characters in between quotes in the code, which by definition are already “frozen.” They’re a static part of the code itself. This change makes it so that you cannot mutate a variable which was initialized using a string literal. (if I understand correctly!)
That's not quite how Ruby and similar languages like Python or JS work.
Variables don't "contain" a string, they just point to objects on the heap.
So:
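The snippet here appears to have been lost; it was presumably something like the following (using explicit `.freeze`, which CRuby interns the same way it will intern frozen literals):

```ruby
a = "hello".freeze
b = "hello".freeze
# a and b reference the same frozen object on the heap
```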
Here both variables are essentially pointers to a pre-existing object on the heap, and that object is immutable.
Yeah it’s just the naming is weird. The string literal is not the object on the heap, it’s part of the program’s code itself, which was (assumedly) never mutable to begin with.
In ruby, "frozen" is a property of some values that makes them immutable. I mean, other than the part where you can mutably unfreeze objects of many classes. (At least you can't unfreeze strings.) This change makes string values that come from literals initially frozen. It has nothing to do with variable bindings.
The amount of misconceptions in this thread about mutable strings...
Like what ?
I often do something like
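The code was lost in formatting; judging from the description that follows, it was presumably shaped like this (the method name, placeholder, and substitution are my reconstruction):

```ruby
def my_method
  foo = "Hello, SUB_ME"       # mutable literal: allocated on every call
  foo.sub!("SUB_ME", "world") # mutate it in place
  foo
end
```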
which, without `# frozen_string_literal: true`, I believe allocates a string when the application loads (it sounds like it might be 2) and another string at runtime, and then mutates that.
That seems like it's better than doing
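The alternative being compared against was presumably shaped like this, reconstructed from the `FOO` / copy description below (names are guesses):

```ruby
FOO = "Hello, SUB_ME".freeze

def my_method
  foo = FOO.dup               # copy the frozen constant at runtime
  foo.sub!("SUB_ME", "world") # mutate the copy
  foo
end
```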
because that will allocate the frozen string to `FOO` when the application loads, then make a copy of it to `foo` at runtime, then mutate that copy. That means two strings that never leave memory (FOO, SUB_ME) and one that has to be GCed (return value) instead of just one that never leaves memory (SUB_ME) and one that has to be GCed (foo/return value).
This is true in particular when FOO is only used in `my_method`. If it's also used in `my_other_method` and it logically makes sense for both methods to use the same base string, then it's beneficial to use the wider-scope constant.
(The reason this seems reasonable in an application is that the method defines the string, mutates it, and sends it along, which primarily works because I work on a small team. Ostensibly it should send a frozen string, though I rarely do that in practice because my rule is don't mutate a string outside the context in which it was defined, and that seems sensible enough.)
Am I mistaken and/or is there another, perhaps more common pattern that I'm not thinking about that makes this desirable? Presumably I can just add # frozen_string_literal: false to my files if I want so this isn't a complaint. I'm just curious to know the reasoning since it is not obvious to me.
Many comments compare this to the Python 2-to-3 move. The problem with Python was that 2 to 3 offered little to no incentive.
So I sometimes wonder why JIT isn't used as motivation to move / remove features. Basically, if you want the JIT to work, your code has to be X-ready, or free of feature X. So if you still want those performance improvements, you have to move forward.
Just a bit under 15 years after we did this at Twitter.
I would assume if Shopify and Github are on board then Rails is pretty well tested.
btw, byroot in this thread is a Ruby core committer and a Rails core committer as well
Has anyone actually benchmarked the use of frozen string literals? I feel like this is one of those micro-optimizations that everyone does while probably achieving only a vanishingly small performance improvement, all while making the codebase less readable. On net, a negative.
Yes: https://bugs.ruby-lang.org/issues/20205#note-34
Ah excellent. Seems like a modest performance improvement.
This should hopefully go over more easily than the keyword arguments transition.
I wonder what the basis for the description of the 3.7 / 4 Ruby releases is. I haven't seen this transition plan with version numbers described outside of this blog post.
It's used as an example here: https://bugs.ruby-lang.org/issues/20205
But it's not actually stated to be the plan. I'd bet whatever LLM wrote the article took it as a stronger statement than it is.
Hey there. I wrote the article. While I know the version numbers aren’t concrete, I added the proposal anyways as a way for readers to visualise what the maintainers had in mind. Since we’re only at 3.4 with 3.5 in preview, it can’t be claimed concretely what the future holds. I just didn’t make that super obvious in the post.
I had to explain the same reasoning on Reddit the other day. Perhaps it's time to take this as feedback and update the blog.
Btw I just asked gpt to write an article on the same topic, with a reference to the Ruby issues page. And it DID NOT add the future proposal part. So LLMs are definitely smarter than me.
How's the html_safe trick going to work if strings are immutable?
First, only literal strings are concerned. Second, Rails's `String#html_safe` method's return value is a new instance, it doesn't mutate the receiver in place.
We learned nothing from Python 2->3
An obviously good change, with actual massive performance improvements, and not hard to implement; but it's still going to be such a headache and dependency hell.
Ruby has been extremely slow and deliberate in rolling out frozen string literals. They added a magic comment to opt in to them on a per-file basis all the way back in Ruby 2.3—almost a decade ago.
https://www.ruby-lang.org/en/news/2015/12/25/ruby-2-3-0-rele...
Most linting setups I've seen since then have required this line. I don’t expect many libraries to run afoul of this, and this warning setting will make finding them easy and safe. This will be nothing like the headache Python users faced transitioning to 3.
I hope this is correct. I do agree it has been a long and slow migration path, and migrating is fairly easy; migrating Python 2 to 3 code was fairly easy as well, and anyone could do it in their own codebase. Still, it remains a big deal, and potentially very impactful, to make such breaking changes to the behavior of primitives in mature ecosystems. How many gems does the average Rails app have? OK, they all need to be updated, and they should be being updated for other reasons anyway. I remain skeptical of how smooth the change is going to be overall, ecosystem-wise, but time will tell.
I agree it has been a well-advertised and loudly announced migration path, with a clear timeframe.
Most of the problems in Python 3 were the string encoding changes. It was very pervasive, fixing things was often not so straight-forward, and writing programs that worked well with Python 2 and 3 was possible but somewhat difficult and error-prone.
The rest of the changes were a bit annoying but mostly boring; some things could have been done better here too, but the string encoding thing was the main issue that caused people to hold on to Python 2 for a long time.
The frozen string literal changes are nothing like that. It has been considered good practice for years, fixing things when errors occur is trivial, there is a long migration path, and AFAIK there are no plans to remove "frozen_string_literal: false". It's just a change of the default from false to true, not a change in features.
"Learning lessons" doesn't mean "never do anything like this ever again". You're the one who failed to learn from Python 3, by simply saying "language change bad" without deeper understanding of what went wrong with Python 3, and how to do things better. Other languages like Go also make incompatible changes to the language, but do so in a way that learned the lessons from Python 3 (which is why you're not seeing people complain about it).
??? This is nothing like the Python transition. In Python there were two incompatible language versions side by side for years that made it really hard on library maintainers. Ruby is giving a 7-8 year transition period before this even hits, with years of warnings built-in to the plan. What more would you have them do?
Not to mention that in addition to the opt-in warning that came with 3.4, if you've been using any reasonable linter such as Rubocop for the past 10ish years then you're already being yelled at for lack of `# frozen_string_literal: true` magic comment.
This change doesn’t really compare to Py 2 vs 3
Even if you have an incompatible codebase that you don't wish to convert, you'll be able to set `RUBYOPT="--disable-frozen-string-literal"` so it keeps running.
And since that flag really doesn't require lots of work in the VM, it's likely to be kept around pretty much forever.
great post thanks