'10 / 3 = 3' is either bad notation or wrong. It's not true under any usual definition of 10, 3, / or =. '3 = 3.0' on the other hand is perfectly reasonable in many circumstances. If you think 10/3 can equal 3 but not 3.0, you are either confused or confusing or both. What you mean to write is '≈', and when you do that, it's obvious that 3 and 3.0 are both usable in that sentence.
It is perfectly reasonable to define 3: ℕ = succ(succ(succ(zero))). It's also perfectly reasonable to define 3: ℝ as the image of succ(succ(succ(zero))): ℕ under the canonical embedding. Or you can define 3: ℚ with the obvious element. You can also define 3.0: ℚ or 3.0: ℝ as the obvious elements. If you were really a deviant, I suppose you could even define 3.0: ℕ, and people would roll their eyes, but everyone would understand you. Obviously, there are reasonable ways to define things so that `3 = 3.0` is a meaningful sentence (typechecks) and also literally true.
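The point that `3 = 3.0` can typecheck and be literally true can be sketched in Lean (a Mathlib-flavored sketch; the exact tactic names are an assumption, but the idea is just that numeric literals elaborate at whatever type the context demands):

```lean
-- `3` and `3.0` are both notations that elaborate at the expected type,
-- so at ℚ or ℝ the equation typechecks and is provable.
import Mathlib.Tactic

example : (3 : ℚ) = 3.0 := by norm_num
example : (3 : ℝ) = 3.0 := by norm_num

-- The ℕ case goes through the canonical embedding (a coercion):
example : ((3 : ℕ) : ℝ) = 3.0 := by norm_num
```

At ℕ alone, of course, `3.0` has no standard meaning, which is the "deviant" case above.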
Again, different conventions are used in different contexts. The "user" of mathematics should pick the conventions and notations that make sense for what they're doing to communicate what they're trying to say. That itself is an important lesson. The sigfig convention you learned in middle school isn't the word of God.
Not being aware of these things enough to be capable of musing about them is, I suppose, another issue with our education system.
If I ask someone for 3 of something and they give me 3.001 of it, it's whatever. If I ask someone for 3.000 of something and they give me 3.001 of it, it's out of spec.
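That "implied tolerance" reading can be made concrete. A minimal sketch (assuming the common convention that trailing decimal places imply a tolerance of half a unit in the last place; `within_spec` is a hypothetical helper, not any standard API):

```python
def within_spec(value: float, nominal: float, tol: float) -> bool:
    """Check whether a delivered value falls inside nominal ± tol."""
    return abs(value - nominal) <= tol

# "3" read loosely: any generous tolerance passes 3.001.
loose = within_spec(3.001, 3.0, 0.5)       # in spec

# "3.000" read as implying ±0.0005 (half a unit in the last place):
tight = within_spec(3.001, 3.000, 0.0005)  # out of spec
```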
Admittedly I only did this in school and it's been over 10 years, but I recall that when doing engineering drawings, we'd specify ± (or separate lower/upper tolerances in some situations). I don't believe I used decimal places to indicate uncertainty after high school. Does any actual professional use decimal places rather than an explicit ±?
Similarly, we calculated those ± values using the chain rule/uncertainty propagation, not with the simple decimal place rules you learn as a kid. I assume no one serious uses the child rules when CAD software can just as easily use the real ones.
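The "real" rule being referred to is first-order uncertainty propagation: for independent inputs, the uncertainty in f is the root-sum-square of each partial derivative times that input's uncertainty. A minimal sketch (the rectangle-area example and its ± values are made up for illustration):

```python
import math

def propagate(partials_and_errs):
    """First-order propagation for independent inputs:
    delta_f = sqrt(sum((df/dx_i * delta_x_i)^2))."""
    return math.sqrt(sum((p * e) ** 2 for p, e in partials_and_errs))

# Example: area A = w * h with w = 10.0 ± 0.1 and h = 3.0 ± 0.05.
w, dw = 10.0, 0.1
h, dh = 3.0, 0.05
# Partials: dA/dw = h, dA/dh = w
dA = propagate([(h, dw), (w, dh)])
print(f"A = {w * h} ± {dA:.3f}")
```

Note this gives ± ≈ 0.583, a genuinely different answer from what counting decimal places would suggest; the sigfig rules are a crude approximation of this calculation.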
> 10 / 3 can = 3, depending on the expected levels of precision.
>> 10 / 3 will never = 3.0.
You should read what they wrote again. They wrote with `≈`, which is a different operator from `=`.
What they wrote is correct.
> It is sad you still haven't learned this lesson after many decades.
>> I hope I'm never on a bridge you build or plane built to the specs you write if you truly think 10 / 3 = 3.00000.
HN doesn't allow this sort of behavior.