Comment by hashmash

3 days ago

Sometimes I'd like to have unsigned types too, but supporting them would actually make things more complicated overall. The main problem is the interaction between signed and unsigned types. If you call a method which returns an unsigned int, how do you safely pass it to a method which accepts a signed int? Or vice versa?

Having more type conversion headaches is a worse problem than having to use `& 0xff` masks when doing less-common, low-level operations.
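
For example, the usual trick for reading Java's signed `byte` as an unsigned value looks roughly like this:

```java
public class UnsignedByteDemo {
    public static void main(String[] args) {
        byte raw = (byte) 0xF0;    // Java bytes are always signed, so this stores -16
        int unsigned = raw & 0xFF; // masking after promotion to int discards the sign extension: 240
        System.out.println(raw + " -> " + unsigned);
    }
}
```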

> If you call a method which returns an unsigned int, how do you safely pass it to a method which accepts a signed int?

The same way you pass a 64-bit integer to a function that expects a 32-bit integer: with a conversion function that raises an error if the value is out of range.
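
Java already ships such a function for the 64-to-32-bit case, `Math.toIntExact`; presumably an unsigned/signed conversion would take the same shape:

```java
public class CheckedNarrowing {
    public static void main(String[] args) {
        long fits = 42L;
        long tooBig = 5_000_000_000L;

        // Math.toIntExact narrows with a range check: it returns the int value when it
        // fits and throws ArithmeticException instead of silently truncating like (int) would.
        System.out.println(Math.toIntExact(fits));    // 42
        System.out.println(Math.toIntExact(tooBig));  // throws ArithmeticException
    }
}
```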

  • This adds an extra level of friction that doesn't exist when the set of primitive types is small and simple. When everyone agrees on what an int is, values can be passed around freely without special conversions or error handling.

    When a long has to be passed where an int is expected, the usual pattern is to overload the necessary methods to accept longs. Following the same pattern for uint/int conversions, the safe option is to work with longs, since widening eliminates the possibility of any conversion errors (see the sketch right after this reply).

    Now if we're talking about signed and unsigned 64-bit values, there's no 128-bit type to upgrade to. Personally, I've never had this issue, considering that 63 bits of integer precision is massive. Unsigned longs don't seem that critical.
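
To be concrete about the widen-to-long pattern: Java 8's `Integer.toUnsignedLong` does exactly that reinterpretation, and for 64-bit values the `Long.*Unsigned` static helpers are what you're left with once there's no wider type to reach for (a rough sketch, not tied to any particular API proposal):

```java
public class WidenToLong {
    public static void main(String[] args) {
        int bits = -1;  // the 32-bit pattern 0xFFFFFFFF

        // Reinterpreting the same 32 bits as an unsigned value held in a long
        // always succeeds, so no range check or conversion error is needed.
        long widened = Integer.toUnsignedLong(bits);
        System.out.println(widened);  // 4294967295

        // For 64-bit values there is no 128-bit primitive to pull the same trick,
        // so the fallback is static helpers that treat the bits as unsigned.
        System.out.println(Long.toUnsignedString(-1L));       // 18446744073709551615
        System.out.println(Long.divideUnsigned(-1L, 1000L));  // 18446744073709551
    }
}
```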

Like I said, I understand why they don’t.

I think the only answer would be that you can’t interact directly with signed stuff: `new uint(42)` or `ulong.valueOf(795364)` or `myUValue.tryToInt()` or something.
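
Roughly the kind of thing I mean (a sketch only, with made-up names; nothing like this exists in the JDK):

```java
// Hypothetical boxed unsigned 32-bit type; the names here are invented for illustration.
public final class UInt {
    private final int bits;  // raw 32 bits stored in a signed container

    private UInt(int bits) { this.bits = bits; }

    public static UInt valueOf(long value) {
        if (value < 0 || value > 0xFFFF_FFFFL) {
            throw new ArithmeticException("out of uint range: " + value);
        }
        return new UInt((int) value);
    }

    // The "can't interact directly with signed stuff" part: every crossing of the
    // signed/unsigned boundary is an explicit, possibly failing call.
    public int tryToInt() {
        if (bits < 0) {
            throw new ArithmeticException("does not fit in a signed int");
        }
        return bits;
    }

    public long toLong() {
        return Integer.toUnsignedLong(bits);
    }

    @Override
    public String toString() {
        return Integer.toUnsignedString(bits);
    }
}
```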

Of course, if you’re gonna have that much friction, it becomes questionable how useful the whole thing is.

It’s just my personal pain point. Like I said, I haven’t had to do it much, but when I have, it’s about the most frustrating thing I’ve ever done in Java.