Right now you can do:
```crystal
typeof(1 + 2_i64) # => Int32
typeof(1_i64 + 2) # => Int64
typeof(1 + 2_i8)  # => Int32
typeof(1_i8 + 2)  # => Int8
```
The result is the type of the left-hand side.
This is maybe convenient. However, when you want to ensure that all your operations are done in a given type, for example Int64, it's a problem: some intermediate results might end up being Int32 and only then be added to an Int64, so overflow might happen in between. See for example this stack overflow question.
The problem is particularly bad because it's hard to see where the types were misused.
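To make the pitfall concrete, here is a minimal sketch (the variable names and values are illustrative, not from the linked question):

```crystal
# `count` and `size` are Int32, so `count * size` is computed in Int32
# and can overflow before it is ever widened to Int64.
count = 100_000
size  = 50_000

total = 0_i64
total += count.to_i64 * size # safe: widen first, then multiply
# total += count * size      # risky: Int32 * Int32 may overflow here
puts total # => 5000000000
```

The fix is easy once you see it, but nothing in the source flags the risky line, which is exactly the point of this issue.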
The idea is to just allow operations between the same types. So for example:
```crystal
int32 = 1
int64 = 1_i64
another_int32 = 2
another_int64 = 2_i64

int32 + another_int32 # okay
int32 + int64         # compile error
int64 + int32         # compile error
int64 + another_int64 # okay

# But!
int32 + 2  # okay
int64 + 2  # okay too! Because of autocasting
int64 += 1 # okay too! Because of autocasting
```
It might become a bit less convenient when you mix two different types. However, it's still convenient if you want to add one, or an integer literal in general, because of autocasting.
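For the cases where you really do mix two differently typed variables, the escape hatch would be an explicit conversion with Crystal's existing methods like `to_i64`. A small sketch of what that might look like under the proposed rule:

```crystal
int32 = 1
int64 = 1_i64

# Mixed-type operands would need an explicit cast:
sum = int64 + int32.to_i64 # widen the Int32 first, then add
puts sum # => 2
```

This keeps the conversion visible at the call site, which is the whole point of disallowing the implicit mix.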
I believe this will improve the language a lot. Right now, mixing integer types and float types is a bit messy.
I think Crystal is not ready for this if Int32 is the default. Too many places that could benefit from Int64 but if it's such a pain to use people would be like "ehhh it'll be fine as Int32".
Also it would be like the biggest breaking change ever.
How about just disallowing operations that don't fit into each other? So good ones would be, for example, `UInt32#+(Int32)`, `Int64#+(Int32)`, `Int64#+(UInt32)`, etc., but disallowing things like `Int32#+(Int64)`, `Int64#-(UInt64)`, `Int32#-(UInt32)`, and so on.
Well, more as a compromise solution to keep some developer ergonomics intact. If I could pick, I think I would rather disallow automatic unions on local variable assignments between number types (or even in general with anything but `Nil`), but that's probably a lot trickier to implement. I think the status quo is also fine; it's something you need to learn, and every statically typed language I've seen has its quirks in how it deals with differently sized integers, all subtly different.
I agree with both @oprypin and @asterite: as long as we implement target-specific `Int` and `UInt` as the default integer types and we use & recommend those _everywhere_ instead of specific types, then we can think about disabling mixed-type operations. We usually only need specific types to implement algorithms or protocols, that is, places where you must be careful about them, so being strict becomes a convenience, not a burden.
I could see maybe always promoting to the larger type of the two? Or is that too confusing?
> I could see maybe always promoting to the larger type of the two? Or is that too confusing?
It's the same problem I describe, but worse, because right now if you do:

```crystal
a += b
```

which expands to:

```crystal
a = a + b
```

because the resulting type is `typeof(a)`, `a` will remain with the same type.
With what you propose, the type of `a` can suddenly become a union type, which is even worse than the current behavior.
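Concretely, the reason `+=` stays safe today is that the result type matches the receiver. A minimal sketch of both worlds (the union outcome is hypothetical, not current behavior):

```crystal
a = 1     # a : Int32
b = 2_i64

# Today: a = a + b keeps typeof(a), per the behavior described at the
# top of this issue, so `a` stays a plain Int32.
a += b
puts a # => 3

# Under "promote to the larger type", a = a + b would yield an Int64,
# and since `a` was first assigned an Int32, its type would widen to the
# union Int32 | Int64 -- every later use of `a` would have to handle both.
```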
I agree that this should follow after #8111. With the current numeric types, I feel it would be too restrictive, or rather confusing.
OK, that makes sense. Yeah, where precision might be lost I'd say either warn or fail (that shouldn't be too common, so not too painful of a change?). @oprypin also, sometimes people think they are using Int64 but don't realize they accidentally aren't, as I see it... :)
I tried changing this and there are a lot of changes to do in the stdlib already. So this is probably very difficult/tedious to change and might not be worth it... or maybe yes...? If someone wants to try it, go ahead.
FWIW, in Java it just takes the "larger" of the two (`int + long => long`), which kind of makes sense... but it would still be implicit :|