# javascript
p
i'm running into an issue with `UInt` in JS. When I expose this value to JavaScript, it seems to be interpreted as a signed int instead. is there a workaround?
a
Could you please provide the actual behavior and the expected behavior?
p
sure so I have a field: `val unsignedInt: UInt = 3_000_000_000u`
this value is over the max for a signed 32-bit int, but a perfectly valid unsigned 32-bit int
if i now consume this from javascript and compare it to a javascript-declared `number` value of 3,000,000,000, the comparison fails because `unsignedInt` is actually represented by a `number`, but that `number` is signed, so the value is `-1294967296`
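(For what it's worth, that signed reinterpretation can be reproduced in plain JS: bitwise operators like `| 0` apply the spec's ToInt32 coercion, which takes the value modulo 2³² and reads it as a signed 32-bit int. A minimal sketch:)

```javascript
// JS bitwise operators coerce operands with ToInt32:
// value mod 2^32, reinterpreted as a signed 32-bit integer.
// 3_000_000_000 >= 2^31, so it wraps negative.
const asSigned = 3000000000 | 0;
console.log(asSigned); // -1294967296
```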
c
JavaScript doesn't have unsigned integers, so there's no way to make this work :/ The maximum safe integer that can be expressed in JS is ~2⁵³ but UInt goes up to ~2⁶⁴
j
That's a ULong. UInt is 2^32
c
Ah yes, sorry
So, feasible actually. I don't think the stdlib provides a way to do this conversion directly
j
Yeah. The compiler probably lowers it with the same logic as on the JVM, i.e. to a signed 32-bit value
c
However, the compiler doesn't insert runtime checks at all. If you accept an `Int` from a function, JS can pass an out-of-bounds number. So I guess you could implement the conversion in JS and it would be fine.
p
yeah i think the workaround i'm going with is to expose as a string and have each platform parse.
actually found a sort of wild JS workaround: `kotlinUInt >>> 0` fixes this issue. source: https://stackoverflow.com/a/11385688 but it kind of leads to more confusion about why this happens? the fact that this actually works seems to imply that the last (LSB) 32 bits of the JS number are correct. but according to JS docs, the sign bit should be the first bit (MSB)?
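(The "MSB sign bit" picture doesn't quite apply here: a JS `number` is a 64-bit IEEE-754 double, not a raw 32-bit word, so there is no fixed 32-bit sign bit until an operator coerces it. `>>> 0` applies ToUint32, i.e. the numeric value modulo 2³² read as unsigned, and -1294967296 and 3000000000 are congruent mod 2³², so they share the same low 32 bits. A sketch:)

```javascript
// JS numbers are doubles; `>>> 0` coerces via ToUint32 (mod 2^32).
const signed = -1294967296;
const unsigned = signed >>> 0;
console.log(unsigned); // 3000000000

// The two values really do share the same low 32 bits
// (BigInt bitwise ops use two's-complement semantics):
console.log((BigInt(signed) & 0xFFFFFFFFn) === BigInt(unsigned)); // true
```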