# multiplatform
Is this expected behavior? I do a lot of calculations with Float values, so my unit tests expect certain results that depend on Float precision. Running the same tests on JS I get slightly different values. Is this because JS internally uses a double for the calculation? I round to 2 decimal places, but the first issue I hit was a Float whose JVM value ended in .1 while JS gives .125, so rounding to 2 decimal places doesn't help. A minimal sketch of the kind of divergence I mean is below.
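Here is a small, self-contained sketch (the `accumulate` helper and the input values are illustrative, not my real data). My working assumption is that Kotlin/JS backs `Float` with a 64-bit JS `Number`, so intermediate results keep double precision instead of being rounded to 32 bits after every operation like on the JVM:

```kotlin
// Sketch only: hypothetical helper and values, not the actual production code.
fun accumulate(values: List<Float>): Float {
    var sum = 0.0f
    for (v in values) {
        // On the JVM each addition is rounded back to 32-bit Float precision.
        // On Kotlin/JS, Float is represented as a JS Number (a 64-bit double),
        // so the intermediate sum may carry extra precision between steps.
        sum += v
    }
    return sum
}

fun main() {
    val inputs = List(1_000) { 0.1f }   // 0.1 is not exactly representable in binary
    val total = accumulate(inputs)
    // The printed result can differ slightly between the JVM and JS targets,
    // which is what makes a fixed 2-decimal rounding in the tests unreliable.
    println(total)
}
```

Compiling that same file for both targets and comparing the printed sums is how I'd expect to see the platform difference, if the double-backed `Float` assumption is right.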