Numbered Steps to Reproduce Problem:
1. In any code file, write an integer literal greater than 2^31 (e.g. 0xFFFFFFFF, i.e. 2^32-1, or 10000000000).
2. Print the value of this literal and observe that it is reported as 2.14748e+09 (roughly 2^31).
3. Do the same for a literal lower than -2^31 and observe that it is reported as -2.14748e+09.
Code Snippet (if applicable) to Reproduce Problem:
/world/New()
	world.log << 10000000000   // 10^10
	world.log << -10000000000  // -10^10
	world.log << 0x100000000   // 2^32, about 4.3*10^9
	world.log << -0x100000000  // -2^32, about -4.3*10^9
Expected Results: For the code above to print the following (or something similar).
1e+10
-1e+10
4.29497e+09
-4.29497e+09
Actual Results: The code above prints the following:
2.14748e+09
-2.14748e+09
2.14748e+09
-2.14748e+09
Does the problem occur:
Every time? Or how often? Every time
In other games? N/A
In other user accounts? Unknown
On other computers? Unknown
Did the problem NOT occur in any earlier versions? If so, what was the last version that worked? Unknown
Workarounds: Use a floating-point or scientific notation literal instead (e.g. 10000000000.0 or 10000000000e0 instead of 10000000000).
I'm hesitant to change this outright without a better understanding of where EvalInt() may be called. I don't want a situation where a token is understood to be an int, that understanding is relied upon, and then when the token is evaluated it turns out not to be an int after all. So I'll have to be sure everything's kosher before making that change.