I'm trying out Rhino to embed JavaScript in Java. I noticed that when I eval a script that adds two ints together in JavaScript, the result comes back as a Double.
ScriptEngine engine = new ScriptEngineManager().getEngineByName("JavaScript");
engine.put("x", 3);
engine.put("y", 4);
assertEquals(3, engine.eval("x")); // OK
assertEquals(4, engine.eval("y")); // OK
assertEquals(7, engine.eval("x + y")); // FAILS, actual = (Double) 7.0
So why does the x + y expression return a Double instead of an int? Is JavaScript itself doing some type promotion I don't understand?
Fun fact of the day: all numbers in JavaScript (ECMAScript) are double-precision floating point.
The Number type has exactly 18437736874454810627 (that is, 2^64 − 2^53 + 3) values, representing the double-precision 64-bit format IEEE 754 values as specified in the IEEE Standard for Binary Floating-Point Arithmetic, except that the 9007199254740990 (that is, 2^53 − 2) distinct "Not-a-Number" values of the IEEE Standard are represented in ECMAScript as a single special NaN value.
http://people.mozilla.org/~jorendorff/es6-draft.html#sec-8.1.5
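You can confirm the quoted clause through the same engine: JavaScript does not distinguish an integer 7 from 7.0, so both compare equal and both report the single "number" type. A quick sketch (assuming the engine is registered under the name "JavaScript", as in the question):

```java
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

public class SingleNumberType {
    public static void main(String[] args) throws ScriptException {
        ScriptEngine engine = new ScriptEngineManager().getEngineByName("JavaScript");
        // In JavaScript, 7 and 7.0 are the very same Number value.
        System.out.println(engine.eval("7 === 7.0")); // true
        // typeof reports one "number" type for both.
        System.out.println(engine.eval("typeof 7"));   // number
        System.out.println(engine.eval("typeof 7.0")); // number
    }
}
```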
JavaScript has only one numeric type, Number, which is analogous to Java's Double. engine.put("x", 3) stores the Integer as-is, which is why a plain lookup like eval("x") hands it straight back; but evaluating an expression such as x + y forces the engine to convert the operands to Number to perform the arithmetic, so the result comes back as a Double.
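If you need an int back, don't cast the result to Double or Integer directly; go through the java.lang.Number interface, which covers both the Integer you get from a plain lookup and the Double you get from arithmetic. A sketch with a hypothetical helper (evalToInt is not part of any API, and the engine name "JavaScript" is assumed to be available):

```java
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

public class EvalToInt {
    // Convert any numeric eval result to int via the Number interface,
    // handling both Integer (plain lookup) and Double (arithmetic) results.
    static int evalToInt(ScriptEngine engine, String script) throws ScriptException {
        Object result = engine.eval(script);
        return ((Number) result).intValue();
    }

    public static void main(String[] args) throws ScriptException {
        ScriptEngine engine = new ScriptEngineManager().getEngineByName("JavaScript");
        engine.put("x", 3);
        engine.put("y", 4);
        System.out.println(evalToInt(engine, "x"));     // 3
        System.out.println(evalToInt(engine, "x + y")); // 7
    }
}
```

This sidesteps the question of which wrapper class the engine chose and just asks for the integer value.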