If I run
var num = 23;
var n = num.toString();
console.log(n)
it logs 23
as expected, but if I apply toString()
directly to a number literal, like
var n = 15.toString();
console.log(n)
it throws an error:
Uncaught SyntaxError: Invalid or unexpected token.
I noticed it also works fine for decimal values stored in the num variable (like .3, .99, 2.12, 99.1, etc.). Could someone please help me understand the difference and how this function works?
I can't explain why, but if you add a second dot, as in 23..toString(),
it works.
There are also other ways to cast a number to a string.
Yay, JavaScript.
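The original snippet isn't preserved above; a few standard ways to get a string out of a number (a sketch, not necessarily the exact one the answerer meant):

```javascript
var n = 23;
var viaConcat   = n + "";          // coercion via string concatenation
var viaString   = String(n);       // the String() function
var viaTemplate = `${n}`;          // template literal
var viaMethod   = (23).toString(); // parentheses isolate the literal
console.log(viaConcat, viaString, viaTemplate, viaMethod);
```

All four produce the string "23".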
When you store 23 in num, that is the value assigned to it. When you call
23.toString()
the parser thinks it is 23, then a decimal point, then some word "toString", which doesn't make sense. So what you have to do is add another decimal point afterward to let it know that the number is 23.0.
What you get then is
23.(invisibleZeroHere).toString()
AKA 23..toString()

I believe the JavaScript parser simply doesn't allow you to call methods directly on number literals.
However, you can isolate the literal so the parser isn't confused, and it will work.
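The answer's snippet is missing here; wrapping the literal in parentheses is one common form of the workaround (an assumed reconstruction):

```javascript
// Parentheses end the numeric literal, so the dot is a plain member access.
var s = (23).toString();
console.log(s); // "23"

// A double dot works for the same reason: 23. is the literal, .toString() is the call.
console.log(23..toString()); // "23"
```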
EDIT
Thanks, @apsillers, for the explanation. I didn't know that. The first dot on a number is treated as part of the number, hence the problem.
1.1.toString()
works. Interesting.

JavaScript parsing is working as intended. You have declared the following:
23.toString()
We see this as an integer with a function being called on it. The parser doesn't. The parser sees an attempt to declare a floating-point literal, using this grammar:
[(+|-)][digits][.digits][(E|e)[(+|-)]digits]
It assumes that you are declaring a floating-point literal: 23. matches [digits][.digits], so the dot is consumed as the number's decimal point, and the parser then chokes on toString.
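That grammar rule can be observed directly; a minimal sketch (eval is used only so the parse error can be caught at runtime):

```javascript
// The first dot after digits is consumed as the literal's decimal point.
console.log(1.1.toString()); // "1.1" — dot #1 belongs to the number, dot #2 calls the method
console.log(23..toString()); // "23"  — same idea: 23. is the literal, .toString() is the call

// 23.toString() is a parse-time SyntaxError, so eval it to observe the failure:
try {
  eval("23.toString()");
} catch (e) {
  console.log(e instanceof SyntaxError); // true
}
```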
If you really, for all intents and purposes, want to call
23.toString()
, the course of action is to isolate the literal, for example by wrapping it in parentheses, (23).toString(), or by adding a second dot, 23..toString().
That being said, JavaScript is flexible enough to know you want to use 23 as a string in most cases: coercing it in a string context compiles fine, and so does calling toString() on a variable holding 23. Only the bare 23.toString() doesn't.
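The three snippets weren't preserved here; an assumed reconstruction of the three cases:

```javascript
// Fine: string coercion turns 23 into "23".
var a = "value: " + 23;          // "value: 23"

// Fine: the dot follows an identifier, not a numeric literal.
var num = 23;
var b = num.toString();          // "23"

// Not fine: 23. is read as a float literal, so this is a SyntaxError.
// eval is used only to observe the parse error at runtime.
var c;
try {
  eval("var x = 23.toString();");
  c = "parsed";
} catch (e) {
  c = e.name;                    // "SyntaxError"
}
console.log(a, b, c);
```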