I'm curious: I've been programming in JavaScript for a few years now, but sometimes I get confused when I see the following variable declarations (of course, those could be any other numbers as well):
var exampleOne = 0.5;
var exampleTwo = .5;
What is the difference between these two, or is there any? Are there some sort of hidden benefits which I clearly don't understand?
asked Feb 18, 2014 by Mauno Vähä

- Some people find the first version easier to read, but there's no difference to the computer. – Barmar Commented Feb 18, 2014 at 1:52
- Relevant section in the spec: es5.github.io/#x7.8.3 (DecimalLiteral) – Felix Kling Commented Feb 18, 2014 at 1:52
- The difference is one character. – Ja͢ck Commented Feb 18, 2014 at 1:54
2 Answers
To quote the specification:
0.5 matches the rule DecimalLiteral :: DecimalIntegerLiteral . DecimalDigits, which is evaluated as (MV means "mathematical value"):

The MV of DecimalLiteral :: DecimalIntegerLiteral . DecimalDigits is the MV of DecimalIntegerLiteral plus (the MV of DecimalDigits times 10^(−n)), where n is the number of characters in DecimalDigits.
.5 matches the rule DecimalLiteral :: . DecimalDigits, which is evaluated as:

The MV of DecimalLiteral :: . DecimalDigits is the MV of DecimalDigits times 10^(−n), where n is the number of characters in DecimalDigits.
So you can see that the only difference is that the value of the digits preceding the . are added to the final value, and adding 0 to a value doesn't change it.
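A quick sanity check in any JavaScript engine confirms this (the variable names here are just for illustration):

```javascript
// Both literals parse to the same IEEE-754 double.
var exampleOne = 0.5;
var exampleTwo = .5;

console.log(exampleOne === exampleTwo); // true
console.log(exampleOne.toString());     // "0.5"
console.log((.5).toString());           // "0.5" — the engine normalizes the output
```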
There is no difference. The numeric literals are parsed equivalently; that is, 0.5, .5, and .50 all represent the same number. (Unlike most other languages, JavaScript has only one kind of number.)
I prefer to always include the [optional] leading 0 before the decimal.
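For instance, trailing zeros are normalized the same way, and every one of these literals produces a value of the single Number type:

```javascript
// All three literal spellings denote the same value and the same type.
console.log(.50 === 0.5); // true
console.log(typeof 0.5);  // "number"
console.log(typeof .5);   // "number"
```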