It feels like I am missing something obvious here. This has been asked a number of times - and the answer usually boils down to:
var num = 4.5;
num % 1 === 0; // false - 4.5 is a decimal
But, this fails for
var num = 1.0; // or 2.0, 3.0, ...
num % 1 // 0
Unfortunately, these don't work either:
num.toString() // 1
typeof num // "number"
I am writing a JavaScript color parsing library, and I want to process input differently if it is given as 1.0 or 1. In my case, 1.0 really means 100% and 1 means 1. Otherwise, both rgb 1 1 1 and rgb 1 255 255 will be parsed as rgb 255 255 255 (since I am right now taking anything <= 1 to mean a ratio).
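To make the ambiguity concrete, here is a minimal sketch of that ratio heuristic (channelToByte is just an illustrative name, not part of the actual library):
function channelToByte(num) {
  // anything <= 1 is treated as a ratio of 255
  return num <= 1 ? Math.round(num * 255) : num;
}
channelToByte(1);   // 255 - intended as the literal value 1
channelToByte(1.0); // 255 - intended as 100%; the result is identical either way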
6 Answers
Those numbers aren't actually decimals or integers. They're all floats. The only real difference between 1 and 1.0 is the notation that was used to create floats of equal values.
Edit: to help illustrate, consider:
1 === 1.0; // true
parseInt('1') == parseInt('1.0'); // true
parseFloat('1') === parseFloat('1.0'); // true
parseInt('1') === parseFloat('1'); // true
// etc...
Also, to demonstrate that they are really the same underlying data type:
typeof(1); // 'number'
typeof(1.0); // 'number'
Also, note that 'number' isn't ambiguous in JavaScript the way it can be in other languages, because numbers are always floats.
Edit 2: One more addition, since it's relevant. To the best of my knowledge, the only context in JavaScript in which you actually have "real and true" integers that aren't really represented as floats, is when you're doing bitwise operations. However, in this case, the interpreter converts all the floats to integers, performs the operation, and then converts the result back to a float before control is returned. Not totally pertinent to this question, but it helps to have a good understanding of Number handling in JS in general.
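A short illustration of that conversion (these results can be checked in any JavaScript console):
5.7 | 0;             // 5  - fractional part dropped by the 32-bit int conversion
-5.7 | 0;            // -5 - truncates toward zero, unlike Math.floor
Math.pow(2, 31) | 0; // -2147483648 - wraps around at 32 bits
typeof (5.7 | 0);    // "number" - the result is an ordinary number again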
Let your script parse the input as a string; then it's just a matter of checking whether there is a decimal point, like this:
mystring.indexOf('.');
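For example, a minimal completion of that idea (the variable names here are only illustrative):
var token = '1.0';
var isRatio = token.indexOf('.') !== -1; // true for '1.0', false for '1'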
Number.isInteger(4.5)
Number.isInteger() is part of the ES6 standard and is not supported in IE11.
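For reference, a few results (note that this still cannot tell 1.0 and 1 apart, since they are the same value):
Number.isInteger(4.5); // false
Number.isInteger(4);   // true
Number.isInteger(1.0); // true - 1.0 and 1 are the same value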
You'll have to do it when parsing the string. If there's a decimal point in the string, treat it as a percentage; otherwise it's just the integer value.
So, e.g.:
rgb 1 1 1 // same as #010101
rgb 1.0 1 1 // same as #ff0101
Since the rgb is there, you're parsing the string anyway. Just look for . in there as you're doing it.
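A minimal sketch of that approach, assuming a space-separated rgb string (parseChannel is a hypothetical helper, not part of any existing library):
function parseChannel(token) {
  var value = parseFloat(token);
  // a decimal point in the source text marks the value as a ratio
  return token.indexOf('.') !== -1 ? Math.round(value * 255) : value;
}
'rgb 1.0 1 1'.split(' ').slice(1).map(parseChannel); // [255, 1, 1]
'rgb 1 1 1'.split(' ').slice(1).map(parseChannel);   // [1, 1, 1]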
Well, as far as the engine is concerned, there is no difference between 1.0 and 1, and because there is no difference, it is impossible to tell them apart. You should change it from 1.0 to 100 for the percentage thing. That might fix it.
var num = 1; and var num = 1.0; are the same. You mention that you want to treat them differently when given by a user. You will want to detect the difference while the input is still a string and then convert it to the appropriate number.
1.0 === 1. – pimvdb, Sep 14, 2011 at 19:46