<head>
  <script>
    window.setInterval(function () { timer(); }, 100);

    function timer() {
      document.getElementById("timer").innerHTML =
        (parseInt(document.getElementById("timer").innerHTML * 100) + 1) / 100;
    }
  </script>
</head>
<body>
  <div id="timer">0.000</div>
</body>
As you can see, the timer counts only up to 0.29. Why is that?
1 Answer
It's because of the way floating point math works, coupled with your parseInt(). Refer to Is floating point math broken?
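As a quick illustration (an aside added here, not from the original answer): the literal 0.29 itself has no exact binary representation, which you can see by asking for more digits than the default display shows:

console.log((0.29).toPrecision(20)); // "0.28999999999999998002"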
When it reaches 0.29, it computes 0.29 * 100, which you expect to result in 29, but actually:

console.log(0.29 * 100); // 28.999999999999996
Next, you convert it to an integer using parseInt(), which drops all of the decimal places and results in 28. Finally, you add 1 and divide by 100, making the result 0.29 again. Since this repeats on each tick of the timer, the number can never increase.
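To make the stuck cycle concrete, here is each step of one tick evaluated in the console (a sketch; the values are what standard IEEE 754 doubles produce, and the innerHTML string "0.29" is coerced to a number by the multiplication):

console.log("0.29" * 100);           // 28.999999999999996, not 29
console.log(parseInt("0.29" * 100)); // 28: parseInt truncates rather than rounds
console.log((28 + 1) / 100);         // 0.29, exactly where the tick started

Using Math.round() instead of parseInt() would also break the cycle, since Math.round(28.999999999999996) is 29.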
It would be better to store the raw value in a variable and output it with .toFixed(2), instead of using the number in the UI as the source. Like this:
Fiddle
var num = 0.00;

window.setInterval(function () {
    timer();
}, 100);

function timer() {
    // Keep the raw number in a variable; format it only for display.
    num = ((num * 100) + 1) / 100;
    document.getElementById("timer").innerHTML = num.toFixed(2);
}
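A variant worth noting (my own sketch, not part of the original answer): you can avoid repeated floating point arithmetic entirely by counting whole ticks as an integer and dividing only when rendering:

var ticks = 0; // whole hundredths elapsed; an integer, so no rounding error accumulates

window.setInterval(function () {
    ticks += 1;
    document.getElementById("timer").innerHTML = (ticks / 100).toFixed(2);
}, 100);

The single division per display is still floating point, but the error is never fed back into the next tick, so toFixed(2) always shows the intended value.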