
What is the minimal non-zero value Math.random() can return in javascript - Stack Overflow


I know computers can't work with continuums. The Math.random() JavaScript function returns a floating-point number between 0 (inclusive) and 1 (exclusive). I wonder what the minimal non-zero number it can return is. What "step" does this function have?


asked Feb 27, 2015 at 16:56 by Maksim Medvedev
  • 1 I believe that in most implementations, the granularity is 2^53. So the answer would be 2^-53. – Phylogenesis Commented Feb 27, 2015 at 17:03
  • 1 The spec says using an implementation-dependent algorithm or strategy, I'd guess the smallest result is what 1 / Number.MAX_VALUE gives you (assuming non-zero result, e.g. 5.562684646268003e-309) – Paul S. Commented Feb 27, 2015 at 17:03
  • The floating point value makes it interesting. For a typical C implementation, which returns an integer, RAND_MAX may be as low as 65535. While of course this can lead to lots of small decimals, ultimately the granularity is 1/65536, which is only about four-and-a-bit decimal digits. – Jongware Commented Feb 27, 2015 at 17:34

4 Answers


The standard certainly doesn't specify this value, so it depends on the implementation (and, exaggerating the point a bit, probably even an implementation that always returns 0.42 as the result of Math.random() would still be compliant with the specification).

The smallest positive number that can be represented by a 64-bit normalized floating point number in IEEE 754 format is 2^-1022, i.e. 2.2250738585072014 × 10^-308.
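Both boundary values can be inspected directly in any JavaScript console (Number.MIN_VALUE is the smallest subnormal double, which sits below the smallest normal one):

console.log(Math.pow(2, -1022)); // 2.2250738585072014e-308, smallest normal double
console.log(Number.MIN_VALUE);   // 5e-324, smallest subnormal double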

However the floating point representation uses a varying resolution, depending on the magnitude.

For numbers close to 1 the resolution is 2^-53. Probably (just probably) many implementations pick a random integer n between 0 and 2^53 - 1 and return n / 9007199254740992 (that is, n / 2^53) as the result.
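A sketch of that hypothetical construction (this is not any engine's actual source; the nextUint32 LCG below is only a stand-in for whatever raw bit source a real engine uses):

var seed = 123456789;
function nextUint32() {
  // Classic 32-bit linear congruential generator, used here purely as a placeholder
  seed = (seed * 1664525 + 1013904223) >>> 0;
  return seed;
}

function random53() {
  var high = nextUint32() >>> 11;  // top 21 bits
  var low = nextUint32();          // 32 more bits
  var n = high * 4294967296 + low; // integer in [0, 2^53 - 1]
  return n / 9007199254740992;     // n / 2^53: a multiple of 2^-53 in [0, 1)
}

Every value this returns is an exact multiple of 2^-53, which is exactly the "step" the question asks about.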

It's almost certainly not from just picking any random float.

That would not accurately represent the step of the random function, because it would not even be close to a uniform distribution. Let's say you got that 2^-1022 "step" (the smallest non-zero value that fits in a normalized double), plus 0.25 as a random value. Well, that would be rounded to 0.25, because doubles can't represent that much precision near 0.25. So you'd have a whole swathe of "values" that are all equal to 0.25 due to rounding. This is not even remotely uniform.
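You can verify that rounding behaviour directly:

var tiny = Math.pow(2, -1022);
console.log(0.25 + tiny === 0.25); // true: the tiny addend is lost to rounding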

I would say it's more likely that a float is generated with the exponent fixed at 0 (so the value lies in [1, 2)) and random bits for the mantissa, which would result in randomness of step 2^-52 (the spacing of the 52-bit mantissa) between 1 (included) and 2 (not included), from which you can then just subtract 1. In this case, the step is set by the resolution of the mantissa.
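A sketch of that bit-level trick, assuming a DataView to write the mantissa directly (again a hypothetical illustration, not engine source; rnd32 is a placeholder bit source):

var bits = new DataView(new ArrayBuffer(8));
var s = 987654321;
function rnd32() {
  s = (s * 1664525 + 1013904223) >>> 0; // placeholder LCG, not a real engine's generator
  return s;
}
function mantissaRandom() {
  bits.setUint32(0, 0x3FF00000 | (rnd32() & 0x000FFFFF)); // exponent of 1.0 + 20 random mantissa bits
  bits.setUint32(4, rnd32());                              // remaining 32 mantissa bits
  return bits.getFloat64(0) - 1;                           // [1, 2) minus 1 -> [0, 1), step 2^-52
}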

ECMA provides no guidelines for the precision of randomness:

Returns a Number value with positive sign, greater than or equal to 0 but less than 1, chosen randomly or pseudo randomly with approximately uniform distribution over that range, using an implementation-dependent algorithm or strategy. This function takes no arguments.

Math.random() returns a double between 0 and 1. We can ignore the exponent and the sign, which leaves us with 53 usable bits. This means that the minimum possible step between numbers in a perfect world is 1/2^53 == 0.00000000000000011102230246251565404236316680908203125.
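Since powers of two have terminating decimal expansions, you can print that step exactly:

console.log(Math.pow(2, -53).toFixed(53));
// "0.00000000000000011102230246251565404236316680908203125"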

However, the prevailing JavaScript implementations provide different levels of randomness:

  • V8 (Chrome, Chromium, Node.js): 32 bits (src)
  • JavaScriptCore (Safari): 32 bits (src)
  • SpiderMonkey (Firefox): 53 bits (src)
  • Internet Explorer: 53 bits

While some environments will give you more, the least common denominator in the wild is 32 bits. Even on a server, your Node.js app only gets 32 bits.

The minimum precision you can depend on if you want your application to run correctly everywhere is 1/2^32 == 0.00000000023283064365386962890625.
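If you want to probe this empirically, note that on an engine whose generator carries only 32 bits, every result is an exact multiple of 2^-32. A quick check (results depend on the engine and version):

var allMultiples = true;
for (var i = 0; i < 100000; ++i) {
  if (Math.random() * 4294967296 % 1 !== 0) { allMultiples = false; break; }
}
console.log(allMultiples); // true on 32-bit generators, typically false on 53-bit ones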

I don't know what the spec says, but a quick test in the console of Chrome, Firefox, and IE11 shows that the precision never goes beyond 20 decimal places.

Test it yourself:

for (var i = 0; i < 1000; ++i) console.log(Math.random());

Or to see the smallest number after a large number of iterations:

var smallest = 1;
for (var i = 0; i < 1000000; ++i) smallest = Math.min(smallest, Math.random());
console.log(smallest); // print the smallest value seen
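Bear in mind that the expected minimum of a million uniform draws is only around 10^-6, so this experiment probes the tail of the distribution rather than the generator's step; you would need on the order of 2^53 draws before values near the theoretical minimum step became likely.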