When running
console.log(new Intl.NumberFormat('en-US', {
minimumFractionDigits: 0,
maximumFractionDigits: 0,
maximumSignificantDigits: 3,
minimumSignificantDigits: 1
}).format(10.123456789));
I would expect the output to be 10. Instead, for some reason, it outputs 10.1, which breaks the maximumFractionDigits: 0 constraint. What's going on? Considering this constraint is ignored across browsers, it seems this behavior is according to specification, but I just can't fathom a reason for it.
2 Answers
Check out the more current answer from @DavidMulder or read on if you like history...
From the Intl.NumberFormat parameter descriptions (emphasis added):
The following properties fall into two groups: minimumIntegerDigits, minimumFractionDigits, and maximumFractionDigits in one group, minimumSignificantDigits and maximumSignificantDigits in the other. If at least one property from the second group is defined, then the first group is ignored.
There has to be some override behavior to handle conflicting property settings, but in your example one might reasonably wish the override were not quite so all-or-nothing (since honoring the fraction-digit limit would still fall within the specified range of significant digits). Unfortunately, the spec simply ignores all of the fraction and integer digit limitations whenever the significant-digit properties are set.
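As a quick illustration (my addition, not part of the original answer): dropping the significant-digit properties from the question's snippet lets the fraction-digit group take effect, confirming it is the presence of the second group that disables the first.
console.log(new Intl.NumberFormat('en-US', {
minimumFractionDigits: 0,
maximumFractionDigits: 0
}).format(10.123456789));
// 10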
If anyone comes looking for a way to utilize both types of properties to format a single number, below is a very basic example using the two constructors in succession (beware, this can get messy very quickly with more complex formatting requirements).
const sigDigits = (n, min, max, minf, maxf) => {
// First pass: apply the significant-digit constraints.
// useGrouping: false keeps the intermediate string free of
// grouping separators, which the second pass cannot parse.
let num = new Intl.NumberFormat('en-US', {
minimumSignificantDigits: min,
maximumSignificantDigits: max,
useGrouping: false
}).format(n);
// Second pass: apply the fraction-digit constraints to the result.
num = new Intl.NumberFormat('en-US', {
minimumFractionDigits: minf,
maximumFractionDigits: maxf
}).format(num);
return num;
};
const result = sigDigits(10.123456789, 1, 3, 0, 0);
console.log(result);
// 10
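A caveat worth noting (my note, not the original answer's): format() returns a locale-formatted string, and the second constructor coerces that string back to a number. That is why the sketch above passes useGrouping: false in the first formatter; without it, any input of 1000 or more would pick up a grouping separator and the second pass would yield NaN:
console.log(new Intl.NumberFormat('en-US', {
maximumSignificantDigits: 3
}).format(1234.5));
// "1,230" with default grouping, and Number("1,230") is NaN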
Self-answering this, as roundingPriority has been added since I asked this question.
The fraction digits (minimumFractionDigits/maximumFractionDigits) and significant digits (minimumSignificantDigits/maximumSignificantDigits) are both ways of controlling how many fractional and leading digits should be formatted. If both are used at the same time, it is possible for them to conflict.
These conflicts are resolved using the roundingPriority property. By default, this has a value of "auto", which means that if either minimumSignificantDigits or maximumSignificantDigits is specified, the fractional and integer digit properties will be ignored.
Source: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Intl/NumberFormat/NumberFormat
So, using lessPrecision will respect the maximum constraints:
console.log(new Intl.NumberFormat('en-US', {
minimumFractionDigits: 0,
maximumFractionDigits: 0,
maximumSignificantDigits: 3,
minimumSignificantDigits: 1,
roundingPriority: 'lessPrecision'
}).format(10.123456789));
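Per the MDN page cited above, roundingPriority also accepts 'morePrecision', which resolves the conflict the other way by keeping whichever constraint produces more digits (for this input, the same result as the default 'auto'):
console.log(new Intl.NumberFormat('en-US', {
maximumFractionDigits: 0,
maximumSignificantDigits: 3,
roundingPriority: 'morePrecision'
}).format(10.123456789));
// 10.1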
Browser support
                                     Chrome   Firefox   Safari
options.roundingPriority parameter   106      116       15.4