How to format numbers with significant digits

Control how many digits display and how numbers round by specifying precision in significant digits

Introduction

When formatting numbers for display, you sometimes need to control precision based on how many meaningful digits a number contains, rather than how many digits appear after the decimal point. This approach is called formatting with significant digits.

Significant digits represent the digits in a number that carry meaningful information about its precision. The number 123.45 has five significant digits. The number 0.00123 has three significant digits, because the leading zeros only indicate magnitude, not precision.

This lesson shows you how to format numbers using significant digits in JavaScript. You will learn when this approach is better than controlling decimal places, and how to use the minimumSignificantDigits and maximumSignificantDigits options with the Intl.NumberFormat API.

What are significant digits

Significant digits are the digits in a number that indicate its precision. Understanding which digits are significant requires following specific rules.

All non-zero digits are always significant. In the number 123, all three digits are significant. In 45.67, all four digits are significant.

Leading zeros are never significant. They only indicate the position of the decimal point. In 0.0045, only the 4 and 5 are significant digits. The number has two significant digits, not five.

Trailing zeros after the decimal point are significant. They indicate that the measurement or calculation was precise to that level. The number 1.200 has four significant digits, while 1.2 has only two.

Trailing zeros before the decimal point depend on context. In the number 1200, it is unclear whether the zeros are significant without additional information. Scientific notation or explicit precision indicators resolve this ambiguity.
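
To make these rules concrete, here is a minimal sketch of a counting function. The name countSignificantDigits is invented for this example, and it takes the number as a string because JavaScript numbers do not preserve trailing zeros (1.200 and 1.2 are the same value). Following the ambiguity noted above, it treats trailing zeros in a plain integer as not significant.

function countSignificantDigits(numeric) {
  // Strip the sign and the decimal point, keeping only the digits.
  const digits = numeric.replace("-", "").replace(".", "");
  // Leading zeros are never significant.
  const stripped = digits.replace(/^0+/, "");
  if (!numeric.includes(".")) {
    // Trailing zeros in a plain integer are ambiguous;
    // this sketch treats them as not significant.
    return stripped.replace(/0+$/, "").length;
  }
  // Trailing zeros after the decimal point are significant.
  return stripped.length;
}

console.log(countSignificantDigits("123.45"));
// Output: 5

console.log(countSignificantDigits("0.0045"));
// Output: 2

console.log(countSignificantDigits("1.200"));
// Output: 4

console.log(countSignificantDigits("1200"));
// Output: 2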

Format numbers with maximum significant digits

The maximumSignificantDigits option limits how many significant digits appear in the formatted output. This option is useful when you want to display numbers with consistent precision regardless of their magnitude.

const formatter = new Intl.NumberFormat("en-US", {
  maximumSignificantDigits: 3,
});

console.log(formatter.format(1.2345));
// Output: "1.23"

console.log(formatter.format(12.345));
// Output: "12.3"

console.log(formatter.format(123.45));
// Output: "123"

console.log(formatter.format(1234.5));
// Output: "1,230"

When the number contains more significant digits than the maximum, the formatter rounds the number to the nearest value. By default, Intl.NumberFormat uses the halfExpand rounding mode: when a number falls exactly halfway between two values, it rounds away from zero.
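
You can observe this halfway behavior with 0.125, which is exactly representable in binary floating point, so it is a true halfway case at two significant digits. The roundingMode option, shown here for contrast, is part of the newer Intl.NumberFormat v3 additions and may not be available in older runtimes.

const halfExpand = new Intl.NumberFormat("en-US", {
  maximumSignificantDigits: 2,
});

console.log(halfExpand.format(0.125));
// Output: "0.13"
// Halfway case rounds away from zero by default

const halfEven = new Intl.NumberFormat("en-US", {
  maximumSignificantDigits: 2,
  roundingMode: "halfEven",
});

console.log(halfEven.format(0.125));
// Output: "0.12"
// Halfway case rounds toward the even digit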

The maximumSignificantDigits option accepts values from 1 to 21. The default value when this option is not specified is 21, which effectively means no limit.

const oneDigit = new Intl.NumberFormat("en-US", {
  maximumSignificantDigits: 1,
});

console.log(oneDigit.format(54.33));
// Output: "50"

console.log(oneDigit.format(56.33));
// Output: "60"

This option works with all number types, including integers, decimals, and numbers in different notations.
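
For example, significant digit limits still apply when you combine them with the notation option. This sketch assumes en-US compact abbreviations; the exact strings can vary with the runtime's locale data.

const compact = new Intl.NumberFormat("en-US", {
  notation: "compact",
  maximumSignificantDigits: 3,
});

console.log(compact.format(123456));
// Output: "123K"

const scientific = new Intl.NumberFormat("en-US", {
  notation: "scientific",
  maximumSignificantDigits: 3,
});

console.log(scientific.format(123456));
// Output: "1.23E5"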

Format numbers with minimum significant digits

The minimumSignificantDigits option ensures that at least the specified number of significant digits appear in the formatted output. When the number contains fewer significant digits than the minimum, the formatter adds trailing zeros.

const formatter = new Intl.NumberFormat("en-US", {
  minimumSignificantDigits: 5,
});

console.log(formatter.format(1.23));
// Output: "1.2300"

console.log(formatter.format(123));
// Output: "123.00"

console.log(formatter.format(0.0012));
// Output: "0.0012000"

This option is useful when you need to display numbers with a consistent level of precision, showing that measurements or calculations were performed to a specific accuracy.

The minimumSignificantDigits option accepts values from 1 to 21. The default value is 1, which means numbers display their natural precision without added zeros.

const manyDigits = new Intl.NumberFormat("en-US", {
  minimumSignificantDigits: 10,
});

console.log(manyDigits.format(5));
// Output: "5.000000000"

The formatter pads with trailing zeros after the decimal point to reach the minimum, adding a decimal point first when the number is an integer.

Combine minimum and maximum significant digits

You can specify both minimumSignificantDigits and maximumSignificantDigits together to define a range of acceptable precision. The formatter pads numbers with too few significant digits up to the minimum and rounds numbers with too many down to the maximum.

const formatter = new Intl.NumberFormat("en-US", {
  minimumSignificantDigits: 3,
  maximumSignificantDigits: 5,
});

console.log(formatter.format(1.2));
// Output: "1.20"
// Expanded to meet minimum of 3

console.log(formatter.format(1.234));
// Output: "1.234"
// Within range, displayed as-is

console.log(formatter.format(1.23456789));
// Output: "1.2346"
// Rounded to meet maximum of 5

When combining these options, the minimum must be less than or equal to the maximum. If you specify a minimum greater than the maximum, the formatter throws a RangeError.

try {
  const invalid = new Intl.NumberFormat("en-US", {
    minimumSignificantDigits: 5,
    maximumSignificantDigits: 3,
  });
} catch (error) {
  console.log(error.name);
  // Output: "RangeError"
}

This combination is particularly useful for scientific or financial applications where you want to enforce both a minimum level of precision and prevent excessive digits from cluttering the display.

How significant digits differ from decimal places

Significant digits and decimal places represent two different approaches to controlling number precision. Understanding when to use each approach helps you format numbers appropriately.

Decimal places control how many digits appear after the decimal point, regardless of the number's magnitude. The options minimumFractionDigits and maximumFractionDigits implement this approach.

const decimalPlaces = new Intl.NumberFormat("en-US", {
  minimumFractionDigits: 2,
  maximumFractionDigits: 2,
});

console.log(decimalPlaces.format(1.2));
// Output: "1.20"

console.log(decimalPlaces.format(12.3));
// Output: "12.30"

console.log(decimalPlaces.format(123.4));
// Output: "123.40"

Significant digits control how many meaningful digits appear in the entire number, adapting to the number's magnitude. Numbers with different magnitudes display different numbers of decimal places to maintain consistent precision.

const significantDigits = new Intl.NumberFormat("en-US", {
  minimumSignificantDigits: 3,
  maximumSignificantDigits: 3,
});

console.log(significantDigits.format(1.2));
// Output: "1.20"

console.log(significantDigits.format(12.3));
// Output: "12.3"

console.log(significantDigits.format(123.4));
// Output: "123"

Notice how the significant digits approach shows fewer decimal places as the number's magnitude increases, while the decimal places approach shows the same number of decimal places regardless of magnitude.
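
To see the contrast directly, this sketch runs both configurations over the same values; the variable names decimals and significant are arbitrary.

const decimals = new Intl.NumberFormat("en-US", {
  minimumFractionDigits: 2,
  maximumFractionDigits: 2,
});

const significant = new Intl.NumberFormat("en-US", {
  minimumSignificantDigits: 3,
  maximumSignificantDigits: 3,
});

for (const value of [1.2, 12.3, 123.4]) {
  console.log(`${decimals.format(value)} vs ${significant.format(value)}`);
}
// Output: "1.20 vs 1.20"
// Output: "12.30 vs 12.3"
// Output: "123.40 vs 123"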

Interaction with fraction digit options

When you specify significant digit options, they take precedence over fraction digit options by default. The formatter ignores minimumFractionDigits and maximumFractionDigits when either significant digit option is present.

const formatter = new Intl.NumberFormat("en-US", {
  minimumFractionDigits: 2,
  maximumFractionDigits: 2,
  maximumSignificantDigits: 3,
});

console.log(formatter.format(1234.56));
// Output: "1,230"
// Significant digits option takes precedence
// Fraction digit options are ignored

This behavior is controlled by the roundingPriority option, which determines how the formatter resolves conflicts between different precision settings. The default value is "auto", which gives precedence to significant digits.

You can change this behavior by setting roundingPriority to "morePrecision" or "lessPrecision", but these values serve specialized use cases. For most applications, the default precedence behavior is appropriate.
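
As an illustration, the following sketch pits a fraction digit cap against a significant digit cap. It assumes a runtime that supports roundingPriority, which is part of the newer Intl.NumberFormat v3 additions.

const more = new Intl.NumberFormat("en-US", {
  maximumFractionDigits: 2,
  maximumSignificantDigits: 3,
  roundingPriority: "morePrecision",
});

console.log(more.format(12.3456));
// Output: "12.35"
// The fraction digit cap yields more precision, so it wins

const less = new Intl.NumberFormat("en-US", {
  maximumFractionDigits: 2,
  maximumSignificantDigits: 3,
  roundingPriority: "lessPrecision",
});

console.log(less.format(12.3456));
// Output: "12.3"
// The significant digit cap yields less precision, so it wins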

When to use significant digits instead of decimal places

Choose significant digits when you need consistent precision across numbers with different magnitudes. This approach is common in scientific, engineering, and data visualization contexts.

Use significant digits for scientific measurements and calculations. Lab results, sensor readings, and physical measurements often need to reflect the precision of the measurement instrument. Showing three significant digits consistently communicates precision regardless of whether the measurement is 0.0123, 1.23, or 123.

const measurement = new Intl.NumberFormat("en-US", {
  maximumSignificantDigits: 4,
});

console.log(measurement.format(0.012345));
// Output: "0.01235"

console.log(measurement.format(1.2345));
// Output: "1.235"

console.log(measurement.format(1234.5));
// Output: "1,235"

Use significant digits for dashboard metrics that display numbers with varying magnitudes. When showing statistics like page views, revenue, or user counts, significant digits prevent tiny numbers from displaying excessive precision while keeping large numbers readable.

const metric = new Intl.NumberFormat("en-US", {
  maximumSignificantDigits: 3,
});

console.log(metric.format(1.234));
// Output: "1.23"

console.log(metric.format(123.4));
// Output: "123"

console.log(metric.format(12345));
// Output: "12,300"

Use decimal places for currency and financial amounts where the fractional part represents cents or other fixed currency subdivisions. These amounts need consistent decimal places regardless of magnitude.

const currency = new Intl.NumberFormat("en-US", {
  style: "currency",
  currency: "USD",
  minimumFractionDigits: 2,
  maximumFractionDigits: 2,
});

console.log(currency.format(1.5));
// Output: "$1.50"

console.log(currency.format(123.5));
// Output: "$123.50"

The choice between these approaches depends on whether precision relates to the total number of digits or to a fixed fractional component.