How JavaScript Counts

# JavaScript Numbers

JavaScript has one number type for both integers and decimals. Learn its quirks — Infinity, NaN, and floating-point precision.

## What you'll learn
- Write integer and decimal number literals
- Recognize `NaN` and `Infinity`
- Understand why `0.1 + 0.2 !== 0.3`
JavaScript has one number type for both integers and decimals. There's no separate `int` or `float`.
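One consequence of the single number type, sketched below: integer and decimal literals produce the same kind of value, and `Number.isInteger` asks whether a value happens to be a whole number.

```javascript
// 42 and 42.0 are the same value — both are double-precision floats.
console.log(42 === 42.0);            // true
// Number.isInteger checks whether a value is a whole number,
// not whether it has a separate "int" type (there isn't one).
console.log(Number.isInteger(42));   // true
console.log(Number.isInteger(3.14)); // false
```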
```js
console.log(42);        // integer
console.log(3.14);      // decimal
console.log(-7);        // negative
console.log(1e6);       // 1,000,000 (scientific notation)
console.log(0xff);      // hex — 255
console.log(0b1010);    // binary — 10
console.log(0o17);      // octal — 15
console.log(1_000_000); // numeric separators (ES2021) — same as 1000000
```
## Special Number Values

JavaScript reserves a few special values within the number type.
### Infinity

`Infinity` is what you get when a number grows too big to represent, or when you divide a nonzero number by zero.
```js
console.log(1 / 0);                // Infinity
console.log(-1 / 0);               // -Infinity
console.log(Number.MAX_VALUE * 2); // Infinity
console.log(Infinity + 1);         // Infinity (still)
```
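If a calculation might overflow or divide by zero, `Number.isFinite` makes a handy guard. A quick sketch:

```javascript
// Number.isFinite is false for Infinity, -Infinity, and NaN:
console.log(Number.isFinite(1 / 0)); // false
console.log(Number.isFinite(42));    // true
// Unlike the older global isFinite, it never coerces strings:
console.log(isFinite("42"));         // true (the string is coerced to 42)
console.log(Number.isFinite("42"));  // false
```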
### NaN (Not a Number)

You get `NaN` when a math operation doesn't make sense.
```js
console.log("hello" * 2);   // NaN
console.log(Math.sqrt(-1)); // NaN
console.log(0 / 0);         // NaN

// NaN is the one value not equal to itself:
console.log(NaN === NaN);       // false
console.log(Number.isNaN(NaN)); // true ← use this to test
```
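Why `Number.isNaN` rather than the older global `isNaN`? The global version coerces its argument to a number first, which can mislead. A quick comparison:

```javascript
// Global isNaN coerces first, so any non-numeric string "is NaN":
console.log(isNaN("hello"));        // true ("hello" coerces to NaN)
console.log(isNaN("42"));           // false ("42" coerces to 42)
// Number.isNaN answers the real question: is this value literally NaN?
console.log(Number.isNaN("hello")); // false (it's a string, not NaN)
console.log(Number.isNaN(0 / 0));   // true
```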
## Floating-Point Precision

This catches every JavaScript beginner exactly once.
```js
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false 😱
```

This isn't a JavaScript bug — it's how almost every programming language stores decimal numbers (IEEE 754). Numbers like 0.1 can't be represented exactly in binary, so tiny rounding errors creep in.
For most code, this doesn't matter. For money or anything where exactness counts:

- Multiply to work in cents: store $1.10 as 110.
- Use `Number.EPSILON` for "close enough" comparisons.
- For arbitrarily large integers, use `BigInt` (next-but-one lesson).
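The cents approach in practice. This is a minimal sketch; the price and quantity are made up for illustration:

```javascript
// Integer cents stay exact (integer math is precise up to
// Number.MAX_SAFE_INTEGER), while dollar floats drift:
const priceCents = 110;            // $1.10
const totalCents = priceCents * 3; // 330, exact
console.log((totalCents / 100).toFixed(2)); // "3.30"

// The same total computed in floating-point dollars:
console.log(1.10 * 3);          // 3.3000000000000003
console.log(1.10 * 3 === 3.30); // false
```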
```js
const a = 0.1 + 0.2;
const b = 0.3;
console.log(Math.abs(a - b) < Number.EPSILON); // true
```
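One caveat: `Number.EPSILON` is the gap between 1 and the next representable number, so the bare comparison above only works for values near magnitude 1. A common pattern is to scale the tolerance by the inputs' magnitude (this `approxEqual` helper is illustrative, not a built-in):

```javascript
// Scale the tolerance with the operands, since rounding error
// grows with magnitude:
function approxEqual(a, b, relTol = Number.EPSILON) {
  return Math.abs(a - b) <= relTol * Math.max(Math.abs(a), Math.abs(b), 1);
}

console.log(approxEqual(0.1 + 0.2, 0.3));     // true
console.log(approxEqual(1e15 + 0.125, 1e15)); // true (within rounding at this scale)
console.log(approxEqual(0.1, 0.2));           // false
```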
## Safe Integer Range

Regular number values represent integers exactly up to `Number.MAX_SAFE_INTEGER` (about 9 quadrillion). Beyond that, integers start losing precision.
```js
console.log(Number.MAX_SAFE_INTEGER);     // 9007199254740991
console.log(Number.MAX_SAFE_INTEGER + 1); // 9007199254740992 ✅
console.log(Number.MAX_SAFE_INTEGER + 2); // 9007199254740992 😱 — same!
```

For bigger integers, use `BigInt`.
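A quick preview of `BigInt` (it gets a full lesson later): the `n` suffix creates arbitrary-precision integers, so the additions above stay distinct.

```javascript
// BigInt arithmetic is exact at any size:
console.log(9007199254740991n + 1n); // 9007199254740992n
console.log(9007199254740991n + 2n); // 9007199254740993n (no longer "the same")
console.log(2n ** 64n);              // 18446744073709551616n
```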
## Up Next
Now: the methods for converting strings to numbers, formatting, and checking.
JavaScript Number Methods →