Every developer has stared at a number like 1715174400 and wondered what it means, or chased a bug where a timestamp was off by a factor of exactly 1000. Timestamps are deceptively simple — and deceptively easy to get wrong. Here's what you actually need to know.
What Is the Unix Epoch?
The Unix epoch is the starting point for Unix timestamps: January 1, 1970, 00:00:00 UTC. A Unix timestamp is the number of seconds that have elapsed since that moment.
So 1715174400 means roughly 1.715 billion seconds have passed since January 1970 — which works out to May 8, 2024.
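The conversion is a one-liner in most languages; a quick sketch in JavaScript (the `* 1000` is needed because `Date` works in milliseconds, not seconds):

```javascript
// Convert a Unix timestamp (in seconds) to a human-readable UTC string.
// Date expects milliseconds since the epoch, hence the * 1000.
const iso = new Date(1715174400 * 1000).toISOString();
console.log(iso); // → "2024-05-08T13:20:00.000Z"
```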
Why 1970? Largely historical accident. When Unix was being developed at Bell Labs in the late 1960s and early 1970s, the engineers needed a reference point. 1970 was recent, round, and convenient. No deeper reason. Midnight UTC was chosen as the starting moment to avoid daylight saving time complications.
Seconds vs. Milliseconds: The Classic Bug
This is probably the most common timestamp bug in existence. Unix timestamps are traditionally in seconds. But JavaScript's Date.now() and new Date().getTime() return milliseconds.
// JavaScript
Date.now() // → 1715174400000 (milliseconds)
Math.floor(Date.now() / 1000) // → 1715174400 (seconds, Unix-style)
When you're calling a Python backend from a JavaScript frontend, or parsing a timestamp from a database in a different language, always check what unit you're working with. A value around 1.7 × 10⁹? Seconds. Around 1.7 × 10¹²? Milliseconds. Getting this wrong produces dates that are off by a factor of 1000: a seconds value misread as milliseconds lands in January 1970, and a milliseconds value misread as seconds lands tens of thousands of years in the future — which can cause subtle auth bugs, expiry logic failures, or data corruption.
Some APIs (Go's time.Unix, Java's Instant.ofEpochSecond and Instant.ofEpochMilli) make the unit explicit in the call. Use those.
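The magnitude check above can be captured in a small guard. A sketch (the function name and the 1e11 cutoff are my own; the cutoff works because second-precision values stay below 1e11 until roughly the year 5138, while millisecond-precision values exceed it for any date after early 1973):

```javascript
// Normalize a Unix timestamp of unknown unit to milliseconds.
// Heuristic: values below 1e11 are assumed to be seconds.
function toMillis(ts) {
  return ts < 1e11 ? ts * 1000 : ts;
}

toMillis(1715174400);    // seconds → 1715174400000
toMillis(1715174400000); // already milliseconds → 1715174400000
```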
ISO 8601: The Format That Sorts Correctly
Alongside raw timestamps, you'll often see dates in ISO 8601 format: 2024-05-08T12:00:00Z. This format has a useful property: it sorts lexicographically. Alphabetical order and chronological order are the same.
2024-01-01T00:00:00Z
2024-05-08T12:00:00Z
2024-12-31T23:59:59Z
Compare these as strings and you get the right order. This matters a lot when storing dates in filenames, log entries, or databases where you might sort as text rather than by a proper datetime column.
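A quick sketch of the property in JavaScript, using the default string sort:

```javascript
// ISO 8601 strings sort correctly as plain text.
const events = [
  '2024-12-31T23:59:59Z',
  '2024-01-01T00:00:00Z',
  '2024-05-08T12:00:00Z',
];
events.sort(); // default lexicographic string sort, no date parsing
// → ['2024-01-01T00:00:00Z', '2024-05-08T12:00:00Z', '2024-12-31T23:59:59Z']
```

One caveat: the property only holds when every string uses the same offset and the same precision. Mixing Z with +05:30, or seconds with fractional seconds, breaks it.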
The Z suffix means UTC (it stands for "Zulu time": Z is "Zulu" in the NATO phonetic alphabet, and the Z zone is the zero-offset zone). You might also see offsets like +05:30 for IST. When in doubt, use Z.
ISO 8601 is an international standard (its internet profile is RFC 3339) and is the format generally recommended for data interchange — it's also what JavaScript produces when serializing dates: JSON.stringify calls Date's toISOString under the hood.
Timezone Handling: Always Store UTC
The single most important rule: store timestamps in UTC, display in local time.
This sounds obvious but it's violated constantly. If you store 2024-05-08 09:00:00 without a timezone, you have a problem. Is that New York time? London? Tokyo? You've lost information, and converting it later is guesswork.
// Bad: stores an ambiguous, locale-dependent local time
const localTs = new Date().toLocaleString(); // "5/8/2024, 8:00:00 AM"
// Good: stores unambiguous UTC
const utcTs = new Date().toISOString(); // "2024-05-08T12:00:00.000Z"
When you need to display a time to a user, convert at the last moment using their timezone. JavaScript's Intl.DateTimeFormat handles this well:
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: 'America/New_York',
dateStyle: 'medium',
timeStyle: 'short'
});
formatter.format(new Date('2024-05-08T12:00:00Z'));
// → "May 8, 2024, 8:00 AM"
The timezone offset for America/New_York is not fixed — it changes between -5 and -4 depending on daylight saving time. Which brings us to: never hardcode UTC offsets. Use IANA timezone names (America/New_York, Europe/London) rather than EST or +05:30. IANA names encode the full history of DST changes for that region.
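You can watch the offset change across a DST boundary with Intl itself. A sketch using the 'shortOffset' timezone-name style (ES2021+; the helper name is my own):

```javascript
// Extract the UTC offset that America/New_York has on a given date.
function offsetOn(date) {
  return new Intl.DateTimeFormat('en-US', {
    timeZone: 'America/New_York',
    timeZoneName: 'shortOffset',
    hour: 'numeric',
  })
    .formatToParts(date)
    .find((part) => part.type === 'timeZoneName').value;
}

offsetOn(new Date('2024-01-15T12:00:00Z')); // → "GMT-5" (standard time)
offsetOn(new Date('2024-07-15T12:00:00Z')); // → "GMT-4" (daylight time)
```

Same zone name, two different offsets — which is exactly why the IANA name, not the offset, belongs in your data.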
The Y2K38 Problem
You've heard of Y2K. There's a similar problem coming on January 19, 2038, at 03:14:07 UTC. That's when a signed 32-bit integer counting seconds since 1970 will overflow.
Signed 32-bit integers max out at 2,147,483,647. Add one second and the value wraps to -2,147,483,648, which represents December 13, 1901. Systems still using 32-bit time_t — embedded systems, old databases, legacy C code — will break.
Most modern 64-bit systems are fine. A 64-bit timestamp can represent dates hundreds of billions of years into the future. But if you're working with embedded systems, old Linux kernels, or databases that store timestamps as 32-bit integers, this is a real concern. The fix is to migrate to 64-bit time representations — something the Linux kernel completed in 2020 for 32-bit architectures.
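The wraparound is easy to reproduce, since JavaScript's bitwise operators coerce to signed 32-bit integers — the same representation as a 32-bit time_t. A sketch:

```javascript
const MAX_INT32 = 2147483647; // 2^31 - 1, the last representable second

// `| 0` truncates to a signed 32-bit integer, mimicking 32-bit time_t.
const wrapped = (MAX_INT32 + 1) | 0; // → -2147483648

new Date(MAX_INT32 * 1000).toISOString(); // → "2038-01-19T03:14:07.000Z"
new Date(wrapped * 1000).toISOString();   // → "1901-12-13T20:45:52.000Z"
```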
Leap Seconds
Every few years, the IERS (International Earth Rotation and Reference Systems Service) adds a leap second to UTC to keep atomic clocks aligned with Earth's slightly irregular rotation. This means some UTC minutes have 61 seconds. The most recent leap second was added at the end of 2016, and the practice is scheduled to be discontinued by 2035.
For most application code, you don't need to worry about this. Most systems handle it at the OS or NTP level. But if you're building high-precision timing systems, financial systems, or anything that reasons about durations at sub-second precision, you should know leap seconds exist. The IERS maintains the full list.
Unix timestamps ignore leap seconds entirely — they assume every day has exactly 86,400 seconds, so a leap second never appears in the count (in practice the system clock repeats a second or "smears" it across the surrounding hours). For most practical applications this doesn't matter.
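You can see the 86,400-second assumption directly. December 31, 2016 ended with a leap second, so it was 86,401 SI seconds long, but in Unix/JavaScript time it is exactly 86,400 seconds wide. A sketch:

```javascript
// Dec 31, 2016 contained a leap second (23:59:60 UTC existed),
// but epoch arithmetic reports exactly 86,400 seconds anyway.
const dayStart = Date.parse('2016-12-31T00:00:00Z');
const nextDay = Date.parse('2017-01-01T00:00:00Z');
(nextDay - dayStart) / 1000; // → 86400, not 86401
```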
Datetime Libraries: What to Use
Don't use Moment.js for new code. The Moment.js team themselves recommend against it for new projects. It's mutable, large, and frozen in maintenance mode.
For modern JavaScript:
date-fns — functional, tree-shakeable, works with native Date objects. Good for most use cases.
date-fns-tz — timezone support for date-fns.
Luxon — immutable, timezone-aware, well-documented. From the Moment team themselves.
Temporal API — the future. Now in Stage 3 of the TC39 proposal and shipping in browsers. It fixes almost every design flaw in the Date object: immutable types, explicit timezone handling, separate types for date, time, datetime, instant, and duration.
// Temporal (Stage 3 — available via polyfill today)
const now = Temporal.Now.instant();
const inNewYork = now.toZonedDateTimeISO('America/New_York');
console.log(inNewYork.toString());
// → e.g. "2024-05-08T08:00:00.123-04:00[America/New_York]" (value depends on when it runs)
Temporal is genuinely a big improvement. It's worth learning even before it's universally available, since you can use the polyfill today.
See also: Number Systems Explained for background on how computers represent integers, and UTF-8 and Unicode Explained for another foundational encoding topic.
If you need to convert between Unix timestamps, ISO 8601, and human-readable formats without writing code, Timestamp Converter handles it instantly — paste in any format and it'll detect and convert automatically.
FAQ
Should I store timestamps as integers or ISO strings?
For databases with native datetime types (PostgreSQL TIMESTAMPTZ, MySQL DATETIME), use the native type — you get indexing, range queries, and timezone handling for free. For JSON APIs and event logs, use ISO 8601 strings (2026-05-08T14:30:00Z) — human-readable, sortable, unambiguous. For high-volume binary protocols and message queues, integers (Unix epoch in seconds or milliseconds) are the most compact and the fastest to parse. Don't store local time without a timezone — it's ambiguous and breaks across daylight saving transitions.
Is the Temporal API ready to use in production?
Yes via polyfill — @js-temporal/polyfill is mature, well-tested, and used in production by many projects. Native browser support is arriving (Firefox ships it natively; other engines have it behind flags or in preview builds), with broader rollout expected over the next few years. For new code, write against the Temporal API and use the polyfill; switch to native when available. Temporal is dramatically better than Date and worth the migration cost for any datetime-heavy codebase.
What's the difference between UTC, GMT, and Zulu time?
Practically nothing for most purposes. UTC (Coordinated Universal Time) is the modern atomic-clock-based standard, kept within 0.9 seconds of Earth's rotation by leap second adjustments. GMT (Greenwich Mean Time) is the older standard based on mean solar time at the Greenwich meridian; it's colloquially used as a synonym for UTC but is technically a distinct (rotation-based, not atomic) timescale. Zulu time (Z) is the military/aviation designation for UTC. In code, all three mean "UTC for our purposes." ISO 8601 prefers Z notation.
How do I handle daylight saving time?
Never hardcode DST offsets — use IANA timezone names (America/New_York, Europe/London) and let your library compute the current offset. JavaScript's Intl.DateTimeFormat, Python's zoneinfo, and Java's ZoneId all handle DST transitions correctly using the IANA tzdata. Display times in user timezones; store and compute in UTC. The "local time at point X in the past" problem (especially during DST transition gaps) is genuinely hard — Temporal's ZonedDateTime handles it explicitly.
Why does `new Date('2026-05-08')` produce different results in different timezones?
Because date strings without timezone info are ambiguous. The current ECMAScript spec interprets a date-only string like '2026-05-08' as UTC midnight, but a date-time string without an offset ('2026-05-08T00:00:00') as local time — and older engines differed even on the date-only case. Best practice: always include the time and Z suffix ('2026-05-08T00:00:00Z') for consistent parsing, or use Temporal.PlainDate.from('2026-05-08'), which is unambiguously a calendar date with no timezone.
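A sketch of the difference (only the lines with explicit UTC semantics have a fixed result; the local-time parse varies with the machine's timezone):

```javascript
// Date-only string: the spec says parse as UTC midnight.
new Date('2026-05-08').toISOString();           // → "2026-05-08T00:00:00.000Z"

// Date-time string without an offset: parsed as LOCAL time,
// so the resulting epoch value depends on where the code runs.
const local = new Date('2026-05-08T00:00:00');

// Explicit offset: unambiguous everywhere.
new Date('2026-05-08T00:00:00Z').toISOString(); // → "2026-05-08T00:00:00.000Z"
```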
What's the right way to compare timestamps for "within the last N days"?
Compare epoch values, not date components. Date.now() - record.created_at_ms < 7 * 24 * 60 * 60 * 1000 works for millisecond timestamps. For day-aligned comparisons (calendar days, not 24-hour periods), use Temporal's PlainDate or date-fns differenceInCalendarDays — these handle DST transitions and don't care about the time portion. Don't try to do calendar math with Date.getTime() — DST will break it.
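A sketch of the epoch-based check, with the current time injectable for testability (the function name and shape are my own):

```javascript
const MS_PER_DAY = 24 * 60 * 60 * 1000;

// True if `tsMs` (Unix milliseconds) falls within the last `days` days.
// This is a rolling window of 24-hour periods, deliberately ignoring
// calendar and DST boundaries.
function withinLastDays(tsMs, days, nowMs = Date.now()) {
  const age = nowMs - tsMs;
  return age >= 0 && age < days * MS_PER_DAY;
}

const now = Date.parse('2024-05-08T12:00:00Z');
withinLastDays(Date.parse('2024-05-02T12:00:01Z'), 7, now); // → true  (~6 days old)
withinLastDays(Date.parse('2024-04-30T12:00:00Z'), 7, now); // → false (8 days old)
```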
Will Y2K38 actually cause problems?
For 64-bit systems, no — they have decades of headroom. The risk is in 32-bit embedded systems (industrial controllers, old IoT devices, legacy mainframes), databases that store timestamps as 32-bit signed integers, and certain file formats with 32-bit timestamp fields. The Linux kernel gained 64-bit time_t support on 32-bit architectures in 2020 (kernel 5.6). The remaining risk is in code that hasn't been audited — embedded systems running for decades.
How precise should application timestamps be?
For most application code, milliseconds are enough — they cover everything from user actions to API requests. For distributed systems and databases, microseconds (PostgreSQL default) help with ordering and uniqueness. For high-frequency trading and scientific instruments, nanoseconds are the standard. Be aware: JavaScript's Date is millisecond-precision; performance.now() is microsecond-precision but relative to navigation start, not Unix epoch.