What is a Unix Timestamp?
A Unix timestamp (also called epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC. This reference point is called the Unix epoch.
Unix timestamps are timezone-independent — they represent an absolute point in time, not a local time. This makes them ideal for storing and comparing times across different time zones.
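To illustrate, here is a short Python sketch showing that a single timestamp denotes the same instant in every time zone; the UTC+9 offset is hard-coded here purely for illustration:

```python
from datetime import datetime, timezone, timedelta

ts = 1704067200  # 2024-01-01 00:00:00 UTC

utc = datetime.fromtimestamp(ts, tz=timezone.utc)
utc_plus_9 = datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=9)))

print(utc.isoformat())        # 2024-01-01T00:00:00+00:00
print(utc_plus_9.isoformat()) # 2024-01-01T09:00:00+09:00

# The local renderings differ, but both represent the same instant:
print(utc == utc_plus_9)      # True
```

The equality check passes because timezone-aware datetimes compare by the instant they represent, not by their local wall-clock reading.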
Seconds vs Milliseconds
| | Seconds | Milliseconds |
|---|---|---|
| Example value | 1704067200 | 1704067200000 |
| Digits | 10 | 13 |
| Precision | 1 second | 1/1000 second |
| Used in | Unix/Linux, most APIs | JavaScript, Java, high-precision timing |
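Because the two units differ by a factor of 1000, a common heuristic is to treat 13-digit values as milliseconds and 10-digit values as seconds. A minimal Python sketch (the `to_seconds` helper and its digit-count threshold are illustrative, not a standard API):

```python
def to_seconds(ts: int) -> int:
    """Normalize a timestamp to seconds, guessing the unit by magnitude."""
    # Values of 13+ digits are assumed to be milliseconds.
    # This is a heuristic: it misreads second-based timestamps after
    # the year 33658, which is acceptable for most applications.
    if ts >= 1_000_000_000_000:
        return ts // 1000
    return ts

print(to_seconds(1704067200))     # 1704067200 (already seconds)
print(to_seconds(1704067200000))  # 1704067200 (milliseconds -> seconds)
```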
Converting Timestamps
```javascript
// JavaScript
const now = Date.now();                        // milliseconds
const seconds = Math.floor(Date.now() / 1000); // seconds
const date = new Date(1704067200000);          // from a millisecond timestamp
```
```python
# Python
import time, datetime
now = int(time.time())  # seconds
dt = datetime.datetime.fromtimestamp(1704067200)  # interpreted in local time
```
```php
// PHP
$now = time(); // seconds
$date = date('Y-m-d', 1704067200);
```

The Year 2038 Problem
32-bit signed integers can store values up to 2,147,483,647. Unix timestamps will overflow this limit on January 19, 2038 at 03:14:07 UTC. Modern 64-bit systems are not affected — a 64-bit timestamp won't overflow for approximately 292 billion years.
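The overflow boundary can be checked directly. This Python sketch uses arbitrary-precision integers to simulate what a 32-bit signed counter would do:

```python
import datetime

INT32_MAX = 2**31 - 1  # 2,147,483,647

# The last second a 32-bit signed timestamp can represent:
last = datetime.datetime.fromtimestamp(INT32_MAX, tz=datetime.timezone.utc)
print(last.isoformat())  # 2038-01-19T03:14:07+00:00

# One second later, a 32-bit signed counter wraps around to a large
# negative value, which corresponds to a date in December 1901:
wrapped = (INT32_MAX + 1) - 2**32
print(wrapped)  # -2147483648
```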
✓ Always use 64-bit integers for timestamps in new applications. Most modern languages and databases default to 64-bit, but legacy embedded systems and older databases may still use 32-bit.