Unix Timestamp Converter
Convert between Unix timestamps and human-readable dates. Handles both seconds (10-digit) and milliseconds (13-digit) automatically.
A Unix timestamp is the number of seconds elapsed since January 1, 1970 00:00:00 UTC (the "Unix epoch"). When you see a value like 1712300400 in an API response, a log file, or a database column, that's a Unix timestamp. Programs store time this way because it's timezone-agnostic and trivially sortable, but to a human it's just an opaque number.
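As a quick sketch of what the conversion involves, Python's standard library can turn the example timestamp above into a readable UTC date (the timestamp value is taken from the text; everything else is stock `datetime` usage):

```python
from datetime import datetime, timezone

ts = 1712300400  # the example timestamp from the text (seconds since the epoch)

# Interpret the value as seconds since 1970-01-01 00:00:00 UTC.
dt = datetime.fromtimestamp(ts, tz=timezone.utc)

print(dt.isoformat())  # → 2024-04-05T07:00:00+00:00
```

Passing `tz=timezone.utc` matters: without it, `fromtimestamp` renders the instant in the machine's local timezone, which is a common source of off-by-hours confusion.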
| Format | Length | Common in |
|---|---|---|
| Seconds (Unix) | 10 digits | Python, PHP, MySQL, most backends |
| Milliseconds (JS) | 13 digits | JavaScript Date.now(), Java, Elasticsearch |
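The digit counts in the table suggest a simple way to auto-detect the unit, as the tool described above does. Here is a minimal sketch of that heuristic; `parse_timestamp` is a hypothetical helper name, and the 13-digit threshold is an assumption mirroring the table (it holds for timestamps from September 2001 onward):

```python
from datetime import datetime, timezone

def parse_timestamp(ts: int) -> datetime:
    """Hypothetical helper: treat 13-digit values as milliseconds,
    shorter values as seconds, and return the instant in UTC."""
    if len(str(abs(ts))) >= 13:   # 13+ digits → assume milliseconds
        ts = ts / 1000
    return datetime.fromtimestamp(ts, tz=timezone.utc)

# Both spellings of the same instant decode identically:
print(parse_timestamp(1712300400))     # seconds (10 digits)
print(parse_timestamp(1712300400000))  # milliseconds (13 digits)
```

A digit-count heuristic like this is convenient but not foolproof: a millisecond timestamp from before 2001 has fewer than 13 digits, so production code that knows its data source should state the unit explicitly instead.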