Unix Timestamp Converter

A Unix timestamp is the number of seconds elapsed since January 1, 1970 at 00:00:00 UTC, known as the Unix epoch. It is the standard time representation in databases, APIs, log files, and programming languages. Use this tool to convert a Unix timestamp to a human-readable date, or to convert a date/time string back to its Unix timestamp equivalent, in both UTC and your local timezone.

Current Unix Timestamp (live)

Unix Timestamp → Human Date

Tip: Press Ctrl+Enter to convert

Human Date → Unix Timestamp

Common Timestamp Formats

What this tool does

Unix timestamps — integers counting seconds (or milliseconds) since January 1, 1970 UTC — are the standard way systems store time. When you see 1713191400 in a database column, API response, log file, or JWT claim, this tool converts it to a readable date in under a second. You can also go the other direction: enter a date and get the corresponding Unix timestamp back.
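Both directions can be sketched in a few lines of Python. This is a minimal illustration of the same conversion the tool performs, using only the standard library; the sample value 1713191400 is the one used as an example throughout this page.

```python
from datetime import datetime, timezone

ts = 1713191400  # seconds since the Unix epoch

# Timestamp -> human-readable date. Converting in UTC avoids
# local-timezone surprises.
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 2024-04-15T14:30:00+00:00

# Human date -> timestamp: build a timezone-aware datetime,
# then call .timestamp() on it.
back = int(datetime(2024, 4, 15, 14, 30, tzinfo=timezone.utc).timestamp())
print(back)  # 1713191400
```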

How to use it

  • Paste a Unix timestamp (seconds or milliseconds) into the top input and click Convert.
  • The tool auto-detects seconds vs milliseconds based on digit count: 10 digits = seconds, 13 = milliseconds.
  • To convert a date to a timestamp: use the date/time picker in the lower section and click Convert.
  • The live clock at the top shows the current Unix timestamp in real time — useful as a quick reference point.
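The live clock is nothing exotic: it reads the system clock once per tick. A minimal Python equivalent, assuming you want both the seconds form and the JavaScript-style milliseconds form:

```python
import time

# Current Unix timestamp, as the tool's live clock shows it
now_s = int(time.time())          # seconds: 10 digits until the year 2286
now_ms = int(time.time() * 1000)  # milliseconds: 13 digits, like Date.now()
print(now_s, now_ms)
```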

Identifying the precision

  • 10 digits (e.g. 1713191400) — Seconds. JWT exp/iat claims, Python's time.time(), most Unix system calls.
  • 13 digits (e.g. 1713191400000) — Milliseconds. JavaScript's Date.now(), Java's System.currentTimeMillis().
  • 16 digits (e.g. 1713191400000000) — Microseconds. Some database profilers and observability tools.
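The digit-count heuristic above is easy to replicate. Here is a small sketch that mirrors the tool's detection rule (an assumption on my part; the thresholds come from the list above) by normalizing everything to seconds before converting:

```python
from datetime import datetime, timezone

def to_utc(ts: int) -> datetime:
    """Convert a Unix timestamp of unknown precision to a UTC datetime.

    Heuristic from the list above: 10 digits = seconds,
    13 = milliseconds, 16 = microseconds.
    """
    digits = len(str(abs(ts)))
    if digits >= 16:       # microseconds
        ts /= 1_000_000
    elif digits >= 13:     # milliseconds
        ts /= 1_000
    return datetime.fromtimestamp(ts, tz=timezone.utc)

print(to_utc(1713191400))         # 2024-04-15 14:30:00+00:00
print(to_utc(1713191400000))      # same moment, detected as milliseconds
print(to_utc(1713191400000000))   # same moment, detected as microseconds
```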

Frequently Asked Questions

Why do systems use Unix timestamps instead of formatted dates?

Unix timestamps are timezone-neutral integers — the same value means the exact same moment worldwide. A formatted string like "2024-04-15 14:30:00" is ambiguous without a timezone. Arithmetic is also simple: "how many seconds between these two events?" is a subtraction. With formatted dates, that requires parsing, normalization, and timezone handling.
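The arithmetic point is worth seeing concretely. With timestamps, elapsed time is plain integer subtraction; no parsing and no timezone handling is involved. The two hypothetical deploy timestamps below are made up for illustration:

```python
deploy_started = 1713191400   # 2024-04-15 14:30:00 UTC
deploy_finished = 1713191523  # 2024-04-15 14:32:03 UTC

# "How many seconds between these two events?" is one subtraction.
elapsed = deploy_finished - deploy_started
print(elapsed)  # 123 seconds
```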

My timestamp converts to 1970 or the distant future. What went wrong?

Almost certainly a precision mismatch. A 13-digit millisecond timestamp treated as seconds points thousands of years into the future; a 10-digit seconds timestamp treated as milliseconds lands in early 1970. Divide 13-digit values by 1000 to get seconds, or multiply 10-digit values by 1000 to get milliseconds.
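A quick Python demonstration of both failure modes, using the sample value from earlier on this page. Note that a 13-digit value misread as seconds overflows Python's `datetime` range entirely (its maximum year is 9999), which is itself a useful symptom:

```python
from datetime import datetime, timezone

ms = 1713191400000  # a 13-digit millisecond timestamp
s = 1713191400      # the same moment as a 10-digit seconds timestamp

# Misread as seconds, the millisecond value is ~54,000 years past 1970,
# too far out for datetime to even represent.
print(ms / (365.2425 * 86400))  # rough "years since epoch": > 50,000

# The fix: scale to seconds first.
print(datetime.fromtimestamp(ms / 1000, tz=timezone.utc))  # 2024-04-15 14:30:00+00:00

# The mirror mistake: a seconds value scaled down as if it were milliseconds
# lands about 20 days after the epoch, in January 1970.
print(datetime.fromtimestamp(s / 1000, tz=timezone.utc))
```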

What is the Year 2038 problem?

Timestamps stored in a signed 32-bit integer overflow on January 19, 2038 at 03:14:07 UTC. The value wraps to the most negative 32-bit integer, which most systems interpret as December 1901. Modern 64-bit systems are unaffected, but older embedded systems, some automotive controllers, and legacy POSIX environments are at risk.
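The boundary values are easy to check yourself. This sketch shows the last second a signed 32-bit integer can hold and the date the wrapped value decodes to (negative timestamps require a platform, such as Linux, whose C library accepts pre-1970 values):

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2147483647, the largest signed 32-bit value

# The last moment representable in 32-bit time_t:
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second later the counter wraps to -2**31, which reads as 1901:
wrapped = -(2**31)
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```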

Want the full explanation? Read the guide: Unix Timestamps: The Hidden Clock Inside Every Codebase →