Unix Seconds to Date Converter

Convert a 10-digit Unix timestamp in seconds to a human-readable date and time. This is the standard Unix/Linux epoch format.

Timestamp To Date
Convert Unix timestamps to human-readable dates. Auto-detects seconds, milliseconds, microseconds, or nanoseconds.

Supports batch conversion of up to 1,000 timestamps at a time. The unit is detected automatically or can be selected manually.

About Unix Timestamps

What is a Unix Timestamp?

A Unix timestamp (also known as epoch time) is the number of time units that have elapsed since January 1, 1970, 00:00:00 UTC, the Unix epoch. It is a universal time representation used in computing systems worldwide; for example, the seconds timestamp 1704067200 corresponds to January 1, 2024, 00:00:00 UTC.

Common Units

  • Seconds: Standard Unix timestamp (10 digits)
  • Milliseconds: JavaScript/Java standard (13 digits)
  • Microseconds: Python/PHP precision (16 digits)
  • Nanoseconds: Maximum precision (19 digits)
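
Because each unit differs from the next by a factor of 1,000, converting between them is plain multiplication or division. A minimal shell sketch (assumes 64-bit shell arithmetic, as in bash, so the nanosecond value does not overflow):

# Scale a seconds timestamp up to finer-grained units
ts_s=1704067200
ts_ms=$((ts_s * 1000))          # 1704067200000 (13 digits)
ts_us=$((ts_s * 1000000))       # 1704067200000000 (16 digits)
ts_ns=$((ts_s * 1000000000))    # 1704067200000000000 (19 digits)
echo "$ts_s $ts_ms $ts_us $ts_ns"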

Unix Seconds to Date Converter - Free Online Tool

Convert traditional 10-digit Unix timestamps (seconds since January 1, 1970) to human-readable dates. This is the standard format used in Unix/Linux systems and many databases.

Unix Time in Seconds

The traditional Unix timestamp is a 10-digit number counting whole seconds since January 1, 1970, 00:00:00 UTC. POSIX defines this count as the system time representation, which is why it is the native format on Unix/Linux systems and a common storage format in databases.
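
As a quick check on what the count means, dividing by 86400 (the number of seconds in a day) gives the number of whole days elapsed since the epoch:

# 1704067200 seconds is exactly 19723 days after 1970-01-01
echo $((1704067200 / 86400))    # prints 19723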

Command Line Examples

# Get the current timestamp in seconds
date +%s

# Convert a timestamp to a date (GNU date, as on Linux)
date -d @1704067200

# Custom output format (GNU date)
date -d @1704067200 "+%Y-%m-%d %H:%M:%S"

# Convert a timestamp to a date (BSD date, as on macOS)
date -r 1704067200

Frequently Asked Questions

What's the difference between Unix timestamp formats?

Unix timestamps come in different precisions: seconds (10 digits), milliseconds (13 digits), microseconds (16 digits), and nanoseconds (19 digits). The format depends on the system and required precision.

How do I know which timestamp format to use?

Check the number of digits: 10 = seconds (Unix/Linux), 13 = milliseconds (JavaScript/Java), 16 = microseconds (Python/PHP), 19 = nanoseconds (high-precision systems). Note that this digit-count rule is a heuristic for present-day dates; timestamps far in the past or future have different lengths.
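
That heuristic is easy to script. A minimal shell sketch, assuming the input is a bare decimal number with no sign or fractional part:

# Guess a timestamp's unit from its digit count (present-day dates only)
ts=1704067200000
case ${#ts} in
  10) echo "seconds" ;;
  13) echo "milliseconds" ;;
  16) echo "microseconds" ;;
  19) echo "nanoseconds" ;;
  *)  echo "ambiguous; check the source system" ;;
esac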

Are Unix timestamps timezone-aware?

Unix timestamps are always in UTC (Coordinated Universal Time). When converting to a date, you can display it in any timezone, but the timestamp itself represents a single moment in time globally.
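
You can see this with GNU date: the same timestamp renders differently in each timezone while naming the same instant (assumes IANA zone names such as America/New_York are available):

# One instant, two local renderings
TZ=UTC date -d @1704067200 "+%Y-%m-%d %H:%M:%S %Z"                # 2024-01-01 00:00:00 UTC
TZ=America/New_York date -d @1704067200 "+%Y-%m-%d %H:%M:%S %Z"   # 2023-12-31 19:00:00 EST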