Convert Unix Timestamp to Date in Python — datetime and Pandas Methods
Python has three different ways to convert Unix timestamps, and which one you should use depends on whether you care about timezones, whether you have one timestamp or a million, and whether you are inside a Pandas DataFrame or a regular script.
This guide covers all three: datetime.fromtimestamp for single values, Pandas to_datetime for bulk conversion, and datetime.fromtimestamp with tz=timezone.utc for the timezone-aware approach (the old third option, datetime.utcfromtimestamp, is deprecated and covered in the pitfalls below). Plus the gotchas that have shipped real bugs.
If you just need a quick one-off conversion without writing Python, the free Unix timestamp converter handles it in your browser.
Convert a Unix Timestamp Using datetime
The standard library has datetime.fromtimestamp() for single conversions. By default it returns a naive datetime in your local timezone — which is almost never what you want.
from datetime import datetime, timezone
# Local timezone (your machine's clock setting)
ts = 1711000000
dt = datetime.fromtimestamp(ts)
print(dt) # e.g. 2024-03-21 01:46:40 in US Eastern (varies by machine, no tz info)
# UTC — what you actually want most of the time
dt_utc = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt_utc) # 2024-03-21 05:46:40+00:00
The second form is the safe one. It returns a timezone-aware datetime with UTC explicitly attached. Compare two timezone-aware datetimes and Python does the right thing. Compare a naive and an aware datetime and Python raises a TypeError — which is the correct behavior because that comparison is meaningless.
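A minimal sketch of that failure mode, using the same timestamp as above:

```python
from datetime import datetime, timezone

naive = datetime.fromtimestamp(1711000000)                   # no tzinfo attached
aware = datetime.fromtimestamp(1711000000, tz=timezone.utc)  # tzinfo is UTC

try:
    naive < aware
except TypeError as e:
    # Python refuses to compare offset-naive and offset-aware datetimes
    print(f"TypeError: {e}")
```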
Going the other direction
from datetime import datetime, timezone
# Current Unix timestamp (seconds, integer)
now_ts = int(datetime.now(timezone.utc).timestamp())
# A specific datetime to Unix timestamp
dt = datetime(2026, 4, 8, 12, 0, 0, tzinfo=timezone.utc)
ts = int(dt.timestamp())
Always pass tzinfo=timezone.utc when constructing the datetime. Without it, .timestamp() assumes local time, which produces different results on different machines.
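A round-trip sketch showing why the aware form is deterministic: with tzinfo set, the resulting timestamp is the same integer on every machine, regardless of the local clock setting.

```python
from datetime import datetime, timezone

dt = datetime(2026, 4, 8, 12, 0, 0, tzinfo=timezone.utc)
ts = int(dt.timestamp())  # 1775649600 on every machine

# Converting back with tz=timezone.utc recovers the original instant
back = datetime.fromtimestamp(ts, tz=timezone.utc)
assert back == dt  # lossless round trip at second precision
```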
Handling Milliseconds and Microseconds
Python's datetime supports microsecond precision but expects timestamps in seconds. If your timestamp is in milliseconds (13 digits) or nanoseconds (19 digits), divide first.
from datetime import datetime, timezone
# Milliseconds (JavaScript-style)
ms_ts = 1711000000123
dt = datetime.fromtimestamp(ms_ts / 1000, tz=timezone.utc)
# Nanoseconds (some Go and Kafka APIs)
ns_ts = 1711000000123456789
dt = datetime.fromtimestamp(ns_ts / 1_000_000_000, tz=timezone.utc)
Watch out for floating point precision. Dividing a 19-digit integer by a billion in floating point loses precision past microseconds. If you need exact nanosecond accuracy, use Pandas pd.Timestamp(ns_ts, unit='ns') instead — it stores nanoseconds as integers and never loses precision.
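A quick sketch of the precision loss: a double carries roughly 15-16 significant decimal digits, so a 19-digit nanosecond count cannot survive a float round trip, while pd.Timestamp keeps it exactly.

```python
import pandas as pd

ns_ts = 1711000000123456789

# Float division silently rounds: the round trip does not return the original
assert int(ns_ts / 1_000_000_000 * 1_000_000_000) != ns_ts

# pd.Timestamp stores the nanosecond count as an integer, exactly
t = pd.Timestamp(ns_ts, unit='ns')
assert t.value == ns_ts
```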
Quick check: what unit is my timestamp in?
- 10 digits → seconds (year 2001-2286)
- 13 digits → milliseconds (same year range, finer resolution)
- 16 digits → microseconds (Kafka, some Linux logs)
- 19 digits → nanoseconds (Go time, Kafka raw, eBPF)
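The digit-count heuristic above can be sketched as a small helper (detect_unit is a name I'm inventing here, not a library function):

```python
def detect_unit(ts: int) -> str:
    """Guess the unit of a Unix timestamp from its digit count."""
    digits = len(str(abs(ts)))
    if digits <= 10:
        return 's'   # seconds
    if digits <= 13:
        return 'ms'  # milliseconds
    if digits <= 16:
        return 'us'  # microseconds
    return 'ns'      # nanoseconds

print(detect_unit(1711000000))           # s
print(detect_unit(1711000000123))        # ms
print(detect_unit(1711000000123456789))  # ns
```

The return values line up with the unit argument Pandas expects, so the guess can feed straight into pd.to_datetime.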
Convert a Column of Unix Timestamps with Pandas
For DataFrames with thousands of timestamps, use pd.to_datetime with the unit parameter. It vectorizes the conversion and handles seconds, milliseconds, microseconds, and nanoseconds with one argument change.
import pandas as pd
df = pd.DataFrame({'ts': [1711000000, 1711000060, 1711000120]})
# Seconds (default for integer values is nanoseconds — be explicit)
df['datetime'] = pd.to_datetime(df['ts'], unit='s', utc=True)
# Milliseconds
df['datetime'] = pd.to_datetime(df['ts'], unit='ms', utc=True)
# Convert to a specific timezone for display
df['ny_time'] = df['datetime'].dt.tz_convert('America/New_York')
The utc=True argument is what you want 99% of the time. Without it, Pandas returns naive datetimes that you cannot safely combine with timezone-aware data later. If you forget it, your code will work on your test data and break the moment a daylight saving time transition happens in production.
Going from datetime back to Unix timestamp
df['unix_ts'] = df['datetime'].astype('int64') // 10**9
Pandas stores datetimes internally as nanoseconds since epoch. Dividing by 10^9 gives seconds. Use // 10**6 for milliseconds, // 10**3 for microseconds.
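Putting both directions together, a round-trip sketch using the same sample column as above:

```python
import pandas as pd

df = pd.DataFrame({'ts': [1711000000, 1711000060, 1711000120]})
df['datetime'] = pd.to_datetime(df['ts'], unit='s', utc=True)

# Back to integer seconds: int64 nanoseconds, floor-divided by 10**9
df['unix_ts'] = df['datetime'].astype('int64') // 10**9
assert (df['unix_ts'] == df['ts']).all()  # lossless for whole-second data
```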
Python Timestamp Conversion Pitfalls
Pitfall 1: Naive vs aware datetimes
A naive datetime has no timezone. An aware datetime has one. Mixing them causes TypeError on comparison and silently wrong results on arithmetic. Pick aware (UTC) for everything that touches storage, IO, or comparison. Use naive only for display formatting at the very last step.
Pitfall 2: utcfromtimestamp is deprecated
datetime.utcfromtimestamp() still works but is deprecated since Python 3.12. It returned a naive datetime that was technically in UTC but had no tzinfo, which caused exactly the kind of bug it looked like it was preventing. Use datetime.fromtimestamp(ts, tz=timezone.utc) instead.
Pitfall 3: Pandas datetime64 has no timezone by default
Pandas datetime64[ns] is naive. To make it timezone-aware you have to explicitly localize: df['col'].dt.tz_localize('UTC') if it was logged in UTC, or tz_localize('America/New_York').dt.tz_convert('UTC') if it was logged in local time and you want to normalize.
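A sketch of the two localization paths, showing that the same naive wall-clock value denotes two different instants depending on which zone it was logged in:

```python
import pandas as pd

s = pd.Series(pd.to_datetime(['2024-03-21 05:46:40']))  # naive datetime64[ns]
assert s.dt.tz is None

# Case 1: the source logged in UTC, so just attach the zone
as_utc = s.dt.tz_localize('UTC')

# Case 2: the source logged New York local time; attach it, then normalize
ny_to_utc = s.dt.tz_localize('America/New_York').dt.tz_convert('UTC')

# Same wall-clock text, different instants (NY is UTC-4 on this date)
assert as_utc.iloc[0] != ny_to_utc.iloc[0]
```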
Pitfall 4: looping with strptime is ~100x slower than to_datetime
For bulk parsing of date strings, never loop with datetime.strptime. Pandas pd.to_datetime(series, format='%Y-%m-%d') vectorizes the same operation and runs orders of magnitude faster on large datasets.
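A sketch of the two approaches producing the same result; the vectorized call replaces the per-element loop entirely (the timing claim itself is not reproduced here):

```python
import pandas as pd
from datetime import datetime

dates = pd.Series(['2024-03-21', '2024-03-22', '2024-03-23'])

# Vectorized: one call parses the whole Series
parsed = pd.to_datetime(dates, format='%Y-%m-%d')

# Per-element loop: same values, far slower on large datasets
looped = dates.map(lambda s: datetime.strptime(s, '%Y-%m-%d'))
assert (parsed == pd.to_datetime(looped)).all()
```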
For one-off verification when you are debugging a value, the free Unix timestamp converter handles seconds and milliseconds without you having to fire up a Python REPL.
Try It Free — No Signup Required
Runs 100% in your browser. No data is collected, stored, or sent anywhere.
Open Free Unix Timestamp Converter
Frequently Asked Questions
How do I convert a Unix timestamp to datetime in Python?
Use datetime.fromtimestamp(ts, tz=timezone.utc) from the datetime module. The tz argument is critical — without it you get a naive local-time datetime that causes timezone bugs. The result is a timezone-aware datetime in UTC.
How do I get the current Unix timestamp in Python?
Use int(datetime.now(timezone.utc).timestamp()) for seconds, or int(time.time()) from the time module. Both give you the current Unix timestamp as an integer. Avoid datetime.now() without tzinfo because it uses local time.
How do I convert a Pandas column of Unix timestamps to datetime?
Use pd.to_datetime(df["col"], unit="s", utc=True) for seconds, or unit="ms" for milliseconds. Always pass utc=True so the result is timezone-aware. Then use dt.tz_convert() to display in any local timezone.
Why does fromtimestamp give me the wrong time?
You probably called it without a tz argument, which makes it return local time based on your machine's system clock. If your machine is in EST and the timestamp is from a UTC source, you get the time shifted by 5 hours. Always pass tz=timezone.utc.
Should I use datetime.utcfromtimestamp or datetime.fromtimestamp?
Use datetime.fromtimestamp(ts, tz=timezone.utc). The utcfromtimestamp method is deprecated as of Python 3.12 because it returned a naive datetime that hid bugs. The newer form returns an explicit timezone-aware UTC datetime.
How do I handle nanosecond timestamps in Python?
For exact nanosecond precision use Pandas pd.Timestamp(ns_value, unit="ns") which stores nanoseconds as integers without floating point loss. The datetime module only supports microseconds and will lose nanosecond precision.