Why is 1/1/1970 the "epoch time"?


Unix Problem Overview


Why is

> 1 January 1970 00:00:00

considered the epoch time?

Unix Solutions


Solution 1 - Unix

Early versions of Unix measured system time in 1/60 s intervals. This meant that a 32-bit unsigned integer could only represent a span of time less than 829 days (2^32 ticks at 60 Hz is about 828.5 days). For this reason, the time represented by the number 0 (called the epoch) had to be set in the very recent past. As this was in the early 1970s, the epoch was set to 1971-1-1.

Later, the system time was changed to increment every second, which increased the span of time that could be represented by a 32-bit unsigned integer to around 136 years. As it was no longer so important to squeeze every second out of the counter, the epoch was rounded down to the nearest decade, thus becoming 1970-1-1. One must assume that this was considered a bit neater than 1971-1-1.

Note that a 32-bit signed integer using 1970-1-1 as its epoch can represent dates up to 2038-1-19, on which date it will wrap around to 1901-12-13.
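As a minimal sketch (in Python, which is not part of the original answer), the wraparound arithmetic can be checked directly; the 2**31 bound and the resulting dates follow from standard 32-bit two's-complement behavior:

```python
from datetime import datetime, timedelta, timezone

unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Largest moment a signed 32-bit counter of whole seconds can hold:
# 2**31 - 1 seconds after the epoch.
print(unix_epoch + timedelta(seconds=2**31 - 1))  # 2038-01-19 03:14:07+00:00

# One second later the counter overflows to -2**31, landing in 1901.
print(unix_epoch + timedelta(seconds=-(2**31)))   # 1901-12-13 20:45:52+00:00
```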

Solution 2 - Unix

History.

> The earliest versions of Unix time had a 32-bit integer incrementing at a rate of 60 Hz, which was the rate of the system clock on the hardware of the early Unix systems. The value 60 Hz still appears in some software interfaces as a result. The epoch also differed from the current value. The first edition Unix Programmer's Manual dated November 3, 1971 defines the Unix time as "the time since 00:00:00, Jan. 1, 1971, measured in sixtieths of a second".

Solution 3 - Unix

Epoch reference date

An epoch reference date is a point on the timeline from which we count time. Moments before that point are counted with a negative number, moments after are counted with a positive number.
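A minimal Python sketch (illustrative, not from the original answer) of counting from an epoch in both directions:

```python
from datetime import datetime, timezone

# Moments before the Unix epoch get negative counts; moments after, positive.
print(datetime(1969, 12, 31, tzinfo=timezone.utc).timestamp())  # -86400.0
print(datetime(1970, 1, 2, tzinfo=timezone.utc).timestamp())    #  86400.0
```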

Many epochs in use

> Why is 1 January 1970 00:00:00 considered the epoch time?

No, not *the* epoch, *an* epoch. There are many epochs in use.

This choice of epoch is arbitrary.

Major computer systems and libraries use any of at least a couple dozen different epochs. One of the most popular epochs is commonly known as Unix Time, using the 1970 UTC moment you mentioned.

While popular, Unix Time’s 1970 may not be the most common. Also in the running for most common would be January 0, 1900, used by countless Microsoft Excel & Lotus 1-2-3 spreadsheets, and January 1, 2001, used by Apple’s Cocoa framework on over a billion iOS/macOS machines worldwide in countless apps. Or perhaps January 6, 1980, used by GPS devices?
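To make the point concrete, here is a small Python sketch (not from the original answer) computing the fixed offsets between a few of those epochs and Unix Time; the arithmetic deliberately ignores leap seconds, which GPS time in particular handles differently:

```python
from datetime import datetime, timezone

unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
cocoa_epoch = datetime(2001, 1, 1, tzinfo=timezone.utc)  # Apple Cocoa
gps_epoch = datetime(1980, 1, 6, tzinfo=timezone.utc)    # GPS

print((cocoa_epoch - unix_epoch).total_seconds())  # 978307200.0
print((gps_epoch - unix_epoch).total_seconds())    # 315964800.0

# So a count of seconds since the Cocoa epoch converts to Unix time
# by adding 978307200 (leap seconds aside).
```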

Many granularities

Different systems use different granularity in counting time.

Even the so-called “Unix Time” varies, with some systems counting whole seconds and some counting milliseconds. Many databases, such as Postgres, use microseconds. Some, such as the modern java.time framework in Java 8 and later, use nanoseconds. Some use still other granularities.
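A small Python illustration (not from the original answer) of the same moment expressed at several of those granularities:

```python
import time

ns = time.time_ns()          # integer nanoseconds since the Unix epoch
print(ns // 1_000_000_000)   # whole seconds  (classic Unix time)
print(ns // 1_000_000)       # milliseconds   (e.g. JavaScript's Date)
print(ns // 1_000)           # microseconds   (e.g. Postgres timestamps)
print(ns)                    # nanoseconds    (e.g. java.time's Instant)
```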

ISO 8601

Because there is so much variance in the choice of epoch reference and in the granularity, it is generally best to avoid communicating moments as a count-from-epoch. Between the ambiguity of epoch & granularity, plus the inability of humans to perceive meaning in such values (and therefore to notice buggy ones), use plain text instead of numbers.

The ISO 8601 standard provides an extensive set of practical, well-designed formats for expressing date-time values as text. These formats are easy for machines to parse as well as easy for humans across cultures to read.

These include:

- A calendar date: 2038-01-19
- A date with time-of-day in UTC: 2038-01-19T03:14:07Z
- A duration: PT3H14M7S
- An interval: 2007-03-01T13:00:00Z/2008-05-11T15:30:00Z
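For instance, a minimal Python sketch of writing and round-tripping a moment as ISO 8601 text instead of a raw count-from-epoch:

```python
from datetime import datetime, timezone

moment = datetime(2038, 1, 19, 3, 14, 7, tzinfo=timezone.utc)

text = moment.isoformat()
print(text)  # 2038-01-19T03:14:07+00:00

# The text parses back to the same moment, with no epoch or
# granularity left to guess at.
assert datetime.fromisoformat(text) == moment
```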

Solution 4 - Unix

http://en.wikipedia.org/wiki/Unix_time#History explains a little about the origins of Unix time and the chosen epoch. The definition of Unix time and the epoch date went through a couple of changes before stabilizing on what it is now.

But it does not say why exactly 1/1/1970 was chosen in the end.

Notable excerpts from the Wikipedia page:

> The first edition Unix Programmer's Manual dated November 3, 1971 defines the Unix time as "the time since 00:00:00, Jan. 1, 1971, measured in sixtieths of a second".
>
> Because of [the] limited range, the epoch was redefined more than once, before the rate was changed to 1 Hz and the epoch was set to its present value.
>
> Several later problems, including the complexity of the present definition, result from Unix time having been defined gradually by usage rather than fully defined to start with.

Solution 5 - Unix

Short answer: Why not?

Longer answer: The time itself doesn't really matter, as long as everyone who uses it agrees on its value. As 1/1/70 has been in use for so long, using it will make your code as understandable as possible for as many people as possible.

There's no great merit in choosing an arbitrary epoch just to be different.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

| Content Type | Original Author | Original Content on Stackoverflow |
| --- | --- | --- |
| Question | rahul | View Question on Stackoverflow |
| Solution 1 - Unix | Matt Howells | View Answer on Stackoverflow |
| Solution 2 - Unix | Stu Thompson | View Answer on Stackoverflow |
| Solution 3 - Unix | Basil Bourque | View Answer on Stackoverflow |
| Solution 4 - Unix | Dawie Strauss | View Answer on Stackoverflow |
| Solution 5 - Unix | PaulJWilliams | View Answer on Stackoverflow |