Unix Timestamps: What They Are, Why They Break, and How to Convert Them
A Unix timestamp is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC. Right now, that number is somewhere around 1.77 billion. Every date and time in every system you have ever used is, at some layer, backed by a number like this. The simplicity is its strength: a single integer represents a moment in time unambiguously, independent of time zones, date formats, calendar systems, or locale. But that simplicity hides a collection of edge cases that have caused some of the most famous bugs in computing history.

Seconds vs. milliseconds

JavaScript's Date.now() returns milliseconds since the epoch. Most Unix systems use seconds. This mismatch is the source of approximately 40% of all timestamp bugs I encounter.

```javascript
Date.now()                      // 1774000000000 (milliseconds)
Math.floor(Date.now() / 1000)   // 1774000000 (seconds)
```

If you pass a millisecond timestamp to a function expecting seconds, you get a date in the year 58,000. If you pass seconds to a function expecting milliseconds, you get a date a few weeks after January 1970.
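To make the failure mode concrete, here is a minimal sketch (not part of the snippet above) that feeds both units into JavaScript's built-in Date constructor, which interprets a numeric argument as milliseconds since the epoch. The timestamp values are illustrative, chosen to sit near the "1.77 billion seconds" mark mentioned earlier.

```javascript
// Sketch: the same instant written in seconds and in milliseconds.
// Date's numeric constructor expects milliseconds since the epoch,
// so only the millisecond value decodes to the intended date.
const seconds = 1_774_000_000;     // Unix timestamp in seconds (illustrative)
const millis = 1_774_000_000_000;  // the same instant in milliseconds

console.log(new Date(millis).toISOString());
// "2026-03-20T09:46:40.000Z"  -- the intended date

console.log(new Date(seconds).toISOString());
// "1970-01-21T12:46:40.000Z"  -- seconds read as milliseconds: about three weeks after the epoch

console.log(new Date(millis * 1000).toISOString());
// a six-digit year around 58,000 (expanded ISO format) -- milliseconds read as seconds
```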