Timestamp
A timestamp is a specific date and time "stamped" on a digital record or file. While most often used as a noun, the word "timestamp" can also be a verb. For example, "The tweet was timestamped on January 8, 2021, at 10:44 AM."
Timestamps are the standard way to store dates and times on computer systems. For example, operating systems timestamp each file with a created and modified date and time. Digital cameras timestamp each captured photo. Social media platforms store a timestamp with each post, such as the Twitter example above.
While timestamps are universal, there is no universal timestamp format. For example, a programming language may use one method, while a database may use another. Even operating systems store timestamps differently. For instance, Windows stores file timestamps (the FILETIME format) as the number of 100-nanosecond intervals since January 1, 1601. Unix stores timestamps as the number of seconds that have elapsed since midnight UTC on January 1, 1970. Because several different timestamp formats exist, most modern programming languages have built-in timestamp conversion functions.
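As a sketch of such a conversion, the snippet below translates between the Windows FILETIME epoch (1601) and the Unix epoch (1970) in Python. The constant 11,644,473,600 is the number of seconds separating the two epochs; the function name is illustrative, not part of any standard library.

```python
# Seconds between the Windows epoch (1601-01-01) and the
# Unix epoch (1970-01-01): 369 years, including 89 leap days.
EPOCH_DIFF_SECONDS = 11_644_473_600

def filetime_to_unix(filetime):
    """Convert a Windows FILETIME value (100-nanosecond intervals
    since 1601) to a Unix timestamp (seconds since 1970)."""
    return filetime / 10_000_000 - EPOCH_DIFF_SECONDS

# A FILETIME corresponding to 2021-01-16 16:15:30 UTC
ft = (1_610_813_730 + EPOCH_DIFF_SECONDS) * 10_000_000
print(filetime_to_unix(ft))  # → 1610813730.0
```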
Storing a timestamp as an integer is efficient since it requires minimal storage space. However, the number must be converted to a legible time format when displayed. MySQL has a TIMESTAMP data type, which conveniently stores timestamps in the following format:
YYYY-MM-DD HH:MM:SS
MySQL stores timestamps in UTC (Coordinated Universal Time). So January 16, 2021, at 10:15:30 AM US Central Time (UTC−6) would be stored in a MySQL database as follows:
2021-01-16 16:15:30
If converted to a Unix timestamp, as used by Linux, this date and time would be represented as:
1610813730
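The conversion above can be reproduced with Python's standard datetime tools (a sketch assuming Python 3.9+ for the zoneinfo module):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# January 16, 2021 at 10:15:30 AM US Central Time
central = datetime(2021, 1, 16, 10, 15, 30,
                   tzinfo=ZoneInfo("America/Chicago"))

# The equivalent UTC date and time, as MySQL would store it
utc = central.astimezone(timezone.utc)
print(utc.strftime("%Y-%m-%d %H:%M:%S"))  # → 2021-01-16 16:15:30

# The same moment as a Unix timestamp (seconds since 1970 UTC)
print(int(central.timestamp()))  # → 1610813730
```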
Timestamps also have different resolutions or specificity. In some cases, seconds are sufficient, while in others, milliseconds or even nanoseconds are required. The Linux timestamp above would be 1610813730000 in milliseconds, which provides a resolution of one-thousandth of a second. Computing operations may require timestamps with even higher resolution. PHP includes a microtime() function that outputs a timestamp in microseconds, with a resolution of one-millionth of a second.
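To illustrate these resolutions, the snippet below scales a second-resolution Unix timestamp to milliseconds and reads the current time at nanosecond resolution using Python's time module (Python 3.7+ for time_ns):

```python
import time

# Scale a second-resolution timestamp to milliseconds
ts_seconds = 1_610_813_730
ts_millis = ts_seconds * 1000
print(ts_millis)  # → 1610813730000

# Current time at different resolutions
now_s = int(time.time())            # seconds
now_ns = time.time_ns()             # nanoseconds (Python 3.7+)
now_us = now_ns // 1_000            # microseconds
now_ms = now_ns // 1_000_000        # milliseconds
```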