Timestamp

A timestamp is a specific date and time "stamped" on a digital record or file. While most often used as a noun, the word "timestamp" can also be a verb. For example, "The tweet was timestamped on January 8, 2021, at 10:44 AM."

Timestamps are the standard way to store dates and times on computer systems. For example, operating systems timestamp each file with a created and modified date and time. Digital cameras timestamp each captured photo. Social media platforms store a timestamp with each post, such as the Twitter example above.
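
For instance, a script can read the timestamps the operating system has stamped on a file. Below is a minimal Python sketch, assuming a file named example.txt exists in the current directory:

import os
import datetime

# st_mtime is the file's modified time, stored as seconds since the Unix epoch
mtime = os.stat("example.txt").st_mtime
print(datetime.datetime.fromtimestamp(mtime))  # e.g. 2021-01-16 10:15:30.123456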

While timestamps are universal, there is no universal timestamp format. For example, a programming language may use one method, while a database may use another. Even operating systems store timestamps differently. For instance, Windows stores file timestamps (FILETIME values) as the number of 100-nanosecond intervals since January 1, 1601. Unix stores timestamps as the number of seconds that have elapsed since midnight on January 1, 1970. Because several different timestamp formats exist, most modern programming languages have built-in timestamp conversion functions.
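
In Python, for example, the standard datetime module converts an integer timestamp to a readable date and back (a sketch; the sample value is arbitrary):

import datetime

ts = 1610813730  # seconds since January 1, 1970 (UTC)
dt = datetime.datetime.fromtimestamp(ts, tz=datetime.timezone.utc)
print(dt)                   # 2021-01-16 16:15:30+00:00
print(int(dt.timestamp()))  # 1610813730, back to the integer form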

A Unix timestamp is also known as "epoch time," which is equivalent to the number of seconds since January 1, 1970, at 12:00 AM GMT (Greenwich Mean Time). This GMT (or "UTC") date/time may be displayed by default on Unix and Linux devices (including Android) when a valid timestamp is not available. In time zones west of GMT, such as those in the United States, this date appears as December 31, 1969.
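
The following Python sketch (assuming Python 3.9 or later for the zoneinfo module) shows why timestamp 0 renders as the previous day in US time zones:

import datetime
from zoneinfo import ZoneInfo

epoch = datetime.datetime.fromtimestamp(0, tz=datetime.timezone.utc)
print(epoch)  # 1970-01-01 00:00:00+00:00

# The same instant in US Central Time falls on the previous calendar day
print(epoch.astimezone(ZoneInfo("America/Chicago")))  # 1969-12-31 18:00:00-06:00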

Storing a timestamp as an integer is efficient since it requires minimal storage space. However, the number must be converted to a legible format when displayed. MySQL has a TIMESTAMP data type, which stores timestamps and displays them in the following format:

YYYY-MM-DD HH:MM:SS

MySQL stores timestamps in UTC (Coordinated Universal Time), which matches the local time at the Greenwich meridian in England. So January 16, 2021, at 10:15:30 AM US Central Time (UTC-6) would be stored in a MySQL database as follows:

2021-01-16 16:15:30

If converted to a Unix timestamp, this date/time would be represented as:

1610813730
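
Both values above can be reproduced with a short Python sketch (assuming Python 3.9+ for zoneinfo; "America/Chicago" is the zone database name for US Central Time):

import datetime
from zoneinfo import ZoneInfo

# January 16, 2021 at 10:15:30 AM US Central Time
local = datetime.datetime(2021, 1, 16, 10, 15, 30, tzinfo=ZoneInfo("America/Chicago"))
utc = local.astimezone(datetime.timezone.utc)

print(utc.strftime("%Y-%m-%d %H:%M:%S"))  # 2021-01-16 16:15:30 (the value MySQL stores)
print(int(utc.timestamp()))               # 1610813730 (the Unix timestamp)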

Timestamps also vary in resolution, or specificity. In some cases, seconds are sufficient, while in others, milliseconds or even nanoseconds are required. The Unix timestamp above would be 1610813730000 in milliseconds, which provides a resolution of one-thousandth of a second. Some computing operations require timestamps with even higher resolution. PHP, for example, includes a microtime() function that returns the current Unix timestamp with microsecond precision, a resolution of one-millionth of a second.
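
In Python, comparable standard-library functions produce timestamps at several resolutions (a sketch; time.time_ns() requires Python 3.7 or later):

import time

print(int(time.time()))         # seconds, e.g. 1610813730
print(int(time.time() * 1000))  # milliseconds, e.g. 1610813730000
print(time.time_ns())           # nanoseconds as an integer, e.g. 1610813730000000000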

Updated January 16, 2021 by Per C.

