Gbps

Stands for "Gigabits per second." 1Gbps is equal to 1,000 Megabits per second (Mbps), or 1,000,000,000 bits per second. Gbps is commonly used to measure data transfer speeds between hardware devices.
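As a quick illustration of these relationships, the sketch below converts a rate given in Gbps to Mbps and to bits per second. It assumes decimal (SI) prefixes, and the function names are illustrative rather than from this definition.

def gbps_to_mbps(gbps):
    # 1 Gbps = 1,000 Mbps
    return gbps * 1_000

def gbps_to_bps(gbps):
    # 1 Gbps = 1,000,000,000 bits per second
    return gbps * 1_000_000_000

print(gbps_to_mbps(1))  # 1000
print(gbps_to_bps(1))   # 1000000000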

For many years, data transfer speeds were only measured in Mbps and Kbps. However, modern hardware interfaces can now transfer data at more than one gigabit per second, which makes Gbps a necessary unit of measurement. Examples of these interfaces include SATA 3 (6Gbps), USB 3.0 (5Gbps), and Thunderbolt (10Gbps). Additionally, Gigabit Ethernet can transfer data at up to 1Gbps.

NOTE: The lowercase "b" in Gbps indicates it stands for "Gigabits" rather than "Gigabytes." Since one byte equals eight bits, 1GBps is equal to 8Gbps. While storage capacity is typically measured in bytes, data transfer speeds are typically measured in bits. Therefore, Gbps is much more commonly used than GBps.
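The sketch below illustrates the bits-versus-bytes distinction by converting a rate in gigabits per second to gigabytes per second; the helper name is hypothetical.

def gbps_to_gigabytes_per_second(gbps):
    # 1 byte = 8 bits, so divide a gigabit rate by 8 to get gigabytes per second
    return gbps / 8

# For example, a 6Gbps SATA 3 interface moves at most 0.75 gigabytes per second.
print(gbps_to_gigabytes_per_second(6))  # 0.75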

Updated July 23, 2011 by Per C.


