Gigabit

A gigabit is 10⁹, or 1,000,000,000, bits.

One gigabit (abbreviated "Gb") is equal to 1,000 megabits or 1,000,000 kilobits. It is one-eighth the size of a gigabyte (GB).
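As a quick illustration of the relationships above, here is a minimal Python sketch; the constants come straight from the decimal (SI) definitions in this article, and the function names are only for illustration:

    BITS_PER_MEGABIT = 1_000_000
    BITS_PER_GIGABIT = 1_000_000_000
    BITS_PER_BYTE = 8

    def gigabits_to_bits(gigabits: float) -> float:
        # 1 Gb = 10^9 bits
        return gigabits * BITS_PER_GIGABIT

    def gigabits_to_gigabytes(gigabits: float) -> float:
        # Eight bits per byte, so eight gigabits per gigabyte
        return gigabits / BITS_PER_BYTE

    print(gigabits_to_bits(1))                  # 1000000000 bits
    print(BITS_PER_GIGABIT / BITS_PER_MEGABIT)  # 1000.0 megabits in one gigabit
    print(gigabits_to_gigabytes(1))             # 0.125 GB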

Gigabits are most often used to measure the data transfer rates of local networks and I/O connections. For example, Gigabit Ethernet is a common Ethernet standard that supports data transfer rates of one gigabit per second (Gbps) over a wired Ethernet network. Modern I/O technologies, such as USB 3.0 and Thunderbolt, are also measured in gigabits per second. USB 3.0 can transfer data at up to 5 Gbps, while first-generation Thunderbolt can transfer data bidirectionally at 10 Gbps.
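To see what these rates mean in practice, the following sketch (same Python conventions as above) estimates how long a transfer would take at each speed; the 25 GB file size is a made-up example, and protocol overhead is ignored:

    def transfer_time_seconds(file_size_gigabytes: float, rate_gbps: float) -> float:
        # Convert the file size to gigabits (x 8), then divide by the rate in Gbps
        return (file_size_gigabytes * 8) / rate_gbps

    for name, rate_gbps in [("Gigabit Ethernet", 1), ("USB 3.0", 5), ("Thunderbolt", 10)]:
        print(f"{name}: {transfer_time_seconds(25, rate_gbps):.0f} seconds")  # 200, 40, 20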

While gigabits and gigabytes sound similar, it is important not to confuse the two terms. Since there are eight bits in one byte, there are also eight gigabits in one gigabyte. Gigabits are most often used to describe data transfer speeds, while gigabytes are used to measure data storage.

Updated October 3, 2013 by Per C.


