The first experimental version of Ethernet wired networking ran at a connection speed of 2.94 megabits per second (Mbps) in 1973. By the time Ethernet became an industry standard in 1982, its speed rating had increased to 10 Mbps thanks to improvements in the technology. Ethernet kept this rating for more than a decade. Variants of the standard were named starting with the number ten, including 10BASE2 and 10BASE-T.
The technology colloquially called Fast Ethernet was introduced in the mid-1990s. It earned that name because Fast Ethernet standards support a maximum data rate of 100 Mbps, ten times faster than traditional Ethernet. Common variants of the new standard included 100BASE-T2 and 100BASE-TX.
Fast Ethernet was widely deployed as the need for greater LAN performance became critical to universities and businesses. A key element of its success was its ability to coexist with existing network installations. Mainstream network adapters of the day were built to support both traditional and Fast Ethernet. These so-called “10/100” adapters sensed the line speed automatically and adjusted their data rate to match.
Just as Fast Ethernet improved on traditional Ethernet, Gigabit Ethernet improved on Fast Ethernet, offering rates up to 1000 Mbps. Although 1000BASE-X and 1000BASE-T versions were created in the late 1990s, it took many more years for Gigabit Ethernet to reach large-scale adoption because of its higher cost.
Maximum Speed versus Actual Speed
The speed ratings of Ethernet have been criticized as unachievable in real-world usage. Much like the fuel-efficiency ratings of automobiles, network connection speed ratings are measured under ideal conditions that do not necessarily reflect normal operating environments. Because these ratings are maximum values, they cannot be exceeded in practice; actual throughput is always at least somewhat lower due to per-frame protocol overhead.
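One concrete reason rated speed is never reached is that every Ethernet frame carries fixed overhead on the wire in addition to its payload. A rough sketch of that arithmetic, using the standard per-frame overhead figures and assuming a maximum 1500-byte payload with no VLAN tag, looks like this:

```python
# Sketch: why payload throughput falls short of the rated line speed.
# Standard Ethernet per-frame overhead (assumes a full 1500-byte payload
# and no VLAN tagging).
PREAMBLE = 8      # preamble + start-of-frame delimiter (bytes)
HEADER = 14       # destination MAC, source MAC, EtherType
FCS = 4           # frame check sequence
IFG = 12          # inter-frame gap (idle time, counted in byte times)
PAYLOAD = 1500    # maximum standard payload

wire_bytes = PREAMBLE + HEADER + FCS + IFG + PAYLOAD   # 1538 bytes per frame
efficiency = PAYLOAD / wire_bytes                      # roughly 0.975

for rated_mbps in (10, 100, 1000):
    print(f"{rated_mbps} Mbps link: ~{rated_mbps * efficiency:.1f} Mbps of payload")
```

Even in this best case, only about 97.5% of the line rate carries user data; smaller frames, collisions, or switch congestion push real-world throughput lower still.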