Edholm's law, proposed by and named after Phil Edholm, refers to the observation that the three categories of telecommunication,[1] namely wireless (mobile), nomadic (wireless without mobility) and wired networks (fixed), are in lockstep and gradually converging.[2] Edholm's law also holds that data rates for these telecommunications categories increase on similar exponential curves, with the slower rates trailing the faster ones by a predictable time lag.[3] Edholm's law predicts that bandwidth and data rates double every 18 months, which has held true since the 1970s.[1][4] The trend is evident in the cases of the Internet,[1] cellular (mobile), wireless LAN and wireless personal area networks.[4]
Concept
Edholm's law was proposed by Phil Edholm of Nortel Networks. He observed that telecommunication bandwidth (including Internet access bandwidth) was doubling every 18 months from the late 1970s through the early 2000s. This is similar to Moore's law, which predicts an exponential rate of growth for transistor counts. He also found that there was a gradual convergence between wired (e.g. Ethernet), nomadic (e.g. modem and Wi-Fi) and wireless networks (e.g. cellular networks). The name "Edholm's law" was coined by his colleague, John H. Yoakum, who presented it at a 2004 Internet telephony press conference.[1]
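As a minimal sketch of the doubling relation, the data rate at year t can be modelled as rate(t) = rate0 * 2^((t - t0)/1.5). The starting rate of 1 kbit/s and the 1980 reference year below are assumptions chosen only for illustration, not figures from the article:

    # A minimal sketch of Edholm's law: data rates doubling every 18 months.
    # The 1 kbit/s starting rate and the 1980 reference year are assumptions
    # chosen for illustration, not figures from the article.
    import math

    def edholm_rate(year, base_rate_bps=1e3, base_year=1980, doubling_years=1.5):
        """Projected data rate in bit/s under an 18-month doubling assumption."""
        return base_rate_bps * 2 ** ((year - base_year) / doubling_years)

    for year in (1980, 1990, 2000, 2010):
        print(year, f"{edholm_rate(year):.3g} bit/s")

    # Going from 1 bit/s to 1 Tbit/s needs log2(1e12) ~ 40 doublings,
    # i.e. roughly 60 years at 18 months per doubling.
    print("Years for 1 bit/s -> 1 Tbit/s:", round(math.log2(1e12) * 1.5, 1))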
Slower communication channels like cellphones and radio modems were predicted to eclipse the capacity of early Ethernet, due to developments such as UMTS and MIMO, which boost bandwidth by making more effective use of antennas.[1] Extrapolating forward indicates a convergence between the rates of nomadic and wireless technologies around 2030. In addition, wireless technology could displace wireline communication if the cost of wireline infrastructure remains high.[2]
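Such an extrapolation amounts to projecting two exponential trends with slightly different doubling times until they cross. The sketch below shows the method only; the starting rates and doubling times are hypothetical values, not data from the cited sources:

    # Illustrative only: project two exponential data-rate trends forward and
    # find the year the faster-growing one catches up.  The starting rates
    # (54 Mbit/s nomadic, 2 Mbit/s wireless in 2004) and the doubling times
    # are hypothetical values chosen to show the method, not source data.
    import math

    def crossover_year(base_year=2004,
                       nomadic=(54e6, 1.6),    # (rate in bit/s, doubling time in years)
                       wireless=(2e6, 1.3)):
        (r_n, d_n), (r_w, d_w) = nomadic, wireless
        # Solve r_w * 2**(t/d_w) == r_n * 2**(t/d_n) for t (years after base_year).
        t = math.log2(r_n / r_w) / (1 / d_w - 1 / d_n)
        return base_year + t

    print(round(crossover_year()))  # ~2037 with these assumed inputs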
Underlying factors
In 2009, Renuka P. Jindal observed the bandwidths of online communication networks rising from bits per second to terabits per second, doubling every 18 months, as predicted by Edholm's law. Jindal identified the following three major underlying factors that have enabled the exponential growth of communication bandwidth.[5]
MOSFET (MOS transistor) – The MOSFET was invented at Bell Labs between 1955 and 1960, after Frosch and Derick discovered and used surface passivation by silicon dioxide to create the first planar transistors, in which the drain and source lie adjacent at the same surface.[6][7][8][9][10] Advances in MOSFET technology (MOS technology) have been the most important contributing factor in the rapid rise of bandwidth in telecommunications networks. Continuous MOSFET scaling, along with various advances in MOS technology, has enabled both Moore's law (transistor counts in integrated circuit chips doubling every two years) and Edholm's law (communication bandwidth doubling every 18 months).[5]
Laser lightwave systems – The laser, proposed by Charles H. Townes and Arthur Leonard Schawlow of Bell Labs, was first demonstrated in 1960. Laser technology was later adopted alongside integrated electronics based on MOS technology, leading to the development of laser lightwave systems around 1980. This has driven exponential growth of bandwidth since the early 1980s.[5]
Information theory – Information theory, as enunciated by Claude Shannon at Bell Labs in 1948, provided a theoretical foundation for understanding the trade-offs between signal-to-noise ratio, bandwidth, and error-free transmission in the presence of noise. In the early 1980s, Renuka Jindal at Bell Labs used information theory to study the noise behaviour of MOS devices, resolving the issues that limited their receiver sensitivity and data rates. The resulting improvement in noise performance contributed to the wide adoption of MOS technology, first in lightwave and later in wireless terminal applications.[5]
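The trade-off Shannon formalized is commonly expressed through the Shannon-Hartley capacity theorem, C = B * log2(1 + S/N). The short sketch below applies it with an assumed 20 MHz channel and 30 dB signal-to-noise ratio; these example values are illustrative, not taken from the article:

    # Shannon-Hartley capacity: C = B * log2(1 + S/N), the standard expression
    # of the trade-off between bandwidth, signal-to-noise ratio, and error-free
    # transmission.  The 20 MHz / 30 dB example values are assumptions.
    import math

    def channel_capacity_bps(bandwidth_hz, snr_db):
        """Maximum error-free data rate (bit/s) over an AWGN channel."""
        snr_linear = 10 ** (snr_db / 10)
        return bandwidth_hz * math.log2(1 + snr_linear)

    print(f"{channel_capacity_bps(20e6, 30):.3g} bit/s")  # ~2e8 bit/s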
In recent years, another enabling factor in the growth of wireless communication networks has been interference alignment, which was discovered by Syed Ali Jafar at the University of California, Irvine.[15] He established it as a general principle, along with Viveck R. Cadambe, in 2008. They introduced "a mechanism to align an arbitrarily large number of interferers, leading to the surprising conclusion that wireless networks are not essentially interference limited." This led to the adoption of interference alignment in the design of wireless networks.[16] According to New York University senior researcher Dr. Paul Horn, this "revolutionized our understanding of the capacity limits of wireless networks" and "demonstrated the astounding result that each user in a wireless network can access half of the spectrum without interference from other users, regardless of how many users are sharing the spectrum."[15]
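The quoted "half of the spectrum" result corresponds to the finding that a K-user interference channel achieves K/2 sum degrees of freedom. The toy comparison below is a sketch of that counting argument only (it does not implement the alignment scheme itself), contrasting it with orthogonal sharing, where each of K users receives 1/K of the spectrum:

    # Sketch of the degrees-of-freedom counting behind the quoted claim:
    # with interference alignment a K-user interference channel achieves
    # 1/2 degree of freedom per user (K/2 in total), whereas orthogonal
    # sharing such as TDMA gives each user only 1/K of the spectrum.
    # This is a toy comparison, not an implementation of alignment itself.
    def sum_degrees_of_freedom(num_users, aligned):
        per_user = 0.5 if aligned else 1.0 / num_users
        return num_users * per_user

    for k in (2, 4, 10, 100):
        print(f"{k} users: orthogonal {sum_degrees_of_freedom(k, False):.2f}, "
              f"aligned {sum_degrees_of_freedom(k, True):.1f}")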
References
Webb, William (2007). Wireless Communications: The Future. Hoboken, NJ: John Wiley & Sons, Ltd. p. 67. ISBN 9780470033128.
Deng, Wei; Mahmoudi, Reza; van Roermund, Arthur (2012). Time Multiplexed Beam-Forming with Space-Frequency Transformation. New York: Springer. p. 1. ISBN 9781461450450.
O'Neill, A. (2008). "Asad Abidi Recognized for Work in RF-CMOS". IEEE Solid-State Circuits Society Newsletter. 13 (1): 57–58. doi:10.1109/N-SSC.2008.4785694. ISSN 1098-4232.
Jafar, Syed A. (2010). "Interference Alignment — A New Look at Signal Dimensions in a Communication Network". Foundations and Trends in Communications and Information Theory. 7 (1): 1–134. CiteSeerX 10.1.1.707.6314. doi:10.1561/0100000047.