Moore's law


Moore's law is the prediction that the transistor density of integrated circuits will double roughly every two years, while prices decline at the same time.[1] The phenomenon predicted by Moore's law, first described in 1965, has held remarkably true to date, and experts predict that the trend may continue until roughly 2020, tapering off as switching elements approach molecular dimensions. Moore's law is not a law in any formal sense, but a rule of thumb: a practical way to think about the pace of semiconductor progress.
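
To see the compounding involved, the short Python sketch below projects transistor counts under a strict two-year doubling. The starting point, the Intel 4004 of 1971 with about 2,300 transistors, is chosen here only for illustration.

    # Project transistor counts under a strict two-year doubling,
    # starting from the Intel 4004 (~2,300 transistors, 1971).
    def projected_transistors(year, base_year=1971, base_count=2300):
        doublings = (year - base_year) / 2
        return base_count * 2 ** doublings

    for year in (1971, 1981, 1991, 2001, 2011):
        print(year, f"{projected_transistors(year):,.0f}")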

Another way to think about it is that density is inversely proportional to the distance signals must traverse, so Moore's law addresses not only the computational capability gained from more circuit elements, but also workarounds to speed-of-light limitations: smaller circuits mean shorter signal paths.

186,300 miles per second. It's not just a good idea. It's the Law. — Seen on a T-shirt at an IETF meeting
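
The scale of that limit is easy to work out. The sketch below computes an upper bound on how far a signal can travel during one clock cycle, assuming propagation at the vacuum speed of light; real on-chip signals are slower still.

    C = 299_792_458  # speed of light, metres per second

    for ghz in (1, 3, 5):
        cycle_time = 1 / (ghz * 1e9)        # seconds per clock cycle
        distance_cm = C * cycle_time * 100  # centimetres per cycle
        print(f"{ghz} GHz: at most {distance_cm:.1f} cm per cycle")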

Moore's law is named for Gordon Moore, a co-founder of Intel, who wrote about it in "Cramming more components onto integrated circuits", Electronics Magazine, 19 April 1965:[1]

The complexity for minimum component costs has increased at a rate of roughly a factor of two per year ... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer.

Although named for him, Gordon Moore may not have originated Moore's law; he may instead have heard Douglas Engelbart, a co-inventor of the computer mouse, discuss the projected downscaling of integrated circuit size in a 1960 lecture.[2] Moore's observation was dubbed a 'law' by the Caltech professor and VLSI pioneer Carver Mead.[1]

In 1975, Moore revised the projection to a doubling only every two years. He is adamant that he never said "every 18 months," but that is how the law is often quoted. The SEMATECH roadmap follows a 24-month cycle. In April 2005, Intel offered $10,000 for a copy of the original issue of Electronics Magazine.[3]

It has become common practice to cite Moore's Law as a predictor of the rapid advance in computing power per unit cost across a variety of computer-related technologies, such as hard disk storage cost per unit of information, even though such advances may owe little to transistor technology. Moore's Law has become shorthand for saying "things very quickly gain capabilities, while cost falls."

Processor speed not always the limiting factor

An important new chapter for Moore's law began in 2003. Microprocessor clock rates went flat, and processor designers began to keep Moore's law alive by adding extra cores, bringing multicore microprocessors to personal computers.[4]
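
Exploiting those extra cores is the programmer's job. Below is a minimal sketch using Python's standard multiprocessing module, with an arbitrary CPU-bound task standing in for real work.

    from multiprocessing import Pool

    def busy_work(n):
        # Arbitrary CPU-bound stand-in: sum of squares below n.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        with Pool() as pool:  # one worker per available core by default
            results = pool.map(busy_work, [2_000_000] * 8)
        print(len(results), "tasks completed in parallel")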

Power consumption

The problem of absolute power consumption surfaced at about the same time. The power consumption of computers is now a non-trivial fraction of each modern nation's total electrical output (about 2% in 2005); servers alone accounted for roughly 1.2% of total US electrical output that year (Koomey, 2008, http://dl.klima2008.net/ccsl/koomey_long.pdf). The current limitations to Moore's law are therefore not purely matters of hardware engineering: the burden is on programmers to make use of multiple cores, and on society to pay for the extra electrical load. In response, metrics such as flops/watt are being used to help keep computation affordable.
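
As an illustration of the flops/watt metric, the sketch below compares two hypothetical systems; the performance and power figures are invented for the example.

    # Invented figures: compare two hypothetical systems on flops per watt.
    systems = {
        "system A": (1.0e12, 500.0),   # (flops, watts) -- assumed values
        "system B": (4.0e12, 1200.0),
    }

    for name, (flops, watts) in systems.items():
        print(f"{name}: {flops / watts:.2e} flops/watt")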

Processor faster; software slower

There is a joke about "Gates' Law", a sort of inverse of Moore's Law:

“The speed of software halves every 18 months.” This oft-cited law is an ironic comment on the tendency of software bloat to outpace the every-18-month doubling in hardware capacity per dollar predicted by Moore's Law. The reference is to Bill Gates; Microsoft is widely considered among the worst, if not the worst, of the perpetrators of bloat. — Jargon File

Memory

Some applications are memory-limited rather than processor-limited. Memory does depend on many of the same semiconductor technologies as processors, but a given computer may not be able to accept more physical memory. Desktop computers running Microsoft operating systems often gain more from added memory than from a faster replacement processor, although the comparison is muddied when a different processor is needed to accept more memory.

In the early 1990s, a crisis in Internet routing, worked around with Classless Inter-Domain Routing (CIDR), was a practical consequence of the Cisco AGS, then the most common Internet core router, holding a maximum of 16 megabytes of memory. Until CIDR techniques went into operational use, the number of routes was doubling every five months, threatening to overflow the routing tables of those routers.
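
The arithmetic of that growth is stark. A rough sketch, assuming for illustration 256 bytes of router memory per route and a starting table of 10,000 routes (both invented figures):

    LIMIT_BYTES = 16 * 1024 * 1024  # Cisco AGS memory ceiling
    BYTES_PER_ROUTE = 256           # assumed figure, for illustration only

    routes, months = 10_000, 0
    while routes * BYTES_PER_ROUTE < LIMIT_BYTES:
        routes *= 2   # doubling every five months
        months += 5
    print(f"ceiling reached after ~{months} months ({routes:,} routes)")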

Networks

Another set of laws deals with the growth of connectivity in computer networks. Sarnoff's Law describes the value of a one-to-many radio or television broadcast network, while Metcalfe's Law describes the value of any-to-any networks such as internets.
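
The difference is easy to state numerically: Sarnoff values a network in proportion to its audience size n, while Metcalfe values it by the number of possible pairwise connections, n(n-1)/2. A quick comparison:

    # Sarnoff: value ~ n; Metcalfe: value ~ number of distinct pairs.
    def sarnoff(n):
        return n

    def metcalfe(n):
        return n * (n - 1) // 2

    for n in (10, 100, 1000):
        print(f"n={n:5}: Sarnoff {sarnoff(n):6,}  Metcalfe {metcalfe(n):9,}")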

References

  1. Excerpts from A Conversation with Gordon Moore: Moore’s Law (PDF). Intel Corporation (2005). Retrieved on May 2, 2006.
  2. NY Times article, April 17, 2005.
  3. Michael Kanellos (2005-04-12). $10,000 reward for Moore's Law original. CNET News.com. Retrieved on June 24, 2006.
  4. Multi-Core