Computer
During World War II, the first computers (electronic machines that perform numerical calculations far faster than humans) were developed by the British and U.S. governments as a result of secret military projects[1][2]. These first computers did not remain secret for long; they were adopted by private industry, and they quickly grew in usefulness while decreasing in size and cost. Today, computers are ubiquitous household objects, often unrecognized in the form of a tiny microprocessor embedded in a gadget such as a phone or a TV remote. Even defining the word computer may spark a debate, because so many different kinds of computers exist, and they are used for so many different kinds of activities.
The history of computing is complex. The desire for computers existed long before technology was advanced enough to realize them. People had long sought mechanical devices to help with mathematical calculations, inventing the abacus[3], the slide rule[4], and a host of mechanical adding machines[5]. But the electronic computer's rapid evolution forever changed science, the military, and business. The computer has vastly expanded human ability to store and share information; as such, its invention may be a milestone for humanity on a par with the advent of writing and materials to write on (millennia ago)[6], or with the invention of the printing press (~1450). Not all of this may be regarded as positive, however; the explosive intrusion of the computer into all facets of life is sometimes referred to as the digital revolution.
The nature of computing
Some people define a computer as a machine for manipulating structured data according to instructions known as a program. However, this definition may only make sense to people who already know what a computer can do. Computers are extremely versatile. In fact, they are universal information-processing machines, but at the deepest level, what they really do is perform arithmetic. Computers and mathematics are closely related. The theory of computation is a branch of mathematics, and its evolution, pioneered by brilliant twentieth-century mathematicians such as Alan Turing (among many others), enabled the invention of electronic computers. And as usual in mathematics, their work built on that of earlier mathematicians, as described in the history of computing.
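To make this concrete, the theory of computation studies idealized machines such as the Turing machine, and a few lines of modern code can simulate one. The sketch below is purely illustrative: the rule table, state names, and helper function are invented for this example and do not correspond to any particular historical machine.

```python
# A minimal sketch of a Turing machine, the abstract model of computation
# studied by Alan Turing. The machine shown here (which flips every bit of
# its input) is a made-up example, not any specific historical design.

def run_turing_machine(tape, rules, state="start", head=0, accept="halt"):
    """Run a one-tape Turing machine until it reaches the halting state."""
    tape = dict(enumerate(tape))          # sparse tape: position -> symbol
    while state != accept:
        symbol = tape.get(head, "_")      # "_" is the blank symbol
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Rules: in state "start", flip the bit and move right; halt on a blank cell.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("1011", rules))   # prints 0100_
```

Despite its simplicity, this model captures the essence of what any digital computer does: read symbols, apply fixed rules, and write symbols.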
Today, most computers do arithmetic using the binary number system, because a binary number can be represented by an array of on-off switches, with each 0 or 1 digit, or bit, stored in one switch. In early electronic computers, the switches used for each digit were electromagnetic switches, also called relays. Later, vacuum tubes replaced relays, and eventually transistors replaced both relays and tubes. Transistors can now be manufactured as microscopic devices embedded by the millions within silicon chips. These tiny transistorized computers work on the same principles as the first, giant relay- and vacuum-tube-based computers (which occupied entire buildings). More information on how electronic computers work is available in computer architecture.
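The following sketch, written in Python purely for illustration, shows the idea: an integer becomes a row of 0/1 "switch settings", and addition proceeds bit by bit with a carry, much as a hardware adder does. The function names and the eight-bit width are arbitrary choices for this example.

```python
# A minimal sketch of how a binary computer stores and adds numbers. Each bit
# plays the role of one on/off switch; addition is carried out bit by bit with
# a carry, in the style of a ripple-carry adder.

def to_bits(n, width=8):
    """Represent a non-negative integer as a list of 0/1 'switch settings'."""
    return [(n >> i) & 1 for i in reversed(range(width))]

def add_bits(a, b):
    """Add two bit lists of equal width, carrying into the next position."""
    result, carry = [], 0
    for x, y in zip(reversed(a), reversed(b)):
        total = x + y + carry
        result.append(total % 2)          # bit that stays in this position
        carry = total // 2                # bit carried to the next position
    return list(reversed(result))

a, b = to_bits(23), to_bits(42)
print(a)                                  # [0, 0, 0, 1, 0, 1, 1, 1]
print(add_bits(a, b))                     # bits of 65: [0, 1, 0, 0, 0, 0, 0, 1]
```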
Initially, mathematicians and scientists were the only users of computers. But today, what we tend to think of as a computer consists not only of the underlying hardware, with its limited instruction set that performs arithmetic, but also an operating system, a set of programs that allows people to use the computer more easily. The operating system is software (programs running on a computer). Without an operating system, a computer is not useful; the operating system helps people to write new programs for the computer and to perform many other activities on it.
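As a small illustration of what relying on the operating system means in practice, the Python sketch below asks the operating system to create a file, list a directory, and report which OS family it belongs to; the file name is hypothetical and chosen only for this example.

```python
# A minimal sketch of a program using operating system services. Creating a
# file, listing a directory, and writing to the screen are all requests that
# the operating system carries out on the program's behalf.
import os

with open("example.txt", "w") as f:       # ask the OS to create and open a file
    f.write("stored via an operating system call\n")

print(os.listdir("."))                    # ask the OS for the directory listing
print(os.name)                            # which OS family is answering requests
```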
Academia and professional societies
Since the early 1980s, universities have offered degrees in academic disciplines such as computer science and computer engineering, devoted to the design of hardware and software for computers. These general fields of study soon came to consist of many sub-fields. In addition, most academic disciplines, and most businesses, use computers as tools.
Below are some of the professional and academic disciplines that teach the techniques to construct, program, and use computers. There is often overlap of functions and terminology across these categories:
- artificial intelligence or machine learning (two related sub-fields devoted to software that solves problems normally requiring human intelligence)
- computer architecture (the study of how computers work, and how specific computers can be built)
- programming languages (formal notations in which people write computer programs)
- compilers (programs that translate code written in a programming language into instructions a computer can execute)
- computer engineering (a branch of electrical engineering that focuses on both hardware and operating system design)
- computer science (the academic study of computers and computation, including aspects of both theory and implementation)
- software engineering (management of the process of creating complex software systems)
- information systems or information technology (study of computer systems, usually in a business or organizational context)
- geographical information systems (combining latitude and longitude information with computer mapping programs)
- machine translation (software for translating one natural language into another)
Professional societies dedicated to computers include the British Computer Society, the Association for Computing Machinery (ACM) and the IEEE Computer Society.
The economics of the computer industry
Since the 1950s, a vigorous cycle of business activity has arisen from the development of computers, including many corporations engaged in creating computer hardware, operating systems, or other software. The business climate has evolved rapidly along with the technology: some companies were born and met their demise in rapid succession, while others survived for decades, though usually by repeatedly changing their focus in response to industry growth.
The ability of many different companies to make computer parts, hardware or software, comes from industry-wide adoption of standards. Various consortia and United States or international standards organizations serve as arbitrators of computing standards, including ANSI, W3C, ECMA and ISO. In addition to formal standards, many informal standards have arisen because consumers effectively "vote" by purchasing certain products. Some of the first written standards arose from the Internet Engineering Task Force (IETF)[7], which grew out of the U.S. Defense Advanced Research Projects Agency (DARPA) networking initiative of the late 1960s that eventually led to the development of the Internet. The open nature of the IETF, in which any person could submit a proposal (called a Request for Comments, or RFC), was remarkable, and the IETF proved to be about as effective as formally endorsed standards bodies at creating usable and widely adopted standards. The non-proprietary nature of the RFC process also foreshadowed the later development, in the 1980s, of the open source software movement. Some standards also resulted from a deliberate sharing of specifications by industry participants, notably the open specifications leading to the industry-wide IBM-compatible PC beginning in the early 1980s.
The quick pace of growth in computer engineering was codified into a widely quoted rule of thumb called Moore's law[8], first publicized by Gordon Moore (a co-founder and for many years CEO of Intel). For decades after the invention of the computer, this economic boom centered in the United States, but beginning in the 1990s it also spread rapidly overseas, especially into Europe, China and India. Computers are now a worldwide phenomenon.
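As a rough illustration of what such a rule of thumb implies, the sketch below projects a transistor count forward assuming a doubling roughly every two years; the starting count and the exact doubling period are assumptions made for this example (Moore's own statements of the period varied over the years).

```python
# A minimal sketch of Moore's law treated as steady exponential growth.
# The two-year doubling period and the one-million-transistor starting
# point are illustrative assumptions, not figures from any specific chip.

def projected_transistors(start_count, years, doubling_period=2.0):
    """Project a transistor count forward under steady exponential growth."""
    return start_count * 2 ** (years / doubling_period)

for years in (2, 10, 20):
    print(years, "years:", f"{projected_transistors(1_000_000, years):,.0f}")
# 2 years:  2,000,000
# 10 years: 32,000,000
# 20 years: 1,024,000,000
```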
References
- ↑ Colossus: The World’s First Electronic Computer. Pico Technology. Retrieved on 2007-04-24.
- ↑ The ENIAC Museum Online. University of Pennsylvania School of Engineering and Applied Science (SEAS). Retrieved on 2007-04-23.
- ↑ Origin and Development of the Chinese Abacus. Journal of the ACM, Volume 6, Issue 1 (January 1959), pp. 102–110. Retrieved on 2007-04-24.
- ↑ Slide Rule History. The Oughtred Society (2006). Retrieved on 2007-04-24.
- ↑ Adding Machines. The Museum of HP Calculators, David G. Hicks (1995–2005). Retrieved on 2007-04-24.
- ↑ The Invention of Paper. Wisconsin Paper Council (2004). Retrieved on 2007-04-24.
- ↑ IETF: History, Background, and Role in Today's Internet. Gary C. Kessler (1996). Retrieved on 2007-04-23.
- ↑ Moore's Law. Intel Corporation. Retrieved on 2007-04-23.