What is the history of the computer?


First of all, we should define what a computer is before investigating its history: a computer is understood as a machine capable of manipulating data according to a predetermined list of instructions; the ability to run such lists of instructions, or "programs", is what distinguishes computers from calculators.

It is difficult to establish the early history of the computer, as the very definition of "computer" has evolved, and devices that were once called computers would not be classified as such today. In the opinion of many, the first computer by today's standards, that is, the first electronic and digital one, was created by John Atanasoff between 1933 and 1942. For full details of this invention, you can read our article "Who Invented the Computer?".

The first models, as just mentioned, date from the 1940s, although the concept of the computer is much older. Before the digital era, computing meant performing calculations with the aid of mechanical devices such as the abacus, the astrolabe, and the slide rule, and the concept of mechanically assisted calculation can be traced in history to around 150 BC, so it is by no means new to our era. If we narrow our history to devices that can be "programmed", we find that one of the earliest precursor mechanisms dates from around 1800. In 1801, Joseph Marie Jacquard, an inventor and weaver, achieved a breakthrough by creating the "Jacquard loom", which used punched cards to produce diverse and intricate patterns on the loom; punched cards are a primitive form of programmability.

In 1837, Charles Babbage designed a programmable mechanical computer called the "Analytical Engine", but due to financial and design problems it was never built. Even had it been, for many it would not have been the first computer by today's standards, because it was not electronic.

During the 1890 census in the United States, large-scale processing of information was performed for the first time, using punched cards in conjunction with tabulating machines. The company behind these machines later merged into the Computing-Tabulating-Recording Company, which in turn became the well-known IBM.

It was between 1930 and 1950 that the history of the computer as such really began, and the true precursors of current models, with features recognizable by today's standards, were created. In 1941 Konrad Zuse built an electromechanical machine called the "Z3", the first machine capable of processing binary arithmetic, and for some the first computer to satisfy the "Turing complete" criterion studied in computer science. As mentioned, the Atanasoff-Berry Computer, which used vacuum tubes, also emerged in 1941. In 1944 the British built the "Colossus" computer to decode German messages.

In 1946 the US Ballistic Research Laboratory unveiled the ENIAC (Electronic Numerical Integrator And Computer), the first large-scale programmable digital computer in history, capable of solving a wide range of computing problems. It was built to perform artillery calculations. While it already shared features with modern computers, it was inflexible and required physical restructuring to change its programming. Some of its developers created a more flexible design, called the "von Neumann architecture" in honor of the great mathematician who first proposed it in 1945. The ability to store programs, which this architecture introduced, is the feature these first prototypes share with modern machines.

During the 1950s, computers were built on these principles using vacuum tubes, and they therefore required large physical spaces to function (the size of several rooms). Continuing the story, the replacement of these tubes by the transistor in the 1960s allowed for smaller, faster, and more affordable computers, leading computing into the commercial sphere.

In the 1970s, the advent of integrated circuits and then microprocessors (the first commercial microprocessor was the Intel 4004) caused a revolution in size, speed, and cost. By the 1980s, computers were already integrated into appliances and other devices in common use, while the concept of the "personal" or home computer had expanded greatly. With the massification of the Internet in the 1990s, personal computers became goods as common as the television and the telephone.

The history and generations of computers


Another way to organize and understand the history of the computer is to divide it into "generations", grouping machines according to the outstanding technical characteristics of each period. The first generation, between 1951 and 1958, was characterized by the use of vacuum tubes, which made these models very large and also generated a great deal of heat when running.

The second generation (1959-1964) was characterized by the use of the transistor, which made computers faster and smaller. Instead of punched cards and rotating drums, computers of this generation used magnetic cores to store information. Programming languages such as COBOL were also introduced.

The third generation of computers (1964-1971) was born with the invention of integrated circuits on silicon wafers, which allowed thousands of miniature electronic components to be grouped together. The machines of this era became even smaller, faster, and cheaper. Magnetic tapes also came into use for storing information.

In the history of the computer, the fourth generation (1971-1981) is marked by two technological advances: the use of chips to store information, and the ability to integrate an even greater number of electronic components on those same chips. The first microcomputers appeared, including IBM's, along with well-known processors starting with the 8088 and continuing through the popular Pentium (I, II, III, ...) and Celeron.

The fifth generation (1982-1989) stands out from the earlier ones for Seymour Cray's creation, in 1982, of the first computer capable of parallel processing. During this period, efforts to develop artificial intelligence were also formalized, driven mainly by the Japanese government's "Fifth Generation" project.

The sixth generation (1990-present) is characterized by the growing ability of computers to run hundreds of microprocessors simultaneously, the massive development and deployment of networks, and the integration of the Internet.

As we can see, dividing the history of the computer into generations is quite arbitrary, especially for the fifth and sixth generations, where the boundaries are unclear and seem to blur; even so, it remains a useful device for understanding the historical context of the exciting world of computing.
Published for educational purposes
Culture and Science
