Saturday, 3 March 2012

History of Computer (Part II)

Part I of this history of the computer described the earlier stages of the machine. The era of the modern computer began around 1940, when electronic circuits replaced mechanical and electromechanical devices.

The US-built ENIAC (Electronic Numerical Integrator and Computer), completed in 1946, is regarded as the first general-purpose electronic programmable computer. It was commissioned during World War II to help calculate ballistic trajectories, but was not completed until a year after the war had ended. Installed at the University of Pennsylvania, it filled 40 separate eight-foot-high racks, contained nearly 18,000 vacuum tubes (nine times more than Colossus), was around 24 m (80 ft) long, and weighed almost 30 tons. ENIAC was built under the direction of John P. Eckert and John W. Mauchly.

The transistor, invented in 1947, gradually replaced the vacuum tube and gave rise to the "second generation" of computers. John Bardeen, Walter Brattain, and William Shockley won the 1956 Nobel Prize in Physics for this discovery. In 1951 the first computer for commercial use, the Universal Automatic Computer (UNIVAC I), was introduced to the public. Many new programming languages were also invented during this period.

The next great advance in computing came with the integrated circuit. Although transistors were a great improvement over vacuum tubes, a problem remained: like other electronic components, they had to be soldered together, and machines that used thousands of transistors still had to be hand-wired to connect all of these components. That process was laborious, costly, and error-prone. In 1958, Jack Kilby at Texas Instruments demonstrated the first working integrated circuit, or chip. A chip is a collection of tiny transistors that are connected together when the chip is manufactured. This practically eliminated the need to solder together large numbers of transistors; now connections were needed only to other electronic components.

This discovery opened the door to the invention of the microprocessor, and the world saw the third generation of computers. With this invention, computers became smaller, more powerful, and more reliable, and they were able to run many different programs at the same time. Before microprocessors were invented, computers needed a separate integrated-circuit chip for each of their functions (one reason the machines were still so large). A microprocessor was the size of a thumbnail, and it could do things the individual integrated-circuit chips could not: run the computer's programs, remember information, and manage data all by itself. The first microprocessor on the market, Intel's 4004, was developed in 1971 by an engineer at Intel named Ted Hoff. This 1/16-by-1/8-inch chip had the same computing power as the massive ENIAC.


A veritable explosion of personal computers occurred in the mid-1970s. The first RISC architecture project was begun by John Cocke in 1975 at IBM's Thomas J. Watson Research Center; similar projects started at Berkeley and Stanford around the same time. A company called Micro Instrumentation and Telemetry Systems (MITS) introduced a computer kit called the Altair.
Thousands of people bought the $400 kit. In 1975, MITS hired a pair of Harvard students, Paul G. Allen and Bill Gates, to adapt the BASIC programming language for the Altair. The software made the computer easier to use, and it was a hit. Although the Altair spawned an entire business, two engineers in the Homebrew Computer Club in Silicon Valley, Steve Jobs and Stephen Wozniak, built a homemade computer that would likewise change the world: the Apple I.


The rest is history. Scientific invention and development have taken this device to new heights, and each day it advances and changes the world. Today, laptops, smartphones, and tablet computers allow us to have a PC with us wherever we go.
