Friday, 30 March 2012

Numeral system

A numeral system is a collection of notations or symbols used to represent numbers. Just as the letters of a language represent our speech graphically, numerals represent numbers graphically. A numeral, then, is a symbol or name that stands for a number. For example, 6, 70 and fourteen are all numerals.
 
Numeral systems are sometimes called number systems, but the two terms are not the same. A number is an abstract concept, while a numeral is a way of expressing a number. The number two can be written with different numerals such as 2, II, or 10 (in the binary system). What we write is a numeral, but most often we call it a number, which is why the confusion between numeral and number persists.
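
A quick Python sketch of the distinction (purely illustrative, not part of the original post): the binary numeral "10" and the decimal numeral "2" are different strings, yet they name the same number.

# two different numerals...
binary_numeral  = "10"
decimal_numeral = "2"
# ...naming the same abstract number
print(int(binary_numeral, 2) == int(decimal_numeral, 10))  # True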

There are different kinds of numeral systems. They can be classified as non-positional or positional, and positional systems are further distinguished by their base. Roman numerals, for example, are non-positional, while Indian (Hindu-Arabic) numerals, binary and decimal are positional. The binary system has base 2 and the decimal system has base 10.

In the real world, the decimal numeral system is the most widely used; it is the system behind everyday calculations at home, in the office, and in business. It has base 10 (ten) and uses the ten symbols 0, 1, 2, 3, 4, 5, 6, 7, 8 and 9 to represent any number, no matter how large or how small.
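
To make the idea of a base-10 positional numeral concrete, here is a tiny Python sketch (the numeral 407 is just an arbitrary example): each digit contributes its value times a power of ten.

# the decimal numeral "407" read positionally: 4 hundreds, 0 tens, 7 ones
value = 4 * 10**2 + 0 * 10**1 + 7 * 10**0
print(value)  # 407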

The binary numeral system has base 2 (two). It uses only 0 and 1 to represent any number, and it is the system used inside digital devices such as computers and other electronic equipment.

Below are examples of decimal numerals and their binary equivalents:
Decimal 0 -> Binary  0
Decimal 1 -> Binary  1
Decimal 2 -> Binary  10
Decimal 3 -> Binary  11
Decimal 4 -> Binary  100
Decimal 7 -> Binary  111
Decimal 20 -> Binary  10100
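
As a rough sketch of how such binary numerals can be produced (to_binary is just an illustrative helper; Python's built-in bin() does the same job), repeated division by 2 collects the bits from least to most significant:

def to_binary(n):
    """Return the binary numeral of a non-negative integer, via repeated division by 2."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder is the next bit (least significant first)
        n //= 2
    return "".join(reversed(bits))

for n in (0, 1, 2, 3, 4, 7, 20):
    print(n, "->", to_binary(n))  # matches the table above; bin(n) agrees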


The hexadecimal numeral system has base 16. It uses sixteen distinct symbols: 0-9 to represent the values zero to nine, and A, B, C, D, E, F to represent the values ten to fifteen. Hexadecimal numerals are widely used in computer system design.

Below are examples of decimal numerals and their hexadecimal equivalents:
Decimal 0  -> Hexadecimal  0
Decimal 1  -> Hexadecimal  1
Decimal 30 -> Hexadecimal  1E
Decimal 41 -> Hexadecimal  29
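
The same repeated-division idea works for hexadecimal (to_hex is an illustrative helper; Python's built-in hex() gives the same result), except the remainders now pick one of the sixteen symbols:

HEX_DIGITS = "0123456789ABCDEF"

def to_hex(n):
    """Return the hexadecimal numeral of a non-negative integer, via repeated division by 16."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(HEX_DIGITS[n % 16])  # remainder selects a symbol 0-F
        n //= 16
    return "".join(reversed(digits))

for n in (0, 1, 30, 41):
    print(n, "->", to_hex(n))  # matches the table above; hex(n) agrees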

Saturday, 3 March 2012

History of Computer (part-II)

Part-I of the history of computer described the earlier stages of the computer. The era of the modern computer started around 1940, when the development of electronic circuits replaced mechanical and electromechanical devices.

The US-built ENIAC (Electronic Numerical Integrator and Computer), completed in 1946, was the first electronic programmable computer built in the US and is regarded as the first general-purpose electronic computer. It was initially commissioned for use in World War II, but was not completed until a year after the war had ended. Installed at the University of Pennsylvania, its 40 separate eight-foot-high racks were intended to help calculate ballistic trajectories. It contained nearly 18,000 vacuum tubes (nine times more than Colossus), was around 24 m (80 ft) long, and weighed almost 30 tons. ENIAC was built under the direction of John P. Eckert and John W. Mauchly.

The transistor, invented in 1947, slowly replaced the vacuum tube and gave rise to the "second generation" of computers. John Bardeen, Walter Brattain, and William Shockley won the 1956 Nobel Prize in Physics for this discovery. In 1951 the first computer for commercial use, the Universal Automatic Computer (UNIVAC 1), was introduced to the public. Many new programming languages were also invented during this period.

The next great advancement in computing came with the advent of the integrated circuit. Although transistors were a great advance on vacuum tubes, a problem remained: like other electronic components, they had to be soldered together, and machines that used thousands of transistors still had to be hand-wired to connect all of those components. That process was laborious, costly, and error prone. In 1958, Jack Kilby at Texas Instruments successfully demonstrated the idea and manufactured the first integrated circuit, or chip. A chip is really a collection of tiny transistors that are connected together when the chip is manufactured. Thus the need to solder together large numbers of transistors was practically eliminated; connections were now needed only to other electronic components.

This discovery opened the door to the invention of the microprocessor, and the world came to see the third generation of computers. With this invention computers became smaller, more powerful and more reliable, and they were able to run many different programs at the same time. Before microprocessors were invented, computers needed a separate integrated-circuit chip for each of their functions (one reason the machines were still so large). Microprocessors were the size of a thumbnail, and they could do things the earlier integrated-circuit chips could not: they could run the computer's programs, remember information and manage data all by themselves. The first microprocessor on the market, Intel's 4004, was developed in 1971 by an engineer at Intel named Ted Hoff. The 1/16-by-1/8-inch chip had the same computing power as the massive ENIAC.


A veritable explosion of personal computers occurred in the mid-1970s. The first RISC architecture project was begun by John Cocke in 1975 at IBM's Thomas J. Watson Laboratories, and similar projects started at Berkeley and Stanford around the same time. A company called Micro Instrumentation and Telemetry Systems (MITS) introduced a computer kit called the Altair, and thousands of people bought the $400 kit. In 1975, MITS hired Paul G. Allen and Bill Gates to adapt the BASIC programming language for the Altair. The software made the computer easier to use, and it was a hit. The Altair spawned an entire industry, and two engineers in Silicon Valley's Homebrew Computer Club, Steve Jobs and Stephen Wozniak, built a homemade computer that would likewise change the world: the Apple I.


The rest is history. Scientific invention and development have taken this device to new heights, and every day it keeps advancing and changing the world. Today, laptops, smartphones and tablet computers allow us to carry a PC with us wherever we go.

Thursday, 1 March 2012

History of Computer (part-I)

The computer has a long history that began thousands of years ago. People have been using mechanical devices to aid calculation for millennia; the abacus, for example, probably existed in Babylonia around 3000 B.C.E. and was initially used for arithmetic tasks. In 1617 a Scotsman named John Napier invented logarithms, a technique that allows multiplication to be performed via addition. In 1641 the French mathematician and philosopher Blaise Pascal built a mechanical adding machine, considered the first mechanical calculator, which he named the Pascaline. Gottfried Wilhelm Leibniz also came up with a machine for doing calculations, and it could do much more than the Pascaline. Leibniz made another important contribution to computing: he invented binary code, a way of representing any decimal number using only the two digits zero and one. In France in 1801, Joseph Marie Jacquard invented a power loom that used wooden-slat "punch cards" to make patterns on the loom.

Neither the abacus nor the mechanical calculators constructed by Pascal and Leibniz really qualified as computers. In the 19th century, the English mathematician and professor Charles Babbage designed the Analytical Engine, which provided the basic framework of the computers of today. The device would have been as large as a house and powered by six steam engines. The Analytical Engine had expandable memory, an arithmetic unit, and logic-processing capabilities able to interpret a programming language with loops and conditional branching. Babbage is also considered the 'Father of the Computer'. Babbage was fortunate in receiving help from Augusta Ada Byron, Countess of Lovelace, daughter of the poet Lord Byron. An enthusiastic mathematician, she helped to refine Babbage's ideas for making his machine programmable -- and this is why she is still, sometimes, referred to as the world's first computer programmer.

American statistician Herman Hollerith built one of the world's first practical calculating machines, which he called a tabulator, to help compile census data in 1890. In 1936, Alan Turing wrote a mathematical paper called 'On Computable Numbers' in which he presented the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. He proved that such a machine would be capable of performing any conceivable mathematical computation, and the central concept of the modern computer was based on his ideas.
