The History Of The Computer

The computer as we know it has gone through many changes, not only recently but throughout its history. While digital computers have their origins in the 1940s, the concept of the computer goes back much farther than that.

Before There Were Digital Computers

The oldest computational machine we know of is the abacus, dated to circa 2400 BC. The abacus was used by merchants in the Fertile Crescent (modern-day Iraq) to help with the addition and subtraction of numbers, and it is considered a computational device because it was used both to compute numbers and to store data.
The concept of a computer as we understand it today, however, came from the man many now consider the father of the computer: Charles Babbage of Britain. In the 1820s, he worked on a machine he called the Difference Engine, designed to mechanically calculate polynomial equations. Several years later, he began designing a far more ambitious machine called the Analytical Engine, which would have been digital and programmable and would have featured integrated memory, loops, and conditional branching, all fundamental to the computers we use today. Sadly, neither machine was actually built until long after his death, and long after more advanced, fully functional computers already existed.

The Early Computer

The 1940s saw many computers that evolved conceptually and had some, but not all, of the components of a modern computer. The notable advancement in these machines was that they stopped being purely mechanical and used electrical components to function. Experimental computers built in the UK, Germany, and the US over the course of the decade used the binary numeral system and relied on technologies such as punched paper tape, electromechanical relays, and the vacuum tube to store data and carry out calculations.

The First Proper Computers

The fifties saw the rise of stored-program computers, or first-generation computers, which used electronic memory to store programs; the invention of firmware; and IBM's release of the first hard disk drive (HDD), which held 5 megabytes and cost roughly $10,000 per megabyte. Second-generation computers saw transistors replace vacuum tubes, as well as the use of telephone lines to establish the very first computer-to-computer connections, the earliest ancestors of today's internet.

The Modern Era

After 1960, the capabilities of computers and computer hardware grew in leaps and bounds, exponentially even, giving rise to Moore's Law, the observation that the number of transistors on a chip doubles roughly every two years. Computers adopted integrated circuits, which in turn gave rise to the modern microprocessor. Other innovations followed, such as electronic display output, removable storage (floppy disks), and sound capabilities. Multi-core CPUs hit the market in the 21st century, while many of the innovations listed above have continued to grow in capacity and performance.
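To make the doubling concrete, here is a minimal sketch of the arithmetic behind Moore's Law, assuming a strict two-year doubling and using Intel's 4004 microprocessor (1971, roughly 2,300 transistors) as an illustrative baseline:

```python
# Illustrative sketch of Moore's Law: transistor counts doubling every two years.
# Baseline assumption: Intel's 4004 (1971) with roughly 2,300 transistors.
def transistors(year, base_year=1971, base_count=2300):
    """Projected transistor count under a strict two-year doubling."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

# Ten years on (1981) means five doublings: 2,300 * 32 = 73,600.
print(round(transistors(1981)))
```

Real chips never tracked the curve this exactly, but the sketch shows why even a modest-sounding "doubling every two years" produces the explosive growth described above.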

Author Bio:
Sem Burke currently works at Vernon Technology Solutions and has a background in hardware development. He finds the technological progress we have made over the past 50 years in the IT field fascinating and is always looking to find ways to satisfy his craving for more knowledge.
