Until late last night, I had forgotten that the central processing unit, or CPU, had turned 40. However, a rare phone call from my lecturer at the Royal Melbourne Institute of Technology refreshed my memory of Intel Corp's milestone. The CPU came to the fore in 1971, yet very few computer users have taken a keen interest in the fact that it is now four decades old. There are various types of CPUs, and for your information they are the brains behind the likes of traffic lights, calculators, computers, mobile phones and the Internet, to mention but a few. Although mainframe computers existed before the CPU, they were made up of a mass of wires and vacuum tubes, all of which meant that computers took up entire rooms, sucked up tremendous amounts of power and cost an astronomical amount of money to run.
However, all that changed 40 years ago, when Intel Corp (short for Integrated Electronics), then a producer of semiconductor memory chips, was contracted by a company called Busicom to build an integrated circuit for a range of calculators. Instead of making a simple integrated circuit hard-coded for performing calculator functions, Intel's engineers created the first complete CPU on one chip. It was a multipurpose, programmable CPU that could perform a variety of functions depending on the instructions it was given.
Thus, instead of being hard-coded with a limited number of specific functions, the processor took its instructions from a separate read-only memory (ROM) chip that told the CPU what to do. This meant that instead of having to build a whole new chip to add new functions to the calculator, the company could simply load a new set of instructions into the ROM chip to tell the CPU to perform additional functions. According to technology history books, the first Intel CPU was called the 4004. It was a 4-bit processor, which meant it handled data in four-bit chunks, giving it the ability to represent 16 different values.
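The arithmetic behind that figure is simple: n bits can encode 2^n distinct values, so four bits give 2^4 = 16. A quick Python sketch, purely for illustration, makes the point:

```python
# Four bits can take 2**4 = 16 distinct values, from 0000 to 1111,
# which is where the 4004's 16 different values come from.
BITS = 4
print(2 ** BITS)  # 16
print([format(v, "04b") for v in range(2 ** BITS)])  # '0000' ... '1111'
```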
The 4004 had a then-whopping 2,300 transistors, followed by some 3,500 transistors in its successor, the 8008, the first 8-bit microprocessor. Intel's latest second-generation Core processor has about 1.48 billion transistors, giving more than 350,000 times the performance of the original processor. The 8080, an improved version of the 8008, was the microprocessor that took the CPU from powering calculators to powering the first consumer microcomputer, the MITS Altair 8800. Interestingly, the high cost of the 8080 was what prompted Steve Wozniak and Steve Jobs to opt for the MOS Technology 6502 microprocessor to power the first Apple computer, as the 6502 could be had for as little as US$25.
By the 1990s, a CPU's performance was popularly expressed by its clock speed, measured in megahertz (MHz) and later gigahertz (GHz). However, while it is true that the clock speed of a microprocessor does contribute to performance, many other factors besides clock speed determine how well a CPU really performs. AMD, Intel's long-time rival, developed its range of Athlon microprocessors with a model-naming convention that expressed a notional clock rate the company considered equivalent to that of higher-clocked Intel Pentium 4 microprocessors; the Athlon XP 1800+, for example, actually ran at 1.53GHz.
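A rough way to see why clock speed alone is misleading is the classic approximation that throughput is instructions per cycle (IPC) multiplied by clock frequency. The Python sketch below uses made-up IPC figures purely to illustrate the arithmetic:

```python
# Toy model: rough throughput = instructions per cycle (IPC) x clock.
# The IPC values below are invented for illustration only.
def throughput_mips(ipc, clock_ghz):
    """Millions of instructions per second under a crude IPC model."""
    return ipc * clock_ghz * 1000

# A lower-clocked chip doing more work per cycle can outrun a
# higher-clocked chip that does less per cycle.
print(throughput_mips(ipc=2.0, clock_ghz=1.53))  # 3060.0 MIPS
print(throughput_mips(ipc=1.0, clock_ghz=2.00))  # 2000.0 MIPS
```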
In the following decade, microprocessor designers were said to have reached a limit as to how far a microprocessor could be pushed in clock speed while keeping power consumption and heat at a manageable level. The fixation on clock rates was branded the "megahertz myth" by the media, as many companies, including Intel, were locked in a battle to produce microprocessors with higher and higher clock rates. Once the megahertz wars were over, microprocessor designers turned to a new idea: instead of having a single core do more and more work at ever-higher speeds, a chip would carry several cores that share the work between them.
We are now in the midst of that multicore revolution, which started with just two cores in Intel's Core Duo chip, is moving to four cores, and will soon reach eight. Intel has already shown in its labs that a microprocessor with up to 50 cores is possible, which suggests there may be no practical limit to how many cores a chip can hold. My prediction is that we are going to get more cores in our microprocessors, and that power consumption will go down with each subsequent generation, helped by advances in miniaturisation and power efficiency.
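To give a feel for what those extra cores mean for software, here is a minimal Python sketch, illustrating the general idea rather than any particular chip, that splits a job across however many cores the operating system reports:

```python
# Split one job into chunks and run them in separate processes,
# so the work can proceed on several cores at once.
import os
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(bounds):
    lo, hi = bounds
    return sum(n * n for n in range(lo, hi))

if __name__ == "__main__":
    cores = os.cpu_count()  # number of cores the OS reports
    step = 1_000_000
    chunks = [(i * step, (i + 1) * step) for i in range(cores)]

    # One chunk per core; each runs in its own worker process.
    with ProcessPoolExecutor(max_workers=cores) as pool:
        total = sum(pool.map(sum_of_squares, chunks))

    print(f"{cores} cores, total = {total}")
```

On a quad-core machine the four chunks genuinely run side by side instead of queuing on a single core, which is exactly the trade designers made when the megahertz wars ended.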