Evolution of the Microprocessor
A microprocessor is a programmable IC that executes instructions to process digital data or exercise digital control over devices. It primarily serves as the central processing unit of a computer system. The complexity of today's microprocessors puts even a modest description of how they work beyond the scope of this page.
The world's first microprocessor, the 4004, was co-developed by Busicom, a Japanese manufacturer of calculators, and Intel, a U.S. manufacturer of semiconductors. The project began as a general-purpose LSI for desktop calculators and business machines, originally based on a decimal computer with stored-program methods. The basic architecture of the 4004 was developed in August 1969; a plan for the 4004 system was finalized in December 1969; and the first microprocessor was successfully developed in March 1971. The microprocessor, the "technology to open up a new era," brought two outstanding impacts: the "power of intelligence" and the "power of computing."
The first microprocessors opened up a new era of programming by replacing the hardwired logic of the former era with software. At the same time, microprocessors allowed young engineers to use the "power of computing" for the creative development of personal computers and computer games, which in turn led to the growth of the software industry and paved the way for high-performance microprocessors. An engineer must also be armed with the firm belief that his mission is nothing but development, and must be determined to go his own way, never following others' tracks. The 4004's performance was only 0.06 MIPS, with about 2,300 transistors and a 750 kHz operating frequency.
Microprocessors evolved from 4-bit to 64-bit designs, introducing computer technologies such as pipelining, superpipelining, superscalar execution, VLIW, cache memory, and virtual memory. It is now possible to integrate 16 microprocessors with 64 GB of memory on a single board.
In the 20th century, microprocessors were used to increase the "power of intelligence."
In the 21st century, microprocessors will evolve into a "tool to bring forth wisdom" for all mankind.
The Breakthrough In Microprocessors
The switching units in the computers of the early 1940s were mechanical relays, devices that opened and closed as they did the calculations. In the following decade, vacuum tubes took over; the Atanasoff-Berry Computer used vacuum tubes as its switching units rather than relays. The switch from mechanical relays to vacuum tubes was an important technological advance, as vacuum tubes could perform calculations considerably faster and more efficiently than relay machines. This advance was short-lived, however, because the tubes could not be made much smaller and generated so much heat that they could not be packed closely together.
Then came the transistor, acknowledged as a revolutionary development. In "Fire in the Valley", the authors describe the transistor as a device that was the result of a series of developments in applications of physics. The transistor changed the computer from a giant electronic brain into a commodity like a TV set. This technological breakthrough made possible the minicomputers of the 1960s and the personal computer revolution of the 1970s.
However, researchers did not stop at the transistor. They wanted a device that could perform more complex tasks, one that could integrate a number of transistors into a more complex circuit. Hence the term integrated circuit, or IC. Because ICs were physically tiny chips of silicon, they also came to be referred to as chips. Initially, the demand for ICs came from the military and aerospace industries, which were great users of computers and the only industries that could afford them.
Later, an engineer at Intel developed a sophisticated chip that could extract data from its memory and interpret the data as an instruction. The term that evolved to describe such a device was "microprocessor".
The term "microprocessor" first came into use at Intel in 1972. A microprocessor was nothing more than an extension of the arithmetic and logic IC chips, incorporating more functions into one chip. Today, the term still refers to an LSI single-chip processor capable of carrying out many of the basic operations of a digital computer.
Development Of Microprocessors
Microprocessors essentially evolved from mechanical relays to ICs. Which aspects of the computing industry led to their development?
(1) Digital Computer Technology
The computer industry learned to make large, complex digital computers capable of processing more data, and also how to build and use smaller, less expensive computers. Digital computer technology had been growing steadily since the late 1940s.
(2) Semiconductor Technology
This technology had also been growing steadily since the invention of the transistor in the late 1940s. In the 1960s, the integrated circuit developed from just a few transistors to many complex functions on the same chip.
(3) The Calculator Industry
It appears as if this industry grew overnight during the 1970s, from the simplest four-function calculators to very complex programmable scientific and business calculators.
Generation Of Microprocessor
Microprocessors can be categorized into five generations. Their characteristics are described below:
A. First Generation
The microprocessors introduced from 1971 to 1972 are referred to as first-generation systems. They processed their instructions serially: they fetched the instruction, decoded it, then executed it. When an instruction was completed, the microprocessor updated the instruction pointer and fetched the next instruction, performing this sequential drill for each instruction in turn.
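The serial fetch-decode-execute drill can be sketched as a toy interpreter. The instruction names and encoding below are invented for illustration; they do not belong to any real first-generation chip.

```python
# Toy sequential fetch-decode-execute loop, in the style of a
# first-generation microprocessor: each instruction fully completes
# before the next one begins.

def run(program):
    acc = 0   # accumulator
    pc = 0    # instruction pointer
    while pc < len(program):
        op, arg = program[pc]      # fetch
        if op == "LOAD":           # decode, then execute
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "SUB":
            acc -= arg
        pc += 1                    # update the instruction pointer
    return acc

result = run([("LOAD", 10), ("ADD", 5), ("SUB", 3)])
print(result)  # 12
```

The key point is that the loop body handles exactly one instruction per iteration, which is what later generations overlapped.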
B. Second Generation
By the late 1970s, enough transistors were available on the IC to usher in the second generation of microprocessor sophistication: 16-bit arithmetic and pipelined instruction processing. Motorola's MC68000 microprocessor, introduced in 1979, is an example.
This generation is defined by overlapped fetch, decode, and execute steps. As the first instruction is processed in the execution unit, the second instruction is decoded and the third instruction is fetched. The distinction between the first- and second-generation devices was primarily the use of newer semiconductor technology to fabricate the chips, which yielded a five-fold increase in instruction execution speed and higher chip densities.
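The benefit of that overlap is easy to quantify with a little cycle arithmetic. This sketch assumes an idealized three-stage pipeline (fetch, decode, execute) with no stalls or branch penalties, which real chips of course do not achieve.

```python
# Cycle counts for sequential vs. pipelined execution with an
# idealized 3-stage pipeline (fetch, decode, execute).

STAGES = 3

def sequential_cycles(n_instructions):
    # Sequential: each instruction passes through every stage
    # before the next instruction starts.
    return n_instructions * STAGES

def pipelined_cycles(n_instructions):
    # Pipelined: after the pipeline fills (STAGES cycles), one
    # instruction completes every cycle.
    return STAGES + (n_instructions - 1)

for n in (1, 3, 10):
    print(n, sequential_cycles(n), pipelined_cycles(n))
# For 10 instructions: 30 cycles sequential vs. 12 cycles pipelined.
```

As the instruction count grows, the pipelined machine approaches one instruction per cycle, the throughput limit of a single scalar pipeline.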
C. Third Generation
The third generation, introduced in 1978, was represented by Intel's 8086 and the Zilog Z8000, 16-bit processors with minicomputer-like performance. This generation came about as IC transistor counts approached 250,000. Motorola's MC68020, for example, incorporated an on-chip cache for the first time, and pipeline depth increased to five or more stages. This generation was also distinguished by the fact that all major workstation manufacturers began developing their own RISC-based microprocessor architectures.
D. Fourth Generation
As the workstation companies converted from commercial microprocessors to in-house designs, microprocessors entered their fourth generation, with designs surpassing a million transistors. Leading-edge microprocessors such as Intel's 80960CA and Motorola's 88100 could issue and retire more than one instruction per clock cycle.
E. Fifth Generation
Fifth-generation microprocessors employed decoupled superscalar processing, and their designs soon surpassed 10 million transistors. In this generation, PCs became a low-margin, high-volume business dominated by a single microprocessor.
Microprocessors: From Vacuum Tubes To Today's Dual-Core Multithreaded Madness
Before The Flood In The 1960s
Just a scant few years after the first laboratory ICs, Fairchild introduced the first commercially available integrated circuit. By the start of the 1960s, the processes behind commercial ICs, processes that would last to the present day, were already in place. There is no doubt that the technology, design, and fabrication processes evolved rapidly.
Observing the trend, Fairchild's director of R&D, Gordon Moore, noted that the density of elements in ICs was doubling annually and predicted that the trend would continue for the next ten years. With certain amendments, this came to be known as Moore's Law.
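Moore's observation is just exponential arithmetic: a fixed doubling interval applied to a starting count. The starting figure and interval below are illustrative, using the 4004's roughly 2,300 transistors and the original annual doubling.

```python
# Moore's Law as arithmetic: transistor count doubling at a fixed
# interval. Starting count and doubling period are illustrative.

def transistors(start_count, years, doubling_period_years=1.0):
    # Each doubling period multiplies the count by two.
    return start_count * 2 ** (years / doubling_period_years)

# ~2,300 transistors (Intel 4004, 1971) doubling annually for 10 years:
print(int(transistors(2300, 10)))  # 2355200
```

With annual doubling, ten years is a factor of 1,024, which is why the curve ran from thousands of transistors to millions within two decades.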
The first ICs contained just a few transistors per wafer; by the dawn of the 1970s, wafers held thousands. It was only a matter of time before someone would use this capacity to put an entire computer on a chip, and several someones did.
Development Explosion: The 1970s
The idea of a computer on a single chip had been described in the literature earlier. Now process technology had finally caught up with the thinking, and the computer on a chip became possible. The air was electric with the possibility.
Once the feat was established, the rest of the decade saw a proliferation of companies, old and new, getting into the semiconductor business. The first personal computers, the first arcade games, and even the first home video game systems spread consumers' contact with electronics and paved the way for continued rapid growth through the 1980s.
At the beginning of the 1970s, the microprocessor had not yet been introduced. By the end of the decade, a saturated market had led to price wars, and 16-bit processors had already arrived.
Three groups lay claim to having been the first to put a computer on a chip: the Central Air Data Computer (CADC), the Intel 4004, and the Texas Instruments TMS 1000.
Where Are They Now ?
- The CADC spent about 20 years as a top secret of the Cold War era until finally being declassified in 1998. Thus, even though it may have been first, it remained under people's radar and had little chance to influence other early microprocessor designs.
- The Intel 4004 had a short history, soon superseded by the 8008 and other early Intel chips.
- The TMS 1000 was finally marketed in standalone form in 1974, at the low price of US$2 per piece. In 1978, a special version of the TI TMS 1000 became the brains of the educational Speak & Spell toy, which E.T. jury-rigged to phone home.
Early Intel: 4004, 8008, And 8080
Intel released its first single-chip general-purpose 4-bit processor, the Intel 4004, in November 1971. It had a clock speed of 108 kHz and 2,300 transistors, with ports for ROM, RAM, and I/O. Since it was originally designed for use in a calculator, Intel had to renegotiate its contract to be able to market it as a stand-alone processor.
The Intel 8008 was introduced in April 1972 and didn't make much of a splash, being more or less an 8-bit 4004. Its primary claim to fame is that it formed the basis for the 8080 and, later, the 8086 architecture.
Intel bounced back with the 8080, which used the same instruction set as the earlier 8008 and is generally considered the first truly usable microprocessor.
Where Is Intel Now?
Last time we checked, Intel was still around.
In 1974, RCA released the 1802, an 8-bit processor with a different architecture from other 8-bit processors. It had a register file of sixteen 16-bit registers, and using the SEP instruction, the programmer could select any of the registers to be the program counter.
An interesting variation was to make two or more subroutines into a ring, so that they were called in round-robin order. The RCA 1802 has been considered one of the first RISC chips.
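The select-a-register-as-program-counter idea can be modeled in a few lines. This is a deliberately simplified toy, not a faithful 1802 emulator: only the register file, the P designator, and a SEP-style switch are modeled, and the addresses are made up.

```python
# Toy model of the RCA 1802's register scheme: sixteen 16-bit
# registers, any one of which can be designated the program counter
# via a SEP-style instruction.

class Cosmac1802Model:
    def __init__(self):
        self.r = [0] * 16   # sixteen 16-bit registers
        self.p = 0          # index of the register currently acting as PC

    def sep(self, n):
        # SEP Rn: register n becomes the program counter, so control
        # transfers to whatever address Rn happens to hold.
        self.p = n

    @property
    def pc(self):
        return self.r[self.p]

m = Cosmac1802Model()
m.r[3] = 0x2000   # pretend a subroutine entry point lives in R3
m.sep(3)          # "call" it by switching which register is the PC
print(hex(m.pc))  # 0x2000
```

The round-robin trick mentioned above falls out naturally: if routine A ends with SEP selecting B's register and B ends with SEP selecting A's, the two routines call each other in a ring without a stack.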
Where Is It Now?
The RCA chip was a market failure, largely due to its slow clock speed. But it could be fabricated to be radiation-resistant, so it was used on the Voyager, Viking, and Galileo space probes.
In 1975, IBM produced some of the earliest efforts to build a microprocessor based on a RISC design. The IBM 801 was named after the address of the building where the chip was designed, though one suspects that IBM already had systems numbered 601 and 701.
Where Is The 801 Now?
The 801 chip family never saw mainstream use and appeared primarily inside other IBM hardware.
The Evolution Of RISC
RISC, jokingly expanded as "Relegate the Important Stuff to the Compiler", designs are also known as load-store architectures.
In the 1970s, research at IBM produced the result that some complex operations were actually slower than a sequence of smaller operations doing the same thing. A famous example was the VAX's INDEX instruction, which ran slower than a loop implementing the same function.
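The INDEX finding is easier to appreciate with the two styles side by side. Both functions below are sketches: the semantics are a rough paraphrase of a bounds-checked subscript calculation, not the exact VAX definition, and the second version merely spells the same work out as RISC-style simple steps.

```python
# One "complex" operation vs. the same work as simple steps.
# Rough paraphrase of a bounds-checked array subscript calculation;
# not the exact VAX INDEX semantics.

def index_instruction(subscript, low, high, size, accumulated):
    # Single macro-operation: bounds check, offset, accumulate, scale.
    if not (low <= subscript <= high):
        raise IndexError("subscript out of range")
    return (accumulated + subscript - low) * size

def index_as_simple_ops(subscript, low, high, size, accumulated):
    # Same result from individually simple steps: compare, branch,
    # subtract, add, multiply. IBM's measurements showed sequences
    # like this could beat the single complex instruction.
    if subscript < low:
        raise IndexError("subscript out of range")
    if subscript > high:
        raise IndexError("subscript out of range")
    t = subscript - low
    t = accumulated + t
    return t * size

print(index_instruction(5, 0, 9, 4, 0))   # 20
print(index_as_simple_ops(5, 0, 9, 4, 0)) # 20
```

The functional equivalence is the point: if simple operations can reproduce a complex instruction's result at equal or better speed, the complex instruction is pure cost in chip area and decode logic, which is the core RISC argument.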
Motorola introduced the 6800 chip in 1974, with 72 instructions, and it was probably the first microprocessor with an index register.
Whatever complex instructions gained in program flexibility and ease of maintenance, they were wasteful when it came to using scarce computer memory.
Where Is The 6800 Now?
Many Motorola stand-alone processors and microcontrollers trace their lineage to the 6800, including the popular and powerful 6809 of 1979.
Where Is The Microprocessor Industry Going?
Almost immediately after their introduction, microprocessors became the heart of the personal computer. Since then, the improvements have come at an amazing pace. The 4004 ran at 108 kHz (that's kilohertz, not megahertz) and processed only 4 bits of data at a time.
Today's microprocessors, and the computers that run on them, are thousands of times faster. Effectively, they have come pretty close to fulfilling Moore's Law, which in a common formulation states that the number of transistors on a chip will double every 18 months. Performance has increased at nearly the same rate.
The microprocessor has been around for more than 20 years. It now comes in many forms, sizes, and levels of sophistication, powering all kinds of applications. Although it is the CPU of the computer system, it needs to interact with other semiconductor devices in order to perform its functions; these devices, including memory and input/output devices, constitute the rest of the computer system. Thus we can see where the microprocessor evolved from and where it is going.