Charles Babbage, an English mathematician, inventor and mechanical engineer often regarded as the "father of the computer", designed a mechanical calculating machine called the "Difference Engine" and later a programmable successor, the "Analytical Engine".
The First Generation Electrical Computers
Modern computing began in the 1940s, with improvements before and during the Second World War, as electrical circuits (relays and thermionic valves) replaced their mechanical equivalents. The Z3 and Colossus computers were built from circuits of valves and relays, and used punched paper tape for input.
The Second Generation Computers
The next major improvement came in 1947 with the invention of the transistor, which replaced the unreliable and delicate vacuum valve. The transistor was much smaller and far more reliable.
A microprocessor is a miniature electronic device that contains the arithmetic, logic and control circuitry needed to function as a digital computer's CPU. Microprocessors are integrated circuits (ICs) that can interpret and execute program instructions as well as handle arithmetic operations.
Microprocessors (integrated circuits)
Before integrated circuits (ICs) were invented, computers used circuits of individual electrical components (transistors, resistors, capacitors and diodes) all connected to a circuit board. In 1959, Jack Kilby at Texas Instruments Inc. and Robert Noyce at Fairchild Semiconductor Corporation filed patents for integrated circuits. Kilby worked out how to make all the circuit elements out of germanium, the semiconductor material then commonly used for transistors. Noyce used silicon, which is now almost universally used, and found a way to build the interconnecting wires as well as the other components on a single silicon chip, eliminating the need for soldered joints.
Integrated Circuit Definition
An integrated circuit, also called a microelectronic circuit or chip, is an assembly of electronic components fabricated as a single unit, in which miniaturised active devices (transistors and diodes) and passive devices (resistors and capacitors), together with their interconnections, are built up on a very thin substrate of semiconductor material, typically silicon.
Microprocessors have changed the role of technology across the globe. They stem from the integrated circuits first commercially developed in 1959 by Texas Instruments and Fairchild Semiconductor, and they grew steadily more sophisticated through the '70s and '80s.
Developed in 1971, the Intel 4004 is generally considered the first microprocessor. It was very expensive to produce. Around the same time, Intel pioneered dynamic RAM for storing data, which is still the main-memory technology computers use today.
RCA, IBM and Motorola
Soon after the Intel 4004 appeared, other companies began manufacturing microprocessors. The RCA 1802, the IBM 801 and the Motorola 6800 followed in the mid-1970s, each showing improvements in function and speed. The Motorola 6800 included an index register and an instruction set of 78 instructions.
1970s & '80s
Microprocessors continued to develop through the 1970s and 1980s, but prices remained high, running to hundreds or thousands of dollars, and their functions were very limited compared with today's microprocessors. These devices were used in large-scale systems such as Air Force planes and NASA space shuttles. In 1979, Motorola introduced the Motorola 68000, which found its way into Apple Macintosh, Atari and Sun Microsystems computers.
During the '90s, Advanced Micro Devices (AMD) created a copy of Intel's leading microprocessor. This led to lawsuits, which were eventually decided in AMD's favour, and in turn to a boom in the making of clone microprocessors. Technology continued to advance and microprocessors continued to shrink in size.
The structure and architecture of microprocessors have remained relatively similar to what they were 30 years ago. As of 2010, Intel and AMD still dominate the microprocessor market and continue to increase speeds and functions. Since the advent of the Internet, computers have become common in households, which has led to more research and rapid advancement.
A microprocessor, also known as a CPU (central processing unit), is a small semiconductor chip on a piece of silicon that handles basic logic and storage tasks for a computer. The microprocessor is the heart of any computer system and is responsible for personal computing as we know it today.
History of the Microprocessor
The microprocessor originated in 1969, when Busicom, a Japanese company, hired Intel to produce a chipset for its new calculator product. Ted Hoff, the project's head engineer, believed that the proposed seven-chip set could be simplified to four, with one chip handling all the data processing.
Together, Busicom and Intel developed a series of new technologies and techniques to produce the first central processing unit, a multi-function chip to handle all data processing. This was called the Intel 4004.
How Does a Microprocessor Work?
A microprocessor is composed of a number of simple sections, each handling one task. The address bus selects an address in memory; the data bus carries values being read from or written to that memory; and the read and write lines tell the memory whether to fetch from or store at the selected location. An internal clock, running at some frequency in hertz, keeps these parts in step, and counters track timing across different operations. There is also a reset line to restart the processor.
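The bus-and-line description above can be sketched as a toy software model. This is purely illustrative Python, not a real hardware interface; the class and parameter names are made up for the example.

```python
# Toy model of the address bus, data bus and write line described above.
# All names here are illustrative, not a real hardware API.

class Memory:
    def __init__(self, size=256):
        self.cells = [0] * size

    def access(self, address, write_line, data=None):
        """The address selects a cell; the write line picks read vs. write."""
        if write_line:               # write: value on the data bus -> memory
            self.cells[address] = data
        else:                        # read: memory -> data bus
            return self.cells[address]

mem = Memory()
mem.access(0x10, write_line=True, data=42)   # store 42 at address 0x10
value = mem.access(0x10, write_line=False)   # read it back
print(value)  # 42
```

The point of the sketch is only that one signal (the write line) decides the direction of traffic on the data bus for the location the address bus has selected.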
Computers also have random access memory (RAM). This is where the microprocessor keeps instructions and data for as long as they are needed, so that they are not overwritten before the job they belong to finishes. When a computer runs short of RAM it slows down, because there is no free space left to store instructions, and the CPU will not overwrite existing data without permission from the programs that own it.
The last major stage of instruction handling is write-back: after an instruction executes, the microprocessor writes its result back to a register or to memory, where later instructions, and diagnostic tools, can read it when something deep in the system needs to be traced.
Moore's Law is a well-known observation by Intel co-founder Gordon Moore. Looking at the rate at which microprocessors were advancing, Moore noted that the number of transistors on a chip seemed to double roughly every 18 months. He projected that this doubling, and the corresponding growth in computing power, would continue until it became a physical impossibility.
For nearly 40 years this projection has held true, often making past pronouncements, such as the claim often attributed to Bill Gates that no one would ever need more than 20 megabytes of hard drive space, seem silly in the present day.
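The compounding behind the doubling claim is easy to check arithmetically. A minimal sketch, taking the 18-month doubling period from the text above as given:

```python
# Growth factor implied by doubling every 18 months over a span of years.
def moores_law_factor(years, doubling_months=18):
    doublings = (years * 12) / doubling_months
    return 2 ** doublings

# Over 15 years that is 10 doublings: a factor of 1024.
print(moores_law_factor(15))  # 1024.0
```

That thousandfold growth per decade and a half is why fixed-size predictions about "enough" storage or speed date so quickly.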
Microprocessors are a central part of computers and have the general purpose of processing information from the user and generating output. Microprocessors are used in personal computers, automobiles, calculators, mobile phones, video game systems and many home appliances.
The first electronic computers were developed around World War II and used relays and vacuum tubes. Konrad Zuse built early program-controlled computers in Germany, beginning with the Z1 in 1938. In 1943, the British built the Colossus, drawing on Alan Turing's code-breaking work, to break German ciphers. ENIAC was built in 1946 at the University of Pennsylvania for solving missile trajectory equations. In the 1950s, transistors were developed, which were much smaller, more powerful and easier to replace than vacuum tubes. In the 1960s, integrated circuits placed all of these components onto a single chip. In 1971, Intel developed the first microprocessor, the Intel 4004.
Microprocessors are made from silicon, aluminium, copper, gold and other metals. The base of the microprocessor is made of silicon. Circuits are etched into the chip using light, a process called photolithography. An outer packaging is applied to the processor once it is finished. This outer packaging has the label and a series of pins or other connectors that are made from copper, aluminium or another metal and are frequently gold-plated.
The microprocessor is essentially the "brain" of the computer. It performs calculations on data based on instructions from memory. For example, if the computer user is using a calculator and enters "36", "+" and "4", each of these values is stored in memory. The microprocessor fetches the value "36" from memory, fetches the instruction "add" from memory, fetches the value "4" from memory, and sends the output "40" back to memory. This value is then displayed on the user's screen.
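The "36 + 4" walkthrough above can be sketched as a toy fetch-execute loop. The instruction encoding here is invented for illustration and does not correspond to any real instruction set:

```python
# Toy fetch-execute loop for the calculator example above.
# The (opcode, operand) encoding is made up for illustration.
memory = [("LOAD", 36), ("ADD", 4), ("STORE", None)]

accumulator = 0
output = None
for opcode, operand in memory:   # fetch each instruction in turn
    if opcode == "LOAD":         # place the value in the accumulator
        accumulator = operand
    elif opcode == "ADD":        # the ALU adds the operand to the accumulator
        accumulator += operand
    elif opcode == "STORE":      # send the result back to memory/output
        output = accumulator

print(output)  # 40
```

Real processors do the same fetch, decode, execute cycle, just with binary opcodes and many more instruction types.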
A microprocessor contains an arithmetic logic unit (ALU) that performs calculations on data. The address bus, which connects to the computer's memory outside the microprocessor, selects where data and instructions are fetched from or stored; the data bus carries the data itself to and from the computer's memory.
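The ALU's job can be sketched as a function that selects an arithmetic or logic operation by opcode. This is a minimal illustration; the operation names are made up and real ALUs work on fixed-width binary values in hardware:

```python
# Tiny ALU sketch: one opcode selects an arithmetic or logic operation.
def alu(op, a, b):
    if op == "ADD":              # arithmetic
        return a + b
    if op == "SUB":
        return a - b
    if op == "AND":              # logic (bitwise)
        return a & b
    if op == "OR":
        return a | b
    raise ValueError(f"unknown opcode: {op}")

print(alu("ADD", 36, 4))             # 40
print(alu("AND", 0b1100, 0b1010))    # 8, i.e. 0b1000
```

The key idea is that a single circuit handles both the arithmetic and the logic cases, with control signals (here, the `op` string) choosing which result to produce.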
Three main types of microprocessors exist: digital signal processors (DSPs), microcontrollers and general-purpose microprocessors. DSPs and microcontrollers are small and are typically used in embedded devices, such as mobile phones and automobiles. General-purpose microprocessors are high speed and used in personal computers, workstations and servers.
Microprocessors are typically described by the name of the manufacturer, date of manufacture, number of transistors on each chip, feature size in microns (the width of the smallest wire), clock speed (the rate at which it processes information, in hertz), data width (the amount of data that can be sent to the processor at a time) and MIPS, a measurement in millions of the number of instructions per second the processor can execute.
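The MIPS figure in the list above relates directly to clock speed. A hedged sketch of that relationship, assuming a simplified (hypothetical) average number of clock cycles per instruction:

```python
# MIPS from clock speed and average cycles per instruction (CPI).
# The CPI values are made-up illustrations; real processors vary widely.
def mips(clock_hz, cycles_per_instruction):
    instructions_per_second = clock_hz / cycles_per_instruction
    return instructions_per_second / 1_000_000

# A 100 MHz processor averaging 2 cycles per instruction:
print(mips(100_000_000, 2))  # 50.0 MIPS
```

This is why clock speed alone does not determine performance: two chips at the same frequency can differ in MIPS if one needs fewer cycles per instruction.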
Processor models commonly used in home computers include AMD and Intel processors based on the x86 architecture, Intel's IA-64 architecture, and the PowerPC architecture, which is commonly found in older Macs.