Random access memory (RAM) is an essential component for running personal computers and a major factor in a computer's speed. Computers perform five key functions: input, output, processing, storage and retrieval; RAM is designed for the storage and retrieval functions. RAM provides a location where a computer's central processing unit (CPU) can access current information, which is why RAM is typically referred to as a computer's memory. The physical structure of RAM provides some advantages and disadvantages, and RAM has several characteristics that distinguish it from other storage types.
The Structure of RAM
RAM is composed of transistors and capacitors. Transistors act as switches, while capacitors store information. A transistor and a capacitor are paired to create a memory cell, and each memory cell holds one bit of information. Because the capacitor loses its charge quickly, the stored information must be read and written back (refreshed) continually. As a result, you can think of RAM as short-term memory: information is stored in RAM to facilitate its transfer to long-term storage or to run applications and programs.
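The read-and-write-back refresh cycle described above can be sketched as a toy model. This is purely illustrative: the class, the leak factor, and the 0.5 threshold below are invented for the example and do not reflect real hardware timings.

```python
# Illustrative model of a DRAM cell: a leaky capacitor whose stored
# bit must be refreshed before the charge decays below a readable
# threshold. All names and numbers here are hypothetical.

THRESHOLD = 0.5  # minimum charge still read as a logical 1

class DramCell:
    def __init__(self):
        self.charge = 0.0          # capacitor charge, 0.0 .. 1.0

    def write(self, bit):
        self.charge = 1.0 if bit else 0.0

    def read(self):
        return 1 if self.charge > THRESHOLD else 0

    def leak(self, factor=0.7):
        self.charge *= factor      # charge bleeds away over time

    def refresh(self):
        self.write(self.read())    # read the bit, write it back

cell = DramCell()
cell.write(1)
cell.leak()                        # charge drops to 0.7 -- still reads as 1
cell.refresh()                     # restored to full charge
cell.leak(); cell.leak(); cell.leak()  # no refresh: charge falls to 0.343
print(cell.read())                 # the stored 1 has been lost -> prints 0
```

The point of the sketch is only the rhythm: without periodic refresh, the bit silently decays; with it, the cell holds its value indefinitely while powered.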
Computer memory is either volatile or non-volatile. By analogy with the human mind, volatile memory is short-term while non-volatile memory is long-term. Volatile memory requires a power source to maintain the stored information; non-volatile memory is not limited by power restrictions and retains information for extended periods. RAM is volatile memory and therefore serves as a short-term storage location. The central processing unit (CPU) can access information stored in RAM in any order, so you can work with several programs simultaneously.
Types of Data Storage
Computers use RAM to store information about currently running applications, documents and data. For example, while you are using word processing software, your CPU stores data to RAM each time you type and during auto-save. Once you are finished and save the file to a location, the information is moved to a long-term storage option such as your hard disk.
Random access memory (usually known by its acronym, RAM) is a type of data store used in computers that allows the stored data to be accessed in any order - that is, at random, not just in sequence. In contrast, other types of memory devices (such as magnetic tapes, disks, and drums) can access data on the storage medium only in a predetermined order due to constraints in their mechanical design.
Generally, RAM in a computer is considered main memory or primary storage: the working area used for loading, displaying and manipulating applications and data. This type of RAM is usually in the form of integrated circuits (ICs). These are commonly called memory sticks or RAM sticks because they are manufactured as small circuit boards with plastic packaging and are about the size of a few sticks of chewing gum. Most personal computers have slots for adding and replacing memory sticks.
Most RAM can be both written to and read from, so "RAM" is often used interchangeably with "read-write memory." In this sense, RAM is the "opposite" of ROM, though strictly speaking it is the opposite of sequential access memory.
Computers use RAM to hold the program code and data during computation. A defining characteristic of RAM is that all memory locations can be accessed at almost the same speed. Most other technologies have inherent delays for reading a particular bit or byte.
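As a small illustration of that property, the sketch below (with invented buffer contents) reads the same bytes sequentially and then in a shuffled order. Every location is reached directly by its offset either way, with no rewinding or seeking as a tape would require.

```python
# Illustration of "random access": each byte of a buffer can be
# reached directly at its offset, in any order, without scanning
# forward from the start the way a sequential medium must.
import random

data = bytearray(b"random access memory")

# Read every byte in sequential order...
sequential = [data[i] for i in range(len(data))]

# ...then read the same bytes again in a shuffled order. Each index
# is reached directly; order of access does not matter.
order = list(range(len(data)))
random.shuffle(order)
shuffled = {i: data[i] for i in order}

# The bytes retrieved out of order match the sequential reads exactly.
assert all(shuffled[i] == sequential[i] for i in range(len(data)))
```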
Many types of RAM are volatile, which means that unlike some other forms of computer storage such as disk storage and tape storage, they lose all data when the computer is powered down. Modern RAM generally stores a bit of data as either a charge in a capacitor, as in dynamic RAM, or the state of a flip-flop, as in static RAM.
Software can "partition" a portion of a computer's RAM, allowing it to act as a much faster hard drive that is called a RAM disk. Unless the memory used is non-volatile, a RAM disk loses the stored data when the computer is shut down. However, volatile memory can retain its data when the computer is shut down if it has a separate power source, usually a battery.
Some types of RAM can detect or correct random faults called memory errors in the stored data, using RAM parity.
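A minimal sketch of how a single even-parity bit detects (but cannot correct) a one-bit error follows; the helper names are invented for the example, and real parity RAM implements this per memory word in hardware.

```python
# Even parity, one parity bit per byte: the parity bit makes the
# total number of 1-bits even, so flipping any single bit changes
# the parity and is therefore detectable (though not correctable).

def parity_bit(byte):
    """Return the even-parity bit for an 8-bit value."""
    return bin(byte).count("1") % 2

def store(byte):
    """Store a byte together with its parity bit."""
    return (byte, parity_bit(byte))

def check(stored):
    """Return True if the stored byte still matches its parity bit."""
    byte, p = stored
    return parity_bit(byte) == p

word = store(0b10110010)
assert check(word)                            # intact data passes

corrupted = (word[0] ^ 0b00001000, word[1])   # flip a single bit
assert not check(corrupted)                   # the error is detected
```

Note that a two-bit error restores the parity and slips through, which is why stronger ECC schemes are used where corrections, not just detections, are needed.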
Early main memory systems built from vacuum tubes behaved much like modern RAM, except that they failed frequently. Core memory, which used wires attached to small ferrite electromagnetic cores, also had roughly equal access time. The term "core" is still used by some programmers to describe the RAM main memory of a computer. The basic concepts of tube and core memory are used in modern RAM implemented with integrated circuits.
Alternative primary storage mechanisms usually involved a non-uniform delay for memory access. Delay line memory used a sequence of sound wave pulses in mercury-filled tubes to hold a series of bits. Drum memory acted much like the modern hard disk, storing data magnetically in continuous circular bands.
Currently, several types of non-volatile RAM are under development, which will preserve data while powered down. The technologies used include carbon nanotubes and the magnetic tunnel effect.
In the summer of 2003, a 128 kB magnetic RAM (MRAM) chip manufactured with 0.18 µm technology was introduced. The core technology of MRAM is based on the magnetic tunnel effect. In June 2004, Infineon Technologies unveiled a 16 MB prototype, again based on 0.18 µm technology.
As for carbon nanotube memory, the high-tech startup Nantero built a functioning prototype 10 GB array in 2004.
The Memory Wall
The term "memory wall", first coined in the paper "Hitting the Memory Wall: Implications of the Obvious", refers to the growing disparity between CPU and memory speed. From 1986 to 2000, CPU speed improved at an annual rate of 55% while memory speed improved at only 10%. Given these trends, it was expected that memory latency would become an overwhelming bottleneck in computer performance.
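Those growth rates compound, so the gap widens quickly. A back-of-the-envelope calculation with the figures above:

```python
# With CPU speed improving ~55% per year and memory ~10% per year,
# the CPU/memory speed ratio grows by a factor of 1.55 / 1.10 ≈ 1.41
# annually, compounding over the 14-year span cited (1986-2000).

cpu_rate, mem_rate = 1.55, 1.10
years = 2000 - 1986

gap = (cpu_rate / mem_rate) ** years
print(f"CPU/memory speed gap after {years} years: ~{gap:.0f}x")
# -> CPU/memory speed gap after 14 years: ~122x
```

In other words, a processor that was once well matched to its memory ends the period waiting on memory roughly two orders of magnitude more, which is the disparity the "memory wall" names.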
Currently, CPU speed improvements have slowed significantly, partly due to major physical barriers and partly because current CPU designs have already hit the memory wall in some sense. Intel summarized these causes in their Platform 2015 documentation: "First of all, as chip geometries shrink and clock frequencies rise, the transistor leakage current increases, leading to excess power consumption and heat (more on power consumption below). [Intel's newer Tri-Gate transistor design is aimed at this problem.] Secondly, the advantages of higher clock speeds are in part negated by memory latency, since memory access times have not been able to keep pace with increasing clock frequencies. Third, for certain applications, traditional serial architectures are becoming less efficient as processors get faster (due to the so-called Von Neumann bottleneck), further undercutting any gains that frequency increases might otherwise buy. In addition, resistance-capacitance (RC) delays in signal transmission are growing as feature sizes shrink, imposing an additional bottleneck that frequency increases don't address."
The RC delays in signal transmission were also noted in "Clock Rate versus IPC: The End of the Road for Conventional Microarchitectures", which projects a maximum of 12.5% average annual CPU performance improvement between 2000 and 2014. Data on Intel processors clearly shows a slowdown in performance improvements in recent processors. However, Intel's new Core 2 processor (codenamed Conroe) shows a significant improvement over the previous Pentium 4 processors.
Shadow RAM is RAM whose contents are copied from read-only memory (ROM) to allow shorter access times, as ROM is generally slower than RAM. The original ROM is then disabled, and the new location in RAM is write-protected. This process is called shadowing.
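The shadowing process can be modeled in a few lines. This is illustrative only: the class, the firmware bytes, and the error handling are invented for the example, not how firmware actually performs it.

```python
# Toy model of ROM shadowing: the slow ROM's contents are copied
# once into fast RAM, the copy is write-protected, and all further
# reads are served from the RAM copy instead of the ROM.

ROM = (0xEA, 0x55, 0xAA, 0x90)    # stand-in for an immutable firmware image

class ShadowRam:
    def __init__(self, rom):
        self._data = list(rom)     # one-time copy of ROM into RAM
        self._locked = True        # write-protect the shadow region

    def read(self, addr):
        return self._data[addr]    # served from fast RAM, not ROM

    def write(self, addr, value):
        if self._locked:
            raise PermissionError("shadow region is write-protected")
        self._data[addr] = value

shadow = ShadowRam(ROM)
assert shadow.read(0) == 0xEA      # same contents as ROM, faster access
try:
    shadow.write(0, 0x00)          # writes are rejected, like ROM itself
except PermissionError:
    print("write to shadowed region rejected")
```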
For economic reasons, the large (main) memories found in personal computers, workstations, and non-handheld game consoles (such as the PlayStation and Xbox) normally consist of dynamic RAM (DRAM). Other parts of the computer, such as cache memories and data buffers in hard disks, normally use static RAM (SRAM).
General DRAM packaging formats
Dynamic random access memory (DRAM) is produced as integrated circuits (ICs) bonded and mounted into plastic packages with metal pins for connection to control signals and buses. Today, these DRAM packages are in turn often assembled into plug-in modules for easier handling. Some standard module types are:
* DRAM chip (Integrated Circuit or IC)
o Dual in-line Package (DIP)
* DRAM (memory) modules
o Single In-line Pin Package (SIPP)
o Single in-line memory module (SIMM)
o Dual in-line memory module (DIMM)
o Rambus modules are technically DIMMs, but are usually referred to as RIMMs due to their proprietary slot.
o Small outline DIMM (SO-DIMM). Smaller version of the DIMM, used in laptops. Comes in versions with:
+ 72 pins (32-bit)
+ 144 pins (64-bit)
+ 200 pins (72-bit)
o Small outline RIMM (SO-RIMM). Smaller version of the RIMM, used in laptops.
* Stacked v. non-stacked RAM modules
      o Stacked RAM chips use two RAM dies stacked on top of each other. This allows large modules (such as a 512 MB or 1 GB SO-DIMM) to be manufactured using cheaper low-density wafers, though stacked chip modules draw more power.