The gaming industry stands on several pillars: the gamers, the developers, and the designers. All of these pillars need support to grow stronger, and that support includes both software and hardware. One of the most important pieces of hardware is the graphics card, which can be called the right hand of the gaming industry.
The topic I have chosen is graphics cards, which is an interesting subject to discuss. This paper discusses and compares the two most successful companies in this field, NVIDIA and ATI (Radeon). Getting information about graphics cards is not a very tough task, but getting correct information about them is no child's play. Several other companies also make graphics cards, such as Asus and Zion, but the main difference between these and the former two is that NVIDIA and ATI design their own graphics processing chipsets, while the others use NVIDIA's and ATI's chipsets to build their cards, customizing the processing speed, memory, and hardware ports. If you look at performance, though, the NVIDIA and ATI reference cards are generally better than those variants. That is why these two companies stand at the high end of this industry, and why they are each other's primary competitors.
The main components of a graphics card are the GPU (Graphics Processing Unit), the graphics RAM (video memory), and the cooling system (fans, heat sinks and heat pipes). Older cards used an AGP slot to connect to the motherboard, but modern cards use PCI Express (typically an x16 slot). Most graphics cards have two main outputs (one VGA, one DVI).
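The move from AGP to PCI Express was driven largely by bandwidth. As a rough sketch (the lane and clock figures below are the standard published ones, not taken from this text), AGP 8x tops out around 2.1 GB/s, while even a first-generation PCI Express x16 slot offers 4 GB/s in each direction:

```python
def agp_bandwidth_mb(multiplier, base_clock_mhz=66, bus_width_bytes=4):
    # AGP: a 32-bit bus at 66 MHz, transferring `multiplier` times per clock
    return base_clock_mhz * bus_width_bytes * multiplier

def pcie1_bandwidth_mb(lanes, per_lane_mb=250):
    # PCIe 1.x: 250 MB/s per lane per direction (after 8b/10b encoding)
    return lanes * per_lane_mb

print(agp_bandwidth_mb(8))      # AGP 8x, in MB/s
print(pcie1_bandwidth_mb(16))   # PCIe 1.x x16, in MB/s
```

The extra headroom matters because textures and geometry must stream from system memory to the card's own RAM.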
NVIDIA is a multinational corporation specializing in the development of graphics processing devices and chipset technologies for computers, workstations, and mobile devices. Headquartered in Santa Clara, California, the company has become a major supplier of graphics circuits such as graphics processing units (GPUs) and chips used in video cards, video-game consoles, and computer motherboards.
“Prominent Nvidia products include the Quadro series for CAD, the GeForce series for hardcore gaming and digital content creation on workstations, and integrated motherboard chipsets known as nForce.
In 1993, Jen-Hsun Huang (the present CEO), Curtis Priem and Chris Malachowsky founded the company.
In 2000, Nvidia acquired the intellectual property of its one-time rival 3dfx, one of the largest graphics companies of the 1990s.
On December 14, 2005, Nvidia acquired ULI Electronics, which at the time supplied third-party Southbridge parts for chipsets to ATI. In March 2006, Nvidia acquired Hybrid Graphics, and on January 5, 2007, it announced that it had completed the acquisition of PortalPlayer, Inc.
At the end of 2006, Nvidia, along with its key competitor in the graphics industry AMD (which had acquired ATI), received subpoenas from the Department of Justice concerning possible antitrust violations in the graphics card industry.
Forbes magazine named Nvidia its Company of the Year for 2007, citing its accomplishments during that period as well as during the preceding five years.
In February 2008, Nvidia acquired Ageia Technologies for an undisclosed sum. "The purchase reflects both companies' shared goal of creating the most amazing and captivating game experiences", said Jen-Hsun Huang, CEO of Nvidia; "By combining the teams that created the world's most pervasive graphics processor and physics engine companies, we can bring GeForce-accelerated PhysX to twelve million gamers worldwide."
The company's name combines an initial n — a letter usable as a pronumeral in mathematical statements — with video, from the Latin videre, "to see", implying "the best visual experience" or perhaps "immeasurable display". The name also suggests "envy" (envidia in Spanish; invidia in Italian, Latin and Romanian), and the GeForce 8 series used the tag line "Green with envy". The company name always appears in upper case ("NVIDIA") in company technical documents.” –http://en.wikipedia.org/wiki/nvidia
In many ways Nvidia resembles its rival ATI: both companies began with a focus on the PC market and later expanded into chips for non-PC applications. Since Nvidia is a fabless semiconductor company, chip fabrication is provided under contract by the Taiwan Semiconductor Manufacturing Company. As part of their operations, both Nvidia and ATI produce "reference designs" (circuit board schematics) and provide manufacturing samples to their board partners. Manufacturers of Nvidia graphics cards include Foxconn, BFG, EVGA, and PNY. ASUS, XFX, ECS, Gigabyte Technology, and MSI manufacture both Nvidia and ATI cards.
The end of 2004 saw the announcement that Nvidia would assist Sony with the design of the graphics processor (RSX) in the PlayStation 3 games console. In the first half of 2006 it emerged that Nvidia would deliver the RSX to Sony as an IP core, and that Sony alone would be responsible for manufacturing it. Under the agreement, Nvidia would provide ongoing support to port the RSX to Sony's fabs of choice, as well as die shrinks to 65 nm. This is a departure from Nvidia's earlier arrangement with Microsoft, in which Nvidia managed production and delivery of the Xbox graphics processor through Nvidia's usual third-party foundry contract.
On February 4, 2008, Nvidia announced plans to acquire the physics software producer AGEIA, whose PhysX engine was used in hundreds of games shipping or in development for the PS3, Xbox 360, Nintendo Wii, and gaming PCs. The transaction closed on February 13, 2008, and work began to port PhysX onto the GeForce 8800's CUDA system.
On June 2, 2008, Nvidia officially announced its new Tegra product line. The Tegra is a system-on-a-chip (SoC) that integrates an ARM processor, graphics processor, Northbridge and Southbridge into a single chip. Commentators opined that Nvidia would target this product at the smartphone and mobile Internet device sector.
The NV1, NVIDIA's first graphics card, was released in 1995. Its design used quadratic surfaces, and it carried an integrated playback-only sound card and ports for Sega Saturn gamepads. Since the Saturn also rendered forward-rendered quadratics, programmers ported several Saturn games to PCs with the NV1, such as Panzer Dragoon and Virtua Fighter Remix. However, the NV1 struggled in a marketplace full of competing proprietary standards.
Market interest in the product ended when Microsoft announced the DirectX specification, which was based on polygons. NV1 development continued internally as the NV2 project, funded by millions of dollars of investment from Sega, which hoped that an integrated sound-and-graphics chip would cut the manufacturing cost of its next console. However, Sega ultimately realized the drawbacks of quadratic surfaces, and the NV2 was never fully developed.
Jen-Hsun Huang, Nvidia's CEO, realized at that point that after two failed products, something had to change for the company to survive. He hired David Kirk from Crystal Dynamics as Chief Scientist. Kirk combined the company's experience in 3D hardware with a practical understanding of rendering.
As part of the corporate turnaround, Nvidia decided to fully support DirectX, and dropped multimedia functionality in order to reduce development costs. Nvidia also adopted the goal of an internal six-month product cycle, on the belief that the failure of any one product could be mitigated by having a replacement waiting in the pipeline.
Having finally developed and shipped in volume the market-leading integrated graphics chipset, Nvidia set itself the goal of doubling the number of pixel pipelines in its chips in order to realize a substantial performance gain. The TwiN Texel (TNT) design could either apply two textures to a single pixel or process two pixels per clock: the former allowed for better visual output quality and the latter for doubling the maximum fill rate.
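The fill-rate claim is simple arithmetic: peak fill rate is the number of pixel pipelines times the core clock, since each pipeline can emit one pixel per clock. A small sketch, using the TNT's two pipelines and its actual 90 MHz clock as the example figures:

```python
def fill_rate_mpix(pipelines, clock_mhz):
    # Peak fill rate in megapixels/s: each pipeline emits one pixel per clock
    return pipelines * clock_mhz

print(fill_rate_mpix(1, 90))   # one pipeline at 90 MHz
print(fill_rate_mpix(2, 90))   # doubling the pipelines doubles the peak
```

This is a theoretical maximum; real games fall short of it whenever pipelines stall on memory or texture fetches.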
New features included a 24-bit Z-buffer with 8-bit stencil support, anisotropic filtering, and MIP mapping. In some ways the TNT had begun to rival Intel's Pentium processors in complexity. Although the TNT offered an impressive range of well-integrated features, it failed to displace the market leader, 3dfx's Voodoo 2, because its actual clock speed ended up at only 90 MHz, about 35 percent lower than expected.
Nvidia responded with a refresh part: a die shrink of the TNT design from 350 nm to 250 nm. A stock TNT2 now ran at 125 MHz, the Ultra at 150 MHz. While 3dfx's Voodoo 3 beat Nvidia to market, 3dfx's offering proved disappointing: it was not much faster and lacked features that were becoming standard, such as 32-bit colour and support for textures larger than 256 x 256 pixels.
The RIVA TNT2 marked a key turning point for Nvidia. The company at last delivered a product competitive with the best on the market, with a superior feature set and strong 2D functionality, all integrated onto a single die with strong yields that ramped to impressive clock speeds. Nvidia's six-month refresh cycle took the competition by surprise, giving it the initiative in rolling out new products.
The GeForce 2 GTS benefited from the fact that Nvidia had by this time acquired extensive manufacturing experience with its highly integrated cores, and as a result succeeded in optimizing the core for high clock speeds. The volume of chips produced by Nvidia also enabled it to bin-split parts, setting aside the highest-quality cores for its premium range.
Soon afterwards, Nvidia launched the GeForce 2 MX, intended for the budget and OEM markets. It had two fewer pixel pipelines and ran at 165 MHz, later 250 MHz. Offering strong performance at a modest price, the GeForce 2 MX became one of the most successful graphics chipsets.
Nvidia's success proved too much for 3dfx to recover its former market share. The Voodoo 5, the successor to the Voodoo 3, did not compare favourably with the GeForce 2 in either price or performance, and failed to generate the sales needed to keep the company alive. With 3dfx on the edge of bankruptcy near the end of 2000, Nvidia bought most of 3dfx's intellectual property. Nvidia also acquired anti-aliasing expertise and about 100 engineers.
ATI Technologies Inc. (ATI) was a Canadian vendor of graphics processing units and motherboard chipsets. In 2006, the company was acquired by Advanced Micro Devices (AMD) and renamed the AMD Graphics Product Group (ATI Technologies ULC), although the ATI brand was retained for graphics cards.
The Graphics Product Group is a fabless semiconductor company, conducting research and development in-house and outsourcing the manufacturing and assembly of its products. Its main competitor is NVIDIA in the graphics and handheld markets. Its flagship product, the Radeon series of graphics cards, directly competes with NVIDIA's GeForce. These two companies' dominance of the market has forced other manufacturers into niche roles.
“In 1985, Lee Ka Lau, Benny Lau and Kwok Yuen Ho founded ATI as Array Technologies Incorporated. Working primarily in the OEM field, ATI created integrated graphics cards for PC makers such as IBM and Commodore. By 1987, ATI had grown into an independent graphics card retailer, releasing the EGA Wonder and VGA Wonder video cards under its own brand that year. In May 1991, the company released the Mach8, ATI's first product able to process graphics without the CPU. In 1992, the Mach32 offered improved memory bandwidth and GUI acceleration performance.
The All-in-Wonder product line, introduced in 1996, was the first combination of an integrated graphics chip with a TV tuner card, and the first chip that could display computer graphics on a TV set. The cards featured 3D acceleration powered by ATI's second-generation 3D Rage II, 64-bit 2D performance, TV-quality video acceleration, analog video capture, a TV tuner, flicker-free TV-out and stereo output.”- http://en.wikipedia.org/wiki/ATI_Technologies
ATI entered the mobile computing sector by introducing 3D-graphics acceleration to notebooks in 1996. The Mobility product line had to meet requirements different from those of desktop PCs, such as lower power consumption, reduced heat output, TMDS output for notebook screens, and maximum integration. In 1997, ATI acquired Tseng Labs's graphics assets, which included 40 engineers.
The Radeon line of graphics products was unveiled in 2000. The initial Radeon graphics processor was a new design with DirectX 7.0 3D acceleration, video acceleration, and 2D acceleration. Technology developed for a given Radeon generation could be built into varying levels of features and performance in order to provide products suited to the whole market range.
On July 24, 2006, AMD and ATI announced plans to merge in a deal valued at $5.4 billion. The merger closed on October 25, 2006. The acquisition consideration included over $2 billion financed from a loan and 56 million shares of AMD stock. ATI retained its name, logos and trademarks. Dave Orton, CEO of ATI, was promoted to Executive Vice President of Visual and Media Businesses.
In December 2006, AMD/ATI, along with its major competitor NVIDIA, received subpoenas from the United States Department of Justice concerning possible antitrust violations in the graphics card industry.
In July 2007, AMD announced the resignation of Dave Orton. ATI, as a division of AMD, was renamed the Graphics Product Group (GPG) within the company. The top-level management of the GPG consists of Rick Bergman, Senior Vice President and General Manager, and Adrian Hartog, Senior Vice President and General Manager of the Consumer Electronics Group.
EGA / VGA Wonder - IBM EGA/VGA-compatible display adapters (1987)
Mach Series - Introduced ATI's first 2D GUI "Windows Accelerator". As the series evolved, GUI acceleration improved dramatically and early video acceleration appeared.
Rage Series - Evolved from basic 3D with 2D GUI acceleration and MPEG-1 capability to a highly competitive Direct3D 6 accelerator with "best-in-class" DVD (MPEG-2) acceleration. The various chips were very popular with OEMs of the time. The Rage II was used in the first ATI All-In-Wonder multi-function video card, and more advanced All-In-Wonders based on Rage series graphics processors followed. (1995–2004)
Rage Mobility – Designed for use in low-power environments, such as notebooks. These chips performed similarly to their desktop counterparts, but had additions such as advanced power management, LCD interfaces, and dual monitor support.
Radeon Series - Released in 2000, the Radeon line is ATI's brand for its consumer 3D accelerator add-in cards. The original Radeon DDR was ATI's first DirectX 7 3D accelerator, introducing their first hardware T&L engine. ATI frequently produced "Pro" versions with higher clock speeds, sometimes an extreme "XT" version, and more recently "XT PE" and "XTX" versions. The Radeon series was the basis for many of ATI's All-In-Wonder boards.
Mobility Radeon - A series of power-optimized versions of Radeon graphics chips for use in notebooks. They added features such as optimized RAM chips, DVD (MPEG-2) acceleration, notebook GPU card sockets, and "PowerPlay" power management technology.
ATI CrossFire - This technology was ATI's answer to NVIDIA's SLI. By adding a second graphics card on a dual PCI-Express motherboard based on a CrossFire-compatible chipset, it allowed the power of the two video cards to be combined to increase performance through a variety of different rendering options. There is also the option of plugging an extra PCI-Express graphics card into a third PCI-Express slot for physics, or of running physics on the second graphics card.
FireGL - Launched in 2001, following ATI's acquisition of FireGL Graphics from Diamond Multimedia.
FireMV - For workstations, featuring multi-view, a technology addressing the need for multiple displays on workstations, with 2D acceleration only, and usually based on the low-end products of the Radeon series.
Flipper - The Nintendo GameCube uses a 3D accelerator developed by ArtX Inc., a company acquired by ATI during the development of the GPU. Flipper is similar in capability to a DirectX 7.0 accelerator. It has 4 rendering pipelines with T&L and some limited pixel shader support. Distinctively, the chip has 3 MB of embedded 1T-SRAM for use as very fast, low-latency (6.2 ns) texture and frame buffer/Z-buffer storage, allowing data transfer at 10.4 GB/s. Flipper was designed by members of the Nintendo 64 Reality Coprocessor team, and the Radeon 9700 was later developed by members of the Flipper team.
Xenos – Xenos, or "R500", is a custom graphics processor from ATI used in Microsoft's Xbox 360, with embedded DRAM. It also has a true unified shader architecture, which dynamically balances the load between vertex and pixel processing. Another notable feature is tessellation: it can convert flat surfaces into many small triangles. Recent Radeon GPU cores have all the features of Xenos except the eDRAM.
Comparing Nvidia and ATI:
At this point Nvidia's market position was dominant. However, ATI Technologies remained competitive due to its new Radeon product, which performed roughly on a par with the Nvidia GeForce 2 GTS. Although ATI's answer to the GeForce 3, the ATI Radeon 8500, came later to market and initially suffered from driver issues, the 8500 proved a strong competitor due to its lower price and untapped potential. Nvidia countered ATI's offering with the GeForce 4 Ti line, but not before the 8500 had carved out a niche. ATI chose to work on its next-generation Radeon 9700 rather than on a direct competitor to the GeForce 4 Ti.
During the development of the next-generation GeForce FX chips, many of Nvidia's best engineers were focused on the Xbox contract, including the API used as part of the SoundStorm platform. Nvidia also had a contractual obligation to develop newer, more crack-resistant NV2A chips, and this obligation further shortchanged the FX project. The Xbox contract did not allow for falling manufacturing costs as process technology improved, and Microsoft sought to renegotiate the terms of the contract, using the DirectX 9 specification as leverage. Relations between the two companies, which had previously been very good, deteriorated as a result. Both parties later settled the dispute through arbitration, and the terms were not released to the public.
As a result of the Xbox dispute, Nvidia was not consulted during the drafting of the DirectX 9 specification. ATI limited rendering colour support to 24-bit floating point, and emphasized shader performance. The shader compiler was built using the ATI Radeon 9700 as the base card.
In contrast, Nvidia's cards offered 16- and 32-bit floating point modes, providing either lower visual quality or slower performance. The 32-bit support made them much more expensive to produce, requiring a higher transistor count. Shader performance often remained well below the speed provided by ATI's competing products. Having made its reputation by designing easy-to-program DirectX-compatible parts, Nvidia had misjudged Microsoft's next standard and paid a heavy price: as more and more games came to rely on DirectX 9 features, the poor shader performance of the GeForce FX series became ever more obvious. With the exception of the FX 5700 series, the FX series did not compete well against ATI cards.
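The quality gap between the 16-bit and 24/32-bit modes comes down to mantissa bits. A minimal sketch of the idea, using Python's standard library to round a value through IEEE 754 half precision (10 mantissa bits, roughly what the FX's 16-bit mode offered) and single precision (23 bits):

```python
import struct

def to_half(x):
    # Round-trip through IEEE 754 half precision (1 sign, 5 exp, 10 mantissa bits)
    return struct.unpack('<e', struct.pack('<e', x))[0]

def to_single(x):
    # Round-trip through IEEE 754 single precision (23 mantissa bits)
    return struct.unpack('<f', struct.pack('<f', x))[0]

third = 1.0 / 3.0                      # not exactly representable in binary
err_half = abs(to_half(third) - third)
err_single = abs(to_single(third) - third)
print(err_half > 1000 * err_single)    # half precision is orders of magnitude coarser
```

Errors of this size are invisible in a single operation, but shader programs chain many operations per pixel, so the rounding accumulates into visible banding and artifacts.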
Nvidia released an "FX only" demo called Dawn, but a hacked wrapper enabled it to run on a 9700, where it ran faster despite the translation overhead. Nvidia began to use application detection to optimize its drivers. Hardware review sites published articles describing how Nvidia's drivers auto-detected benchmarks and produced artificially inflated scores that did not relate to real-world performance. Often it was tips from ATI's driver engineering team that lay behind these articles. While Nvidia did partly close the performance gap with new instruction-reordering capabilities introduced in later drivers, shader performance remained weak and over-sensitive to hardware-specific code compilation. Nvidia worked with Microsoft to release updated DirectX compilers that generated code optimized for the GeForce FX.
In addition, GeForce FX parts ran hot, drawing as much as twice the power of equivalent parts from ATI. The GeForce FX 5800 Ultra became notorious for its fan noise, earning nicknames such as "dustbuster" and "leafblower"; Nvidia jokingly acknowledged these accusations with a video in which its marketing team compared the cards to a Harley-Davidson motorcycle. Although the quieter 5900 replaced the 5800 without much fanfare, the FX chips still required large and expensive fans, placing Nvidia's partners at a manufacturing cost disadvantage compared to ATI's. As a result of Microsoft's actions and the FX series' weaknesses, Nvidia lost its market leadership to ATI.
With the GeForce 6 series, Nvidia had clearly moved beyond the DirectX 9 performance problems that plagued the previous generation. The GeForce 6 series not only performed strongly where Direct3D shaders were concerned, but also supported Shader Model 3.0, while ATI's competing X800 series chips supported only the previous Shader Model 2.0 specification. This proved an unimportant advantage, mainly because games of the period did not yet use Shader Model 3.0 extensions. On the other hand, it demonstrated Nvidia's willingness to design and follow through with the latest features and deliver them on schedule. What became more obvious during this period was that the products of the two companies, ATI and Nvidia, offered comparable performance. The two firms traded blows in specific titles and specific criteria (resolution, image quality, processing speed, anisotropic filtering and anti-aliasing), but the differences were becoming more abstract, and the reigning consideration became price-to-performance. The mid-range offerings of the two firms demonstrated consumers' appetite for affordable, high-performance graphics cards, and it is now this price segment in which much of the firms' profitability is determined. The GeForce 6 series was also released at a remarkable moment: the famous game Doom 3 had just come out, and ATI's Radeon 9700 struggled with its OpenGL performance.
Nvidia released the 8-series chips toward the end of 2006, making the 8-series the first to support Microsoft's next-generation DirectX 10 specification. The 8-series GPUs also featured the radical unified shader architecture, and Nvidia leveraged this to provide additional functionality for its graphics cards: improved support for general-purpose computing on the GPU (GPGPU).
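GPGPU means running non-graphics, data-parallel work on the many shader cores. A minimal sketch of the programming model, run here on the CPU purely for illustration (the kernel and launcher names are hypothetical, not a real CUDA API): each GPU "thread" applies the same kernel function to one element of a large array.

```python
def saxpy_kernel(i, a, x, y):
    # One "thread" computes one output element: a * x[i] + y[i]
    return a * x[i] + y[i]

def launch(kernel, n, *args):
    # Stand-in for a GPU kernel launch; on real hardware all n
    # threads execute concurrently across the shader cores
    return [kernel(i, *args) for i in range(n)]

x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 10.0, 10.0, 10.0]
print(launch(saxpy_kernel, len(x), 2.0, x, y))  # [12.0, 14.0, 16.0, 18.0]
```

Because each element is independent, the work scales almost linearly with the number of cores, which is why GPUs excel at this style of computation.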
Nvidia released two models, the 8800 GTS and the 8800 GTX (768 MB), followed later by the 8800 Ultra. All three of these cards derive from the 90 nm G80 core. The GTS model had 96 stream processors and 20 ROPs, while the GTX/Ultra had 128 stream processors and 24 ROPs.
In early 2007, Nvidia released the 8800 GTS 320 MB. This card resembles an 8800 GTS 640, but with 32 MB memory chips in place of 64 MB ones.
In October 2007, Nvidia released the 8800 GT. This card used the new 65 nm G92 graphics processor and had 112 stream processors. It carried 512 MB of video RAM and operated on a 256-bit memory bus.
Later, in December 2007, Nvidia released the 8800 GTS G92. It is essentially a larger 8800 GT with higher clocks and all 128 stream processors of the G92 unlocked.
In February 2008, Nvidia released the 9600-series chip, which supports Microsoft's DirectX 10 specification, in answer to ATI's release of the Radeon HD 3800 series. Later, in March, Nvidia released the GeForce 9800 GX2, which effectively combined two GeForce 8800 GTS G92 cards into a single card.
In June 2008, Nvidia released its new flagship graphics processors, the GTX 280 and GTX 260. The cards used the same essential unified architecture deployed in the preceding 8 and 9 series cards, but with a substantial boost in power. Both cards are based on the GT200 graphics processor, which contains 1.4 billion transistors on a 65 nm process. The GTX 280 has 240 shaders and the GTX 260 has 192 shaders.
In January 2009, Nvidia released a 55 nm die shrink of the GT200 called the GT200b. The update to the GTX 280 reportedly provides 1062.72 GFLOPS of floating point power; there was also an update to the GTX 260 with 216 shaders, and a dual-chip card featuring two GT200b chips that are a hybrid of the GT200 cores featured on the original GTX 280 and GTX 260. The difference here is that each individual graphics processor features 240 shaders, but only a 448-bit memory bus. This new card, the GTX 295, has 1.75 GB of GDDR3 VRAM and reportedly provides roughly 1788.48 GFLOPS of floating point power.
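These GFLOPS figures follow from the usual peak-throughput formula for this architecture: shader count times shader clock times 3 floating point operations per clock (a multiply-add plus a multiply). Taking the commonly quoted shader clocks of 1476 MHz for the updated GTX 280 and 1242 MHz for the GTX 295 (clock figures assumed here, not stated in the text), the numbers above can be reproduced:

```python
def peak_gflops(shaders, shader_clock_mhz, flops_per_clock=3):
    # Peak throughput: shaders * clock (MHz) * FLOPs per shader per clock,
    # where MADD + MUL = 3 FLOPs on this architecture; /1000 converts to GFLOPS
    return shaders * shader_clock_mhz * flops_per_clock / 1000.0

print(peak_gflops(240, 1476))       # single GT200b (updated GTX 280)
print(peak_gflops(2 * 240, 1242))   # dual-GPU GTX 295
```

As always, these are theoretical peaks; sustained throughput in real shaders is considerably lower.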
March 2009 saw the release of the GTS 250 mainstream chip, based on the previous-generation G92 but with a 55 nm die shrink, code-named the G92b. The GTS 250 is based on the 9800 GTX+ and has 128 shaders with a 256-bit memory bus and 0.5 GB or 1 GB of GDDR3 video RAM.
On May 12, 2009, Nvidia released images of a revised version of the GTX 295. This design, comparable to ATI's HD 4870 X2, differs from the original: in the first production run of the GTX 295, there were literally two separate graphics boards sandwiched in the same enclosure and linked by a ribbon SLI cable, while the new design places both GPUs on a single PCB. The card still has the same specification as the initial production run, although speculation suggests it will be less expensive, thanks to the lower manufacturing cost of the more compact device.
Future of Gaming Cards:
Predicting any technology is not easy, given the current rate of improvement in science and technology. It is easy to speculate about the future but very hard to predict it precisely, and the reason is both simple and complicated: no one can see the future from the standpoint of current science and technology, and a given field may or may not even exist in the future, depending on its necessity.
The gaming industry is now at the top level of entertainment media. The reason is that game companies do not target only a particular group of people; they target the majority of people and develop games accordingly. A variety of games can be found in the market, matching the demand from the public. Making a polished game involves hard work from the technical people, and it also involves a major amount of feedback from players, gathered by releasing alpha versions, over the Internet, or through other media.
In my view, all future cards will have to support fully virtual 3D environments, and a few cards from NVIDIA already have that ability in research or alpha versions. This technology was originally developed for training the U.S. Army with a new and very different kind of perspective. Now that it is opening up to the general public, the whole gaming world will gain a completely different way of playing: the player will feel that he or she is inside the 3D environment. Many professionals and researchers have said that this technology works best with simulation games, and that most games could be simulated using it. I think this will happen a little way into the future. Installing thousands of powerful cores inside a single processor also makes these cards very strong at processing, and rendering a complete, realistic 3D environment surrounding the player needs very powerful processing indeed.
Another prediction is that graphics cards may become very compact, with a powerful processor in a package like a pen drive (it could be called a "GAME DRIVE") that fits in a pocket: just plug it in and play the most high-end graphics games. It might also come with 20-40 GB of storage (which could grow with the needs of future games), so that games could be installed on the drive and played directly from it. This technology would also require improvements to the operating system, because of drivers and data transfer rates (which should be at least equivalent to a hard disk, or more than 200 MB/s). The driver for the hardware would be saved in ROM inside the drive and installed on the computer when it is plugged in. This improves mobility: you could keep your hardcore gaming identity without your big customized system. The main problem in manufacturing it, in my view, would be the cooling system, because transferring data and processing heavy graphics could overheat the processor. But thinking a little further, a refrigeration process (for example using dry ice) could be implemented in the drive, or the drive could be coated with a strongly heat-absorbing material, so that when it connects to the computer, a small amount of electricity runs the heat-ejection process.
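The 200 MB/s figure can be sanity-checked with simple arithmetic: the time to stream a whole game off such a drive (the 8 GB game size below is an assumed example, not from the text) stays well under a minute.

```python
def transfer_seconds(size_gb, rate_mb_per_s):
    # Time to read size_gb gigabytes at rate_mb_per_s megabytes per second
    return size_gb * 1024 / rate_mb_per_s

print(transfer_seconds(8, 200))   # an 8 GB game at 200 MB/s, in seconds
```

At roughly 41 seconds for a full 8 GB read, and far less for the working set a game actually streams, the proposed rate would indeed match hard-disk-class storage of the period.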
Telling the difference between series of graphics cards is not an easy task, because it is difficult to find the difference in their core performance. It is also very difficult to differentiate them just by playing a game on comparable products from NVIDIA and ATI.
At the start of this document I covered the history of the two companies individually. The card descriptions and other hardware details were added later in the document.
I was only able to test some cards among all those I listed:
NVIDIA 6600 GT: On this card I tested two games, Prince of Persia 4 and Far Cry 1, with 1 GB of RAM and a 2.0 GHz Intel processor. The noticeable part was that in Prince of Persia 4 the frame rate was not acceptable in some of the environments with long draw distances and heavy graphics. Far Cry ran smoothly until I noticed the frame rate drop during some of the massive explosion animations, with many objects in the scene to render. Still, for most games released in the 6600 GT's time it was the better card, and when NVIDIA released the 6600 GTX Ultra Edition it gave a boost to all the games of that period and drew attention in the market.
NVIDIA GT 130: This card comes in the iMac with a 3.0 GHz Intel processor and 3.0 GB of physical memory. The games I played with this card are Prince of Persia 4, Dead Space, Burnout Paradise and Crysis. It is a newer card from NVIDIA with their CUDA technology; it is good for Autodesk Maya rendering, but it is unable to unlock full performance in some newly released games. With all the games mentioned above it works fine, handling big animations and detailed texturing well. It overcomes all the problems present on the 6600 GT and gives smooth movement and good rendering quality. The performance I felt on this card is better for most games, and it may be further optimized in software. I tried all of these with DirectX 9.0c, but it is better to enjoy them with DirectX 10.0 or 10.1, because the card supports Shader Model 4.0.
For the ATI series, gaming performance is amazing (according to what most gamers say). But for processing other things, such as rendering, or using the cores for brute-force attacks and other hash-cracking processes, it is a bit slower than NVIDIA cards.
Finally, it can be said that for experiencing games the ATI series is a good choice, but for processing applications or raw (ALU) calculation it is better to go for NVIDIA cards.
------from my experience.
For some of the gaming details I took references from my friends:
Pranjal Konwar (hardcore gamer)
Dithak Khaklari (interested in gaming and creating game music)