Reinventing the G80 - the NVIDIA GeForce 8800 GT 512MB

Just in time for the holiday season, NVIDIA has launched its latest update of the GeForce 8800 series, the 512MB GeForce 8800 GT. With a new 65nm core and full PureVideo HD capabilities, this is the product refresh enthusiasts have been waiting for. Read on for the full performance benchmarks.

Reinventing the GeForce 8800

Moore's Law demands it, enthusiasts speculate about it and the head honchos at NVIDIA are probably rubbing their hands in anticipation at the prospects of world domination. In case you haven't guessed, the next evolution in the GeForce 8 series is set to be unleashed today and this new arrival has been predictably dubbed the GeForce 8800 GT.

There is an air of inevitability about this new high-end card: a different core from the one found in the existing GeForce 8800 series (G92 instead of G80) and a die shrink (NVIDIA's first 65nm core). Same old story, you may think, and you wouldn't be far off the mark, as it is something of a tradition among chipmakers to produce a more efficient variant of an existing architecture. History also suggests that it is this second generation that has the superior price-performance ratio, which probably explains all the pre-release excitement surrounding this card.

The new core number hints at significant changes in the G92, and browsing the documents released by NVIDIA, that seems to be the case. For one, despite the die shrink, the G92 actually packs more transistors than the G80, at a whopping 754 million compared to the 681 million on the G80. It also exceeds the 700 million transistors on the 80nm ATI Radeon HD 2900 XT. What could be the reason behind the expanded core?

At the heart of the NVIDIA GeForce 8800 GT is the G92, a new core manufactured using a 65nm process but featuring even more transistors than the G80.

From a conceptual point of view, the G92 is a 'hybrid' (in an extremely loose sense of the word) of the G80 and the G84/G86 cores. While the G80 undoubtedly retained superior performance in benchmarks, one feature it lacked compared to its lower-end brethren was NVIDIA's second-generation video processor (VP2), which takes the processing burden off the CPU when decoding high-definition video streams, leading to lower CPU utilization. This dedicated video processor has now been embedded into the GeForce 8800 GT core, so you get the full PureVideo HD deal: along with VP2, there's also the bitstream processor and the AES128 engine. This is a significant improvement over the older GeForce 8800 cards, and the addition of this hardware is a major reason behind the transistor count boost. Another reason is that the NVIO chip, which handled some of the display controller logic for the G80 as a separate part, is now embedded within the G92 core.

Of course, NVIDIA is not merely remaking the GeForce 8800 GTX with the GeForce 8800 GT; the company is not replacing the G80-based cards yet. Hence, while the new card has the performance credentials of a GeForce 8800, it is slightly less well equipped than the reigning king of the series, with fewer stream processors (112 versus 128) and fewer ROP units (16 versus 24). However, the GeForce 8800 GT compensates with more aggressive clock speeds, with core and stream processor clocks running higher than on the GeForce 8800 GTX, while the memory clocks on both cards are similar. If anything, it's the other high-end cards that should be more worried: the GeForce 8800 GTS looks to be eclipsed, while ATI's Radeon HD 2900 XT may find itself knocked from its niche by the newcomer. Below is a comparison of the various high-end cards on the market with the debut of the GeForce 8800 GT.

| Model | NVIDIA GeForce 8800 GT 512MB | NVIDIA GeForce 8800 GTX 768MB | NVIDIA GeForce 8800 GTS 320/640MB | ATI Radeon HD 2900 XT 512MB |
|---|---|---|---|---|
| Core Code | G92 | G80 | G80 | R600 |
| Transistor Count | 754 million | 681 million | 681 million | 700 million |
| Manufacturing Process | 65nm | 90nm | 90nm | 80nm |
| Core Clock | 600MHz | 575MHz | 500MHz | 742MHz |
| Stream/Shader Processing Units | 112 Stream Processors (at 1500MHz) | 128 Stream Processors (at 1350MHz) | 96 Stream Processors (at 1200MHz) | 64 Shader units consisting of 320 Stream Processors (at 740MHz) |
| Texture Mapping Units (TMU) | 56 | 64 | 48 | 16 |
| Render Backend Units (ROP) | 16 | 24 | 20 | 16 |
| Memory Clock | 1800MHz DDR3 | 1800MHz DDR3 | 1600MHz DDR3 | 1650MHz DDR3 |
| Memory Bus | 256-bit | 384-bit | 320-bit | 512-bit |
| Memory Bandwidth | 57.6GB/s | 86.4GB/s | 64.0GB/s | 105.6GB/s |
| Ring Bus Memory Controller | NIL | NIL | NIL | 512-bit |
| PCI Express Interface | PCIe 2.0 (x16) | PCIe (x16) | PCIe (x16) | PCIe (x16) |
| Molex Power Connectors | Yes | Yes (dual) | Yes | Yes (dual) |
| Multi GPU Technology | Yes (SLI) | Yes (SLI) | Yes (SLI) | Yes (CrossFire) |
| DVI Output Support | 2 x Dual-Link | 2 x Dual-Link | 2 x Dual-Link | 2 x Dual-Link |
| HDCP Support | Yes | Yes | Yes | Yes |
| Street Price | US$199 (256MB) / US$259 (512MB) | ~US$529 - US$549 | US$279 - US$349 | ~US$399 |
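As a quick sanity check, the memory bandwidth figures in the table follow directly from bus width and effective memory clock. A minimal sketch in Python (the helper name is ours, purely for illustration):

```python
# bandwidth (GB/s) = bus width in bytes x effective memory clock in GHz
def mem_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * (effective_clock_mhz / 1000)

print(round(mem_bandwidth_gbs(256, 1800), 1))  # GeForce 8800 GT   -> 57.6
print(round(mem_bandwidth_gbs(384, 1800), 1))  # GeForce 8800 GTX  -> 86.4
print(round(mem_bandwidth_gbs(320, 1600), 1))  # GeForce 8800 GTS  -> 64.0
print(round(mem_bandwidth_gbs(512, 1650), 1))  # Radeon HD 2900 XT -> 105.6
```

This makes it clear why the 8800 GT's 256-bit bus is its main handicap against the GTX: at the same memory clock, the narrower bus gives it only two-thirds of the bandwidth.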

Notable Highlights of the GeForce 8800 GT

Besides what we have mentioned earlier, the GeForce 8800 GT is also a milestone as the first consumer graphics card to support the new PCI Express 2.0 specification. PCIe 2.0 promises to double the total bandwidth from 8GB/s (for PCIe x16) to 16GB/s, though this requires a compatible PCIe 2.0 motherboard (Intel's X38, AMD's RD700, NVIDIA's MCP72) to realize its maximum potential. Naturally, the card is also backwards compatible with the prevalent PCIe 1.1 standard that all other motherboards support today. Whether this will mean anything in practical applications is debatable at such an early point in time, but you can count on more such cards in the future. Our hunch is that the extra bandwidth won't be of any significance anytime soon, but it's good to know that the card can handle it when the need arises.
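The doubling falls straight out of the signalling rates: PCIe 1.1 runs at 2.5GT/s per lane and PCIe 2.0 at 5GT/s, with 8b/10b encoding consuming 20% of the raw bits. A rough sketch of the arithmetic (the function name is our own):

```python
# Aggregate PCIe bandwidth (both directions combined).
# PCIe 1.x/2.0 use 8b/10b encoding, so only 8 of every 10 raw bits are data.
def pcie_bandwidth_gbs(transfer_rate_gt: float, lanes: int) -> float:
    per_lane_gbs = transfer_rate_gt * (8 / 10) / 8  # GB/s per lane, one direction
    return per_lane_gbs * lanes * 2                  # two directions

print(round(pcie_bandwidth_gbs(2.5, 16), 1))  # PCIe 1.1 x16 -> 8.0 GB/s
print(round(pcie_bandwidth_gbs(5.0, 16), 1))  # PCIe 2.0 x16 -> 16.0 GB/s
```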

Another notable feature of the GeForce 8800 GT is that it is the first member of the GeForce 8800 series to support HDCP over dual-link DVI. While some of the mid and lower range cards like the GeForce 8600 and 8500 series came with similar HDCP support (depending on the vendor, though all GeForce 8600 GTS cards supported it), the older GeForce 8800 cards only supported HDCP over a single link. This means that HDCP content played on a display requiring a dual-link DVI connection (such as Dell's 30-inch monitors) is downscaled to a lower resolution on these cards, usually 1280 x 800, since that particular monitor handles resolutions of 1280 x 800 and below over a single-link connection and anything higher over dual-link DVI. On a single-link DVI display such as a 24-inch LCD monitor or any HDTV, this issue doesn't exist and 1080p protected content can be viewed in its full glory. Compared to ATI's Radeon HD series, all of which support HDCP over dual link, it is obviously not the best of situations for NVIDIA, even if the user group for dual-link DVI displays is small.

This has been rectified on the new GeForce 8800 GT: NVIDIA has added HDCP support over dual link, and the card will even scale HDCP content to the display's native resolution, such as the 2560 x 1600 of 30-inch LCD monitors, surpassing the 1920 x 1200 maximum supported for protected video on the Radeon HD 2900 XT. Only a few expensive 30-inch monitors currently support such a resolution for HDCP content, so those with the budget should find the GeForce 8800 GT the ideal high-end card for both HD video playback and gaming.

Somewhat related to the launch of the GeForce 8800 GT is the introduction of new ForceWare drivers (167.37 and above). First, there is a new transparency multisampling (TRMS) algorithm that carries a smaller performance penalty than the older version, so users may now find it feasible to use TRMS in addition to conventional anti-aliasing for better image quality. NVIDIA has also added anti-aliasing support for games using Unreal Engine 3, which can be enabled from within the NVIDIA Control Panel.

As for the actual hardware itself, the GeForce 8800 GT is a slim, albeit relatively lengthy graphics card outfitted with a single slot cooler. The fan is quiet by any standard, though it does spin at its peak during boot up. The black metallic shroud on the cooler keeps most of the components on the GeForce 8800 GT under wraps, though most of the cooler felt scalding to our touch after a benchmarking session. At idle however, this was not an issue.

Like the GeForce 8800 Ultra, black is the color for the reference GeForce 8800 GT, though bet on vendors adding their own designs (we mean stickers). It's also a slim, single slot graphics card.

The quiet, single slot cooler is a mixture of aluminum and copper.

1.0ns DDR3 memory chips from Qimonda are found on our reference card.

Like most high-end graphics cards nowadays, solid capacitors are the norm.

The power connector is almost disguised by the cooler's shroud. Only one connector is required and NVIDIA rates the power requirement of the GeForce 8800 GT as around 105W.

The GeForce 8800 GT comes with a pair of dual link DVI outputs supporting HDCP.

NVIDIA rates the TDP of the GeForce 8800 GT at 105W, lower than the 140W of the GeForce 8800 GTX. The 65nm core probably contributes positively to this figure, and the GeForce 8 series seems like an advertisement for the eco-friendly movement when compared to the 215W rating of the Radeon HD 2900 XT.

Test Setup

The NVIDIA GeForce 8800 GT was tested on a system with the following configuration:

  • Intel Core 2 Duo E6700 (@2.66GHz)
  • Intel D975XBX 'Bad Axe' motherboard
  • 2GB DDR2-800 Kingston HyperX
  • Seagate 7200.7 SATA hard drive
  • Windows XP Professional with Service Pack 2
  • DirectX 9.0c

Given the high-end pedigree of the new GeForce 8800 GT, we obviously had to trot out the top guns for our benchmarking. This meant all the existing members of the GeForce 8800 series together with ATI's sole high-end contender, the Radeon HD 2900 XT. In lieu of the reference GeForce 8800 GTS 320MB, we used XFX's extremely overclocked Fatal1ty edition instead, since it happens to be the fastest GeForce 8800 GTS card we have tested.

As for the drivers used, we were unable to procure a Radeon HD 2900 XT to retest with the latest Catalyst, hence the results are based on our initial review done a couple of months back. The other GeForce 8 cards were running on ForceWare 158.19 while the new GeForce 8800 GT was on 167.37. The complete list of cards and drivers is as follows:

  • NVIDIA GeForce 8800 GT 512MB (ForceWare 167.37)
  • NVIDIA GeForce 8800 GTX 768MB (ForceWare 158.19)
  • NVIDIA GeForce 8800 GTS 640MB (ForceWare 158.19)
  • XFX GeForce 8800 GTS 320MB Fatal1ty (ForceWare 158.19)
  • ATI Radeon HD 2900 XT 512MB (8.37 Catalyst drivers)

Meanwhile, the benchmarks tested are as follows:

  • Futuremark 3DMark06 (ver. 102)
  • Tom Clancy's Splinter Cell: Chaos Theory (ver 1.3)
  • F.E.A.R
  • Supreme Commander
  • Company of Heroes (ver 1.3)
  • Unreal Tournament 3 Beta Demo

     

Results - 3DMark06 Pro (ver 102)

With the older drivers (ForceWare 158.19) that we had for the GeForce 8800 GTX, the new GeForce 8800 GT in fact narrowly edged out the GTX at the lower resolutions. When anti-aliasing was enabled, this advantage disappeared. Since we suspected that our drivers were the reason for the GeForce 8800 GT's slim victories at 1600 x 1200 and below, we reran the GeForce 8800 GTX with the same 167.37 drivers and this time, the GTX was ahead, albeit by a similarly minor margin. Thanks to its high clock speeds, this GeForce 8800 GT looks to be almost on par with the GeForce 8800 GTX. The importance of its 512MB memory buffer was also seen in this benchmark, as the highly overclocked XFX card came within a whisker of matching the GeForce 8800 GT, even with older drivers. This margin however widened at higher resolutions, which is a good advert for having more memory, especially for high-end cards. The original 640MB GeForce 8800 GTS looked well beaten and its days may well be numbered.

Results - Splinter Cell: Chaos Theory & F.E.A.R (DirectX 9 Benchmarks)

In Splinter Cell, the GeForce 8800 GT was neck and neck with the overclocked XFX GeForce 8800 GTS. Without anti-aliasing, the GeForce 8800 GTX maintained its perch at the top of the charts, though things got more competitive when we enabled it. ATI's Radeon HD 2900 XT started catching up with the leaders and at 1920 x 1440, even edged out the GeForce 8800 GTX.

F.E.A.R was another game where the Radeon HD 2900 XT showed that it shouldn't be underestimated, especially without anti-aliasing. The GeForce 8800 GT and the XFX were both slightly slower than the ATI card here, though the tables were turned once anti-aliasing came into play. Again, the overclocked XFX was a threat to the GeForce 8800 GT at the lower resolutions but faded once the settings were ramped up. The GeForce 8800 GTS meanwhile had to be content with last position for most of the settings, and this should also give an indication of how a 320MB version of the reference GeForce 8800 GTS would fare in these benchmarks.

Results - Supreme Commander

The GeForce 8800 GT was again impressive in this real time strategy game and came rather close to the top GeForce 8800 GTX on many occasions. However, it was still unable to break away from the Fatal1ty GeForce 8800 GTS, with both cards more or less equal at many resolutions. Ignoring this special edition card however, one could see the GeForce 8800 GT leaning heavily towards the GeForce 8800 GTX in benchmarks rather than the GeForce 8800 GTS (represented by the 640MB version here).

Results - Company of Heroes & Unreal Tournament 3 Beta Demo

Company of Heroes saw the GeForce 8800 GT taking a rather large lead over the GeForce 8800 GTX, though remember the GTX was on older drivers. The XFX however showed its weakness here as it finished last despite the higher clocks. When it came to the Radeon HD 2900 XT, the GeForce 8800 GT should be at least on par with its ATI rival.

We encountered some problems on the GeForce 8800 GT in Quake 4 when 4xAA was enabled. While we could run it, the frame rates were much lower than expected. This is likely a driver glitch, but we could not confirm it at the time of writing. In its place, we have the beta demo of the widely anticipated Unreal Tournament 3 instead. As the actual game is not available yet, the beta demo did not have a complete set of benchmarking tools like the previous versions. There is, however, a repeatable flyby demo that reports frame rates, and we used the VCTF map Suspense in this case.

It turned out that high clock speeds mean a lot in UT3, with the XFX churning out higher frame rates than the GeForce 8800 GT, due to its excessively high clocks. The GeForce 8800 GT remained slightly slower than the GeForce 8800 GTX and there was no doubting its superiority over the vanilla GeForce 8800 GTS, whether it be 320MB or 640MB.

Temperature

Temperature could be a potential sticking point for the GeForce 8800 GT. From our observations, the single slot cooler for the GeForce 8800 GT is stretched to its limits here, with recorded temperatures of the core much higher than the GeForce 8800 GTX at full load. The memory chips were also found to be among the highest for the high-end cards. From the looks of the many overclocked versions to be released by vendors, heat doesn't seem to worry the manufacturers, though custom coolers could be an option.

Power Consumption

On paper, the TDP of the GeForce 8800 GT alone is 105W, but in our test system the complete system's power draw reached 225W at full load. This is still the lowest full-load figure among all the cards featured here; only the Radeon HD 2900 XT recorded a lower idle power consumption figure.

Overclocking

As evident from the overclocked editions of the GeForce 8800 GT soon to pop up in retail, this card does seem capable of much higher clocks than its default, probably due to its 'cooler' 65nm core, which didn't seem that cool to us during testing. We did not actually have the time to try out the absolute limits of our reference card, but we did test it with an overclocked BIOS provided by NVIDIA. This BIOS pushed the core clock to 650MHz while the memory clocks were raised to 2000MHz DDR. Our 3DMark06 scores are shown below and it is quite apparent that the GeForce 8800 GTX faces serious competition here, particularly if overclocked editions are as popular as they are nowadays.
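For reference, the overclocked BIOS figures quoted above work out to fairly healthy gains over the reference clocks (600MHz core, 1800MHz effective memory); a quick calculation (helper name is ours):

```python
# Percentage gain of an overclocked setting over the stock clock.
def oc_gain_percent(stock_mhz: int, overclocked_mhz: int) -> float:
    return (overclocked_mhz - stock_mhz) / stock_mhz * 100

print(round(oc_gain_percent(600, 650), 1))    # core clock gain   -> 8.3 (%)
print(round(oc_gain_percent(1800, 2000), 1))  # memory clock gain -> 11.1 (%)
```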

The New Slim Shady

With the holiday season almost upon us, the new PC game releases are starting to resemble a flood of who's who in the gaming industry. Titles like Gears of War, Hellgate: London, Unreal Tournament 3 and Crysis are slated for release in the next month or so. More importantly, these are highly anticipated, A-list titles with DirectX 10 support that Microsoft is probably betting the bank on to jumpstart its Games for Windows initiative. Similarly for the graphics card industry, some of these games are the 'killer apps' that everyone is hoping will lead to greater adoption of DirectX 10 hardware.

There's no questioning its performance. The GeForce 8800 GT combines the power of the 8800 series with the PureVideo capabilities of the GeForce 8600/8500, making it a compact, albeit warm solution for all purposes.

This means that almost a year after the first DirectX 10 graphics card, the G80, was unveiled, the early adopters who bought the GeForce 8800 GTS/GTX will finally get a spread of DirectX 10 games to choose from. And if that's not enough of a slap to early adopters, NVIDIA's new GeForce 8800 GT threatens to undercut their prized and expensive purchases. No doubt, the higher bandwidth and slightly superior hardware of the GeForce 8800 GTX will let it keep its place at the top of the charts. What the GeForce 8800 GT has shown us today is that there is competition even at the top, with scores almost at the level of a GeForce 8800 GTX in all but the highest resolutions and settings. As for the GeForce 8800 GTS, forget about it. Only the most extreme overclocked versions, like the XFX Fatal1ty edition, might survive the impending purge.

And this is solely from a performance aspect. What about the additional PureVideo HD hardware that is found on the GeForce 8800 GT but not on older GeForce 8800 cards? Lower CPU utilization for HD content is a definite plus, and arguably more useful now, with the prospect of lower prices for Blu-ray and HD DVD media and hardware compared to a year ago. Not to mention that HDCP over dual link, required for HDCP content playback at the highest resolutions (2560 x 1600), is supported on the GeForce 8800 GT but not on the GeForce 8800 GTS/GTX. With the exception of running warmer than expected for a 65nm core (not that it's stopping the overclockers, and the GeForce 8800 GTX had better fear the overclocked GeForce 8800 GT), the GeForce 8800 GT is the slimmer, more attractive option.

The contrast is especially stark when you consider the US$259 price point that NVIDIA has mandated for the 512MB edition of the GeForce 8800 GT. NVIDIA expects 256MB versions to be released in the next few weeks, with a recommended price of US$199. Meanwhile, retail availability looks good, with stocks already in stores as we speak. For those who do not have large displays (read: 24-inch and larger LCD monitors), the 256MB edition is adequate, for current games at least. We can't rule out newer games having much higher frame buffer requirements, so for those planning ahead, you would be better off topping up US$50 more for the 512MB edition. Compared to the US$500 you can expect to pay for a GeForce 8800 GTX, one may well wonder: why not get two GeForce 8800 GT cards and try out SLI instead? ATI will probably be feeling very anxious about its upcoming high-end Radeon HD 3800, since NVIDIA has certainly played its card with the GeForce 8800 GT, and it's looking like an ace.
