NVIDIA GeForce 8600 GTS: The Full Review!
NVIDIA gives you one less excuse not to go Windows Vista and DirectX 10 by introducing its new mid-range GeForce 8600 series with DirectX 10 support and an improved PureVideo HD engine. We took the top card in the series, the GeForce 8600 GTS, for a spin and here's our verdict.
By HardwareZone Team
Mainstream DirectX 10
When NVIDIA introduced the first DirectX 10 graphics card for the PC last year, the GeForce 8800 GTX, most of us could only gawk at its powerful performance and equally impressive price tag from the sidelines. It was undoubtedly the fastest card then and it still is, but it also remains out of the reach of most users. Those with limited budgets could only look forward to the inevitable lower and mid-range variants of NVIDIA's new architecture.
Fast forward six months and the highly anticipated mid-range GeForce 8 cards are finally ready to be unveiled, with two separate series (the GeForce 8600 and 8500) catering to a range of budgets. Leading the charge is the GeForce 8600 GTS, equipped with a new 80nm core (G84) based on NVIDIA's GeForce 8 architecture and promising even more PureVideo HD enhancements. This series looks set to be the intended mid-range successor to NVIDIA's highly successful GeForce 7600 series, especially given the similar naming convention. Presently, NVIDIA has three different GeForce 8600/8500 cards shipping before May, namely the GeForce 8600 GTS, the 8600 GT and the 8500 GT (in decreasing order of performance and price). For this article, we shall focus mostly on the fastest of the three, the GeForce 8600 GTS. First, let's take a look at how this newcomer stacks up on paper against some of its likely competitors in the market now.
Model | NVIDIA GeForce 8600 GTS 256MB | NVIDIA GeForce 8800 GTS 320MB | NVIDIA GeForce 7950 GT 512MB | ATI Radeon X1950 PRO 256MB | ATI Radeon X1950 GT 256MB |
Core Code | G84 | G80 | G71 | RV570 | RV570LE |
Transistor Count | 289 million | 681 million | 278 million | 330 million | 330 million |
Manufacturing Process (microns) | 0.08 | 0.09 | 0.09 | 0.08 | 0.08 |
Core Clock | 675MHz | 500MHz | 550MHz | 575MHz | 500MHz |
Vertex Shaders | 32 Stream Processors (operating at 1450MHz) | 96 Stream Processors (operating at 1200MHz) | 8 | 8 | 8 |
Rendering (Pixel) Pipelines | NIL (unified shaders) | NIL (unified shaders) | 24 | 12 | 12 |
Pixel Shader Processors | NIL (unified shaders) | NIL (unified shaders) | 24 | 36 | 36 |
Texture Mapping Units (TMU) or Texture Filtering (TF) units | 16 | 48 | 24 | 12 | 12 |
Raster Operator units (ROP) | 8 | 20 | 16 | 12 | 12 |
Memory Clock | 2000MHz DDR3 | 1600MHz DDR3 | 1400MHz DDR3 | 1380MHz DDR3 | 1200MHz DDR3 |
DDR Memory Bus | 128-bit | 320-bit | 256-bit | 256-bit | 256-bit |
Memory Bandwidth | 32.0GB/s | 64.0GB/s | 44.8GB/s | 44.1GB/s | 38.4GB/s |
Ring Bus Memory Controller | NIL | NIL | NIL | 512-bit (for memory reads only) | 512-bit (for memory reads only) |
PCI Express Interface | x16 | x16 | x16 | x16 | x16 |
Molex Power Connectors | Yes | Yes | Yes | Yes | Yes |
Multi GPU Technology | Yes (SLI) | Yes (SLI) | Yes (SLI) | Yes (Native CrossFire ready) | Yes (Native CrossFire ready) |
DVI Output Support | 2 x Dual-Link | 2 x Dual-Link | 2 x Dual-Link | 2 x Dual-link | 2 x Dual-link |
HDCP Output Support | Yes | Yes | Yes | Yes | Yes |
Street Price | US$199 - 229 (SRP) | ~ US$299 - 309 | ~ US$219 - 249 | ~ US$159 - 199 | ~ US$149 |
The G84 Core
The NVIDIA GeForce 8600 GTS uses a brand-new variant of the GeForce 8 core, the G84, manufactured on an 80nm process compared to the 90nm of the original G80. As might be expected of a mid-range card, the G84 core is a watered-down version of the G80, with only 32 unified shaders (or as NVIDIA calls them, stream processors). This is a third of the 96 found on the GeForce 8800 GTS 320MB. NVIDIA did compensate somewhat by clocking these stream processors at 1450MHz instead of the 1200MHz on the GeForce 8800 GTS. The core clock, at 675MHz, is the highest we have seen for a GeForce 8 card so far, a full 100MHz faster than the GeForce 8800 GTX.
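To put that trade-off in perspective, here's a rough back-of-the-envelope comparison of raw shader throughput (our own sketch using only the clocks and shader counts quoted above; it ignores architectural details, so treat it as a ceiling comparison, not a benchmark):

```python
# Rough relative shader throughput: stream processors x shader clock.
# Uses only the figures quoted above and ignores scheduling, texture and
# ROP limits, so treat it as a ceiling comparison, not a benchmark.
g84_sps, g84_shader_mhz = 32, 1450    # GeForce 8600 GTS
g80_sps, g80_shader_mhz = 96, 1200    # GeForce 8800 GTS 320MB

g84_raw = g84_sps * g84_shader_mhz    # 46,400
g80_raw = g80_sps * g80_shader_mhz    # 115,200

print(f"8800 GTS vs 8600 GTS raw shader ratio: {g80_raw / g84_raw:.2f}x")  # ~2.48x
# The 1450MHz shader clock claws back some of the deficit from having only
# a third of the stream processors, but far from all of it.
```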
Memory Matters
All these high clocks, however, are constrained by the narrow 128-bit memory bus on the new cards. Despite hopes that we would see an increase in memory bandwidth for this new generation of graphics cards, NVIDIA has stuck with a 128-bit bus for the GeForce 8600 and 8500. This is the same as the GeForce 7600 series and gives the GeForce 8600 GTS a total bandwidth of 32.0GB/s even with its vastly superior 2000MHz DDR memory clock, which compares unfavorably against existing cards with a 256-bit memory bus like the GeForce 7900 GS. Appropriately, the memory size is set at 256MB; given these bandwidth constraints, we doubt a larger frame buffer would significantly boost performance. The crux of the matter is keeping costs low, and the specifications of this new series clearly look to be doing just that.
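As a quick sanity check on these numbers, here is a minimal sketch (our own illustration in Python, not anything from NVIDIA) of the standard peak-bandwidth formula applied to the figures from the spec table:

```python
def peak_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak theoretical memory bandwidth in GB/s:
    (bus width in bytes) x (effective DDR data rate in MT/s)."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

print(peak_bandwidth_gb_s(128, 2000))  # GeForce 8600 GTS -> 32.0 GB/s
print(peak_bandwidth_gb_s(320, 1600))  # GeForce 8800 GTS -> 64.0 GB/s
print(peak_bandwidth_gb_s(256, 1400))  # GeForce 7950 GT  -> 44.8 GB/s
```

The narrow bus is the whole story here: even a 2000MHz DDR memory clock cannot make 16 bytes per transfer keep pace with the 32 or 40 bytes the 256-bit and 320-bit cards move.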
Power and Other Notes
As you may have noticed, the transistor count for the G84 core has been significantly reduced, from the record-breaking 681 million of the GeForce 8800 series (G80 core) to a more reasonable 289 million for the mid-range parts. Power consumption will definitely be lower than its high-end counterparts (NVIDIA rates the max TDP of the GeForce 8600 GTS at 71W), and of the three new cards, only the GeForce 8600 GTS requires a 6-pin Molex power connector; the slower GeForce 8600 GT and 8500 GT will not need one. NVIDIA's minimum PSU recommendation is a 350W unit for a single GeForce 8600 GTS and 450W if you intend to set up a pair of these cards in SLI.
On a related note, NVIDIA has also said that, in a break from the past, these new cards will have identical clocks for both 2D and 3D modes. In other words, the clock speeds are no longer lowered in 2D and raised when running 3D applications. NVIDIA feels that with the modest TDP of the GeForce 8600/8500 series, there are few power savings to be had by doing so. While we would beg to differ, the fact is that Windows Vista continuously utilizes the GPU to keep its fancy Aero user interface running slick and pretty, and Vista (fortunately or unfortunately) will very likely become the de facto operating system in time, so NVIDIA's decision sounds valid. However, Windows XP users could still have benefited from further power savings had NVIDIA kept dual clock speeds. Looking at the future usage model, and from the perspective of easing GPU design, we can understand this decision.
So far, looking at the modest specifications, the NVIDIA GeForce 8600 GTS doesn't look like it will be setting any speed records in the gaming department, but NVIDIA has something else up its sleeve, and that is none other than a major boost to its PureVideo HD technology. We'll discuss the enhancements next.
More PureVideo HD Goodness
Starting with the GeForce 6 series, NVIDIA began to bolster the video processing capabilities of its graphics cards with a dedicated processor along with software and driver support, collectively named PureVideo technology. This move was in line with the growing use of high-definition video codecs like H.264 and VC-1 in movies and HD media content. PureVideo helps accelerate these, as well as common MPEG-2 and WMV content, by offloading some of the processing from the CPU to the GPU. Additionally, PureVideo includes post-processing to keep video quality true to what was intended. More recently, with developments in HD media formats such as Blu-ray and HD DVD, NVIDIA added to this technology in the GeForce 7 series with a new version known as PureVideo HD, which brings proper hardware acceleration to the table for HD media playback.
While it is true that a PureVideo HD capable GeForce 7 graphics card reduces CPU utilization when playing back HD movies on a PC, it is not perfect, and NVIDIA itself acknowledges as much by stating that a dual core processor is still recommended for optimal HD video playback. The next step is obvious, and it is something NVIDIA now claims to have achieved with the new GeForce 8600 and 8500. These cards feature a newer and presumably improved video processor along with an integrated bitstream processor to enable full HD decoding. According to NVIDIA, these mainstream GeForce 8600 and 8500 cards are the world's first to almost completely offload the processing needed for both HD DVD and Blu-ray video playback. Although we're always wary of superlative claims like these, we believe that, at the least, the new cards will massively reduce the HD video decoding workload on the CPU. As seen in the CPU utilization chart below presented by NVIDIA, we concur with its findings for a system with and without PureVideo HD. All that's left now is to verify whether CPU utilization on the new GeForce 8600/8500 class stays true to the chart, and we'll look into that as soon as we've acquired the right components.
The GeForce 8600/8500 series will be the first to come with NVIDIA's 2nd generation video processor, aided by an integrated bitstream processor to lessen the CPU workload by taking over the HD video processing.
The diagram illustrates the benefits of having a GeForce 8600/8500 card. As you can see, thanks to the additional bitstream processor, HD video decoding can now be fully offloaded to the graphics card, where previously certain processes still needed the attention of the CPU.
According to NVIDIA, the CPU utilization falls quite dramatically once a GeForce 8600/8500 is used to handle HD playback, compared to the older GeForce 7 series.
The new video processing engine is currently found only on the GeForce 8600 and 8500, embedded within the G84 and G86 cores respectively (the upcoming GeForce 8400 series, also based on the G86 core, will support it as well). Hence, the older but faster GeForce 8800 series does not have this hardware, giving rise to the possibility of lower CPU utilization numbers on a system equipped with a GeForce 8600/8500 compared to a more expensive system featuring a GeForce 8800 GTX. NVIDIA has said that it believes users who have chosen the more powerful (in 3D performance) GeForce 8800 GTX are enthusiasts who likely possess very capable systems. For these users, gaming performance is paramount, and the GeForce 8800 is still extremely competent for HD video playback, especially coupled with the high-end CPUs found in these systems. Therefore, for the near future, we are unlikely to find the new VP2 and bitstream processor on the GeForce 8800 (not even in their upcoming GeForce 8800 SKU).
Furthermore, NVIDIA has assured us of PureVideo HD support in Windows Vista for both the GeForce 7 and 8 series. However, for those using Windows XP, the new GeForce 8600 and 8500 cards will only gain PureVideo HD support in June 2007, so you may want to wait a bit if you're on the older operating system.
The NVIDIA GeForce 8600 GTS 256MB
Compared to the gigantic GeForce 8800 cards, the GeForce 8600 GTS is drastically reduced in both length and width. A conventional single-slot cooler is sufficient for this card, though it was quite audible. However, we have already seen some passively cooled retail versions from vendors, so if noise is an issue, you may opt for those special editions.
A slim mid-sized cooler is sufficient for this new addition to the GeForce 8 family.
The standard core clock is 675MHz while the memory runs at 2000MHz DDR. Again, expect some retail versions to come overclocked out of the box, though prices will of course likely be higher than the standard versions. As it is, NVIDIA has specified a broad price range of between US$199 and US$229 for the GeForce 8600 GTS. Overclocked versions or those with exotic cooling solutions will probably sell at the higher end of that spectrum.
The 80nm G84 core.
SLI support is present and the usual NVIDIA SLI bridge will link two GeForce 8600 GTS cards in tandem. At the moment, DirectX 10 SLI support is still missing, but NVIDIA has said this will be rectified by early May when new drivers are released. Of course, this is a minor issue for now, since there are no DirectX 10 applications to speak of, and there should be no problems with existing DirectX 9 programs.
Samsung's 1.0ns DDR3 memory chips are used, enabling the memory clocks to start at a speedy 2000MHz DDR.
The GTS requires the usual 6-pin Molex power connector, though the other cards in the GeForce 8600 and 8500 series will not. A mix of solid and electrolytic capacitors is used, with the majority being solid.
You can expect both dual-link DVI outputs on this card to be HDCP compliant. The bonus here is that both outputs support HDCP over dual-link, which means they can output high-definition content to 30-inch and larger HDCP-compliant displays. Also, while the GeForce 8600 GTS is HDCP compliant, NVIDIA has left it to individual vendors to implement HDCP on the GeForce 8600 GT and 8500 GT, so this is not a standard feature on those 'lesser' cards.
Test Setup
Our usual graphics test system was used to benchmark the GeForce 8600 GTS. Consisting of an Intel Core 2 Duo E6700 (2.66GHz) matched with an Intel D975XBX 'Bad Axe' motherboard, this high-end system had 2GB of DDR2-800 Kingston HyperX memory modules running in dual channel mode. A Seagate 7200.7 SATA 80GB hard drive completed the setup, installed with Windows XP Professional updated with Service Pack 2 and DirectX 9.0c.
We tested the GeForce 8600 GTS using ForceWare 158.16 drivers provided by NVIDIA (which will soon be official). The other cards and their respective drivers are listed below. We did not have scores for some of the older cards at higher resolutions like 1920 x 1440; in those cases, we have labeled them 'N.A.'.
- NVIDIA GeForce 8600 GTS 256MB (ForceWare 158.16)
- NVIDIA GeForce 8800 GTS 320MB (ForceWare 97.02)
- NVIDIA GeForce 7950 GT 512MB (ForceWare 93.71)
- NVIDIA GeForce 7900 GS 256MB (ForceWare 93.71)
- ATI Radeon X1950 PRO 256MB (Catalyst 7.2)
- ATI Radeon X1950 GT 256MB (Catalyst 7.2)
The benchmarks tested are listed below:
- Futuremark 3DMark05 (ver. 1.2.0)
- Futuremark 3DMark06 (ver. 1.0.2)
- Tom Clancy's Splinter Cell: Chaos Theory (ver 1.3)
- F.E.A.R
- Company of Heroes (ver 1.3)
- Quake 4 (ver 1.2)
Results - 3DMark05 Pro & 3DMark06
The GeForce 8600 GTS got off to a decent start in 3DMark05, with the card taking second spot at 1280 x 1024 behind the GeForce 8800 GTS. However, as we progressed up the resolutions, the card was overtaken by the Radeon X1950 series at 1920 x 1440. This trend was repeated in 3DMark06, though not to the same extent.
Things did improve slightly in 3DMark06 with anti-aliasing enabled, and we found the GeForce 8600 GTS taking the runner-up position again. However, it could not complete the benchmark at 1920 x 1440 due to its inherently small frame buffer. Overall, the GeForce 8600 GTS' processing core holds a lot of promise, but the 128-bit memory bus seems to be stifling it, particularly at higher resolutions where memory bandwidth and size become increasingly important. While it managed to stay roughly in line with the performance-class cards of the previous generation, the GeForce 8800 GTS was up to 40% faster than the GeForce 8600 GTS in these benchmarks! One wonders what NVIDIA has in store to bridge this huge gap, as leaving it open will only invite competition from its rival, and we are sure NVIDIA knows that already.
Results - Splinter Cell: Chaos Theory (DirectX 9 Benchmark)
Following the somewhat disappointing 3DMark scores, the GeForce 8600 GTS continued to fare poorly against its competitors. In Splinter Cell, it was around the performance level of the GeForce 7900 GS and the Radeon X1950 GT. When we enabled anti-aliasing with HDR, it was even eclipsed by the Radeon X1950 GT. It seems the newcomer is not much of a price-performance competitor as of now.
Results - F.E.A.R (DirectX 9 Benchmark)
F.E.A.R proved to be another stern test for the new mid-range contender, with the GeForce 8600 GTS only narrowly edging out the GeForce 7900 GS in the first set of results. When anti-aliasing was factored in, the GeForce 7900 GS actually managed to beat the GeForce 8600 GTS. Again, the limited memory bandwidth may be a large factor here, since performance dipped at the more intensive settings.
Results - Quake 4 & Company of Heroes (SM2.0+ Benchmarks)
In both these games, the GeForce 8600 GTS did not impress us with its numbers. Performance faltered again as resolutions increased in Quake 4, while in Company of Heroes the GeForce 8600 GTS was simply not competitive with cards like the Radeon X1950 GT and PRO; in fact, it scored almost identically to the GeForce 7900 GS. We also had an issue in Quake 4 whereby the game would crash to the desktop whenever we tried to change the in-game resolution through the game settings menu. This is probably a driver-related problem, but it was quite annoying all the same.
Temperature Testing
Although its clock speeds are high by GeForce 8 standards, the temperatures we recorded were quite cool, never exceeding 50 degrees Celsius. The core itself was a relatively cool 47.7 degrees, almost 10 degrees lower than competing products like the GeForce 8800 GTS or even the 80nm-core-based Radeon X1950 PRO.
Overclocking
We had problems overclocking the GeForce 8600 GTS in our initial experience with the card. This issue evaporated once we used the newer 158.16 drivers provided, and we could easily overclock the card through NVIDIA's nTune utility. There was a fair amount of overclocking tolerance in this mainstream graphics card, and from its default core clock of 675MHz, we managed to hit a decent 750MHz. At these speeds, the GeForce 8600 GTS could match the Radeon X1950 XT at lower resolutions in 3DMark06.
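For the curious, here is a quick sketch of what that margin works out to, along with the theoretical fill rates derived from the ROP and TMU counts in our spec table (an illustration only; real-world gains depend on the workload, and the 32.0GB/s memory bandwidth is untouched by a core overclock):

```python
# Overclocking headroom we measured, plus derived theoretical fill rates
# (core clock x ROPs / TMUs from the spec table). Illustration only.
default_core, oc_core = 675, 750   # MHz
rops, tmus = 8, 16                 # GeForce 8600 GTS

headroom_pct = (oc_core - default_core) / default_core * 100
print(f"Core headroom: {headroom_pct:.1f}%")  # ~11.1%
print(f"Pixel fill: {default_core * rops / 1000:.1f} -> {oc_core * rops / 1000:.1f} Gpixels/s")   # 5.4 -> 6.0
print(f"Texel fill: {default_core * tmus / 1000:.1f} -> {oc_core * tmus / 1000:.1f} Gtexels/s")   # 10.8 -> 12.0
```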
Conclusion
Not impressed. That's how we would describe our benchmarking experience with the GeForce 8600 GTS so far. Although we had a hint of this when testing an overclocked GeForce 8600 GTS in our performance preview article, we had hoped the problem was with immature drivers. Now that we have tested the standard model using ForceWare 158.16 (a strangely large leap from the 100.95 beta drivers we used previously), performance has not improved much at all.
To sum up, the GeForce 8600 GTS still struggles against budget high-end cards from the previous generation, like NVIDIA's own GeForce 7900 GS and ATI's Radeon X1950 PRO. These cards have a superior 256-bit memory bus and, in many cases, come overclocked and with a 512MB frame buffer. Such buffed versions will easily beat the GeForce 8600 GTS in most games and 3D applications today; the 128-bit memory bus of the GeForce 8600 GTS is probably a serious limitation here. Of course, it may not be fair to compare these cards directly, but with the low prices of these former high-end cards, the market has placed them head to head, and in some cases they are much cheaper than the new GeForce 8600 GTS (US$199 - US$229).
A chief concern is the yawning performance gap between the GeForce 8600 GTS and the GeForce 8800 GTS 320MB. Unless there's another model waiting in the shadows that we are unaware of, a GeForce 8600 Ultra for example, the GeForce 8600 GTS is presently the bridge between the mid-range 8600 series and the higher-end, enthusiast-oriented 8800s, and there is a rather wide disparity between the two. This difference cannot be easily narrowed even with the expected overclocking from vendors and enthusiasts alike (as we've seen in our preview article).
Speaking of overclocking, we feel that this aspect could well be the hidden 'weapon' for this mid-range card, at least from the perspective of the enthusiasts. We got a decent overclocking margin in our testing and we bet that others could do better, especially with more extreme methods. At the very least, expect to find a variety of clock speeds on retail models available soon.
The main appeal of this mid-range GeForce 8600 GTS could lie in its PureVideo HD enhancements. If the figures presented by NVIDIA hold up, then HD video playback on a PC will no longer require a dual core processor for a smooth, lag-free experience. A single core could do just fine, provided it has a GeForce 8600/8500 card to take over the decoding. This focus on multimedia applications, especially high-definition playback, could make it suitable for HTPC setups, and we think it's the right approach for a graphics card targeted at the mainstream.
In the end, the GeForce 8600 GTS is still a decent graphics card for gaming, and as added 'insurance' it supports DirectX 10, unlike most existing cards in this class and price range. Who knows what its performance will be like in those future games. PureVideo HD, meanwhile, looks to have taken a giant step forward with the new video processor onboard the GeForce 8600/8500, and that is perhaps the most interesting aspect of this new mid-range series. Finally, even as we speak, retail boxes are probably being put on display at retailers worldwide. Judging from the many complete retail versions that have landed in our labs for review, it's looking like another hard product launch from NVIDIA, and for that, we take our hats off.