On the Bench: nVidia's AGP-based GeForce 6600 GT

By Dave Salvator  |  Posted 2004-11-15

Review: nVidia's long-awaited price/performance GeForce 6 GPU now comes in an AGP flavor. How does it stack up against its PCI Express-based cousin? We find out.

In the latest running of the GPU bulls, there was a bit of a snag in the red cape: the platform transition from AGP to PCI Express. Because large OEMs were locking and loading to put PCI Express systems on the market, and because OEM business carves out 85-90% of the discrete PC graphics pie, both ATI and nVidia were compelled to offer PCI Express midrange GPUs first, with AGP-based offerings playing the part of follow-ons. That was great for OEMs, but it left many upgraders stuck in the waiting game. Well, if you were waiting for the GeForce 6600 GT to make its way to AGP, your wait is just about over.

The new AGP-based offering has arrived, and it brings some good (though not unexpected) news: there isn't yet any meaningful performance difference between the PCIe and AGP versions. Given that current-generation games were spec'ed out in an AGP universe, this isn't earth-shattering, but it is nonetheless good news for gamers with AGP systems who have been champing at the bit to get NV40 goodness into their machines. nVidia is first out of the gate with its AGP-based midrange offering, which, depending on how quickly it can populate the retail channel with product, could give it a leg up this holiday season.

Today, we take an abbreviated look at the 6600 GT and show you most of what you'd expect, though we did uncover one surprise. Let's take a look.

Here's the tale of the tape for the AGP-based GeForce 6600 GT as well as the latest GPUs from both nVidia and ATI.

| | GeForce 6800 Ultra | GeForce 6800 GT | GeForce 6800 non-Ultra | GeForce 6600 GT (PCIe) | GeForce 6600 GT (AGP) | GeForce 6600 | ATI Radeon X800 XT | ATI Radeon X800 Pro |
|---|---|---|---|---|---|---|---|---|
| Transistor count | 222 million | 222 million | 160 million | 146 million | 146 million | 146 million | 160 million | 160 million |
| Manufacturing process | 0.13-micron | 0.13-micron, low-k dielectric | 0.13-micron | 0.11-micron | 0.11-micron | 0.11-micron | 0.13-micron, low-k dielectric | 0.13-micron, low-k dielectric |
| Core clock speed | 400MHz | 350MHz | 325MHz | 500MHz | 500MHz | 300MHz | 520MHz | 475MHz |
| Pixel pipes | 16 | 16 | 12 | 8 | 8 | 8 | 16 | 12 |
| Texturing units | 16 | 16 | 12 | 8 | 8 | 8 | 16 | 12 |
| Vertex pipelines | 6 | 6 | 6 | 4 | 4 | 3 | 6 | 6 |
| Peak pixel fill rate (theoretical) | 6.4Gpixels/sec | 5.6Gpixels/sec | 3.9Gpixels/sec | 4Gpixels/sec | 4Gpixels/sec | 2.4Gpixels/sec | 8.3Gpixels/sec | 5.7Gpixels/sec |
| Peak texture fill rate (theoretical) | 6.4Gtexels/sec | 5.6Gtexels/sec | 3.9Gtexels/sec | 4Gtexels/sec | 4Gtexels/sec | 2.4Gtexels/sec | 8.3Gtexels/sec | 5.7Gtexels/sec |
| Memory interface | 256-bit | 256-bit | 256-bit | 128-bit | 128-bit | 128-bit | 256-bit | 256-bit |
| Memory clock speed | 1.1GHz DDR-3 | 1GHz DDR-3 | 700MHz DDR-2 | 1GHz DDR-3 | 900MHz DDR-3 | TBD by board maker, DDR-1 | 1.2GHz DDR-3 | 730MHz DDR-3 |
| Peak memory bandwidth | 35.2GB/sec | 32GB/sec | 22.4GB/sec | 16GB/sec | 14.4GB/sec | Varies with memory clock speed | 36.8GB/sec | 23.4GB/sec |

In terms of straight pixel-pumping horsepower, the 6600 GT actually stacks up pretty well against the 6800 non-Ultra, although it has a good bit less peak memory bandwidth. The 6600 GT's 128-bit memory interface limits it to "only" 16GB/sec of peak memory bandwidth, which is still a boatload of bandwidth; it just pales next to the high-end GPUs, whose wider interfaces deliver 30GB/sec or more in many cases.
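The table's theoretical figures fall straight out of the clock speeds and interface widths. As a quick sanity check of the 6600 GT numbers (a sketch using the specs from the table above):

```python
# Sanity-check the 6600 GT's theoretical figures from the spec table.
# Peak pixel fill = pixel pipes x core clock; peak memory bandwidth =
# (bus width in bytes) x effective (DDR) memory clock.

def pixel_fill_gpix(pipes: int, core_mhz: int) -> float:
    """Theoretical peak pixel fill rate in Gpixels/sec."""
    return pipes * core_mhz / 1000

def mem_bandwidth_gbs(bus_bits: int, mem_mhz: int) -> float:
    """Peak memory bandwidth in GB/sec, given the effective DDR clock."""
    return (bus_bits / 8) * mem_mhz / 1000

print(pixel_fill_gpix(8, 500))       # 6600 GT: 4.0 Gpixels/sec
print(mem_bandwidth_gbs(128, 1000))  # PCIe version: 16.0 GB/sec
print(mem_bandwidth_gbs(128, 900))   # AGP version: 14.4 GB/sec
```

Run those numbers for the 6800 GT (16 pipes at 350MHz, 256-bit at 1GHz) and you get the 5.6Gpixels/sec and 32GB/sec figures from the table as well.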

One difference to note: the PCIe version of the 6600 GT has an 11% memory clock advantage, and that difference made itself felt in the game tests we ran (more on that in a bit). The AGP version also needs a single Molex power connector to supply the additional power the AGP slot can't deliver, whereas the PCIe version gets all the power it needs from the PCIe slot. Those points aside, the parts are identical.
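That 11% figure is simply the gap between the two cards' effective memory clocks (1GHz for PCIe versus 900MHz for AGP). As a quick check:

```python
# Memory clock advantage of the PCIe 6600 GT over the AGP version,
# using the effective DDR-3 clocks from the spec table.
pcie_mhz = 1000  # 1GHz effective
agp_mhz = 900    # 900MHz effective
advantage = (pcie_mhz - agp_mhz) / agp_mhz
print(f"{advantage:.1%}")  # 11.1%
```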

The AGP version uses nVidia's High-Speed Interconnect (HSI) bridge to run on AGP platforms, and judging from our performance data, it introduces no meaningful overhead. As part of its ramp, nVidia first rolled out PCI Express versions of the 6600 GT and 6600 to system OEMs; add-in card makers are now preparing to ship AGP-based 6600 GT boards. Despite its HSI bridge strategy, nVidia ramped PCI Express cards first to meet anticipated demand from new systems built on Intel's Grantsdale and Alderwood platforms, since OEM business comprises 85-90% of the total graphics market.

The main purpose of this article was to verify that the performance of the AGP-based GeForce 6600 GT aligned with its PCIe cousin. We ran a subset of our 3D GameGauge test suite, focusing on the three most fill-rate-bound games: Doom 3, Far Cry, and Unreal Tournament 2004 (UT2004).

We also tested with Futuremark's 3DMark05 and Massive's AquaMark3 as additional data points.

For this round, we tested only at a resolution of 1280x1024 with both 2X AA and 4X AF enabled.

We've been quite impressed with the gaming performance from Athlon 64-based systems, and so our current AGP reference test system has the following components:

CPU: AMD Athlon 64 3400+
Motherboard: ASUS K8V (VIA K8T800 chipset)
Memory: 1GB 800MHz DDR SDRAM
Audio: Sound Blaster Audigy 2
Hard drive: Maxtor 250GB S-ATA
Optical drive: Plextor PX-504UF
Display: ViewSonic P220f
Operating system: Windows XP Pro with SP2 and all updates installed
DirectX version: 9.0c

For comparison purposes, we pitted the AGP-based GeForce 6600 GT against its PCIe counterpart. Unfortunately, at press time ATI didn't yet have an AGP-based Radeon X700 card, the logical competitor to the 6600 GT, so that cat-fight will have to wait for another day. We tested both 6600 GT cards using the newest ForceWare 66.93 drivers. When testing the PCIe card, we used an Intel-based system with the following load-out:

CPU: 3.4GHz Pentium 4 Extreme Edition
Motherboard: Intel D925XCV (Intel 925X chipset)
Memory: 1GB 533MHz DDR2 SDRAM
Audio: Sound Blaster Audigy 2 ZS
Hard drive: Maxtor 250GB S-ATA
Optical drive: Plextor PX-504UF
Display: ViewSonic P220f
Operating system: Windows XP Pro with SP2 and all updates installed
DirectX version: 9.0c

Despite the ostensible disparity between these two systems, the AGP card is actually somewhat favored in this matchup because of the Athlon64's stronger gaming performance. And because we're testing only at 1280x1024 with 2X AA/4X AF, and focusing our attention on fill-rate-intensive tests, the system CPU shouldn't have much effect on the actual test results.

AquaMark3 Frame Rate

The PCIe 6600 GT holds a slight 6% performance lead, and that margin is likely due to the PCIe part having an 11% memory clock advantage.

AquaMark3 Triangle Rate

Here again, we see a slight performance difference, but it's nothing to write home about.

3DMark05 is a synthetic benchmark that tests DX9 performance using three "game tests," which are scenes designed to look like actual games being played.

Here we see the PCIe version ahead by about 14%, due in part to the PCIe part's memory clock advantage. These are both 128MB cards, so the other factor may be some amount of texture-memory churn, where the cards have to page data in from system memory; in that case, the PCIe-based card would have a decided advantage.

Doom 3

Here we encountered our one surprise: the PCIe version was considerably faster than its AGP counterpart, as in 43% faster. When we re-ran the tests with both AA and AF disabled, the cards ran even. With only AF enabled, they still ran even. But when we added 2X AA to the mix, the AGP card took a big hit, whereas the PCIe card was barely fazed. Stranger still, other games like Far Cry didn't exhibit this behavior, so it's a specific issue involving AA, Doom 3, and the AGP implementation of the 6600 GT.

We know what the culprits are here (AA and AGP), but what we don't know is why. And at this point, neither does nVidia. At press time, both we and they were investigating, and we'll have an update when they (or we) get more insight as to what's causing this.

Far Cry

The PCIe card is again holding a slight advantage here, but one that barely warrants a mention. It looks like the two cards are running even.

Unreal Tournament 2004

Here we see more of the same. The PCIe version is clinging to a narrow lead, but the two cards are for all intents and purposes running even.

So what we have here is a case of no news being good news. There's nothing earth-shattering to report, but that lack of news is good news for gamers with AGP-based systems ready to make their next upgrade. You can get nearly all of the 6600 GT goodness in an AGP package and leave very little on the table. The only downside to the AGP version is a small memory clock deficit, and some industrious add-in card maker might decide to be more aggressive with its memory selection, evening things up there as well. As our tests show, that memory clock delta doesn't cause large performance swings, though the PCIe card does gain a slight edge from it.

There's still a bit of mystery surrounding the AGP version and Doom 3 with AA enabled, and once we get that unraveled, we'll let you know what was causing it. For now, the big unanswered question is, when will we see a Radeon X700 come to the market that's AGP-friendly? At this point, ATI is staying mum, but conventional wisdom says that if the company is capable, expect an AGP version of the X700 in time for the holidays. For nVidia's part, AGP cards based on the 6600 GT are available today direct from BFG Technology, www.evga.com, and XFX Technologies. They're also available through retailers Buy.com, New Egg, and Tiger Direct. An AGP-based 6600 GT card will serve you well, but we're going to hold off making a definitive recommendation until we've had a chance to do a head-to-head comparison with an AGP-based X700 card. Stay tuned.

Product:
GeForce 6600 GT AGP
Company:
Pros:
Just about all the performance of the PCIe version.
Cons:
Memory clock a bit slower on the AGP version.
Summary:
Finally, a GeForce 6600 GT that doesn't require a motherboard upgrade.
Price:
Around $200 street for a 128MB board.
Score:
Dave came to have his insatiable tech jones by way of music, and because his parents wouldn't let him run away to join the circus. After a brief and ill-fated career in professional wrestling, Dave now covers audio, HDTV, and 3D graphics technologies at ExtremeTech.

Dave came to ExtremeTech as its first hire from Computer Gaming World, where he was Technical Director and Lead (okay, the only) Saxophonist for five years. While there, he and Loyd Case pioneered the area of testing 3D graphics using PC games. This culminated in 3D GameGauge, a suite of OpenGL and Direct3D game demo loops that CGW and other Ziff-Davis publications, such as PC Magazine, still use.

Dave has also helped guide Ziff-Davis benchmark development over the years, particularly on 3D WinBench and Audio WinBench. Before coming to CGW, Dave worked for three years as a project leader at ZD Labs (now eTesting Labs), testing a wide variety of products, ranging from sound cards to servers and everything in between. He also developed both subjective and objective multimedia test methodologies, focusing on audio and digital video. Before all that, he toured with a blues band for two years; notable gigs included opening for Mitch Ryder and appearing at the Detroit Blues Festival.