Friday, July 18, 2008
VIA C7-D Gaming? Why The Heck Not.
The VIA VB7001 + C7-D @ 1.5GHz:
The Sparkle PCI8500GT, as rare as they get:
Introduction: A while ago I decided to build a small mini-ITX system for use as a secondary home computer for my parents. We had recently purchased a Sharp 32" LCD TV, and I thought hooking the computer up to the TV for living-room internet browsing would be neat. Once I started exploring the rather fascinating world of mITX systems, I realized that a whole breed of video cards exists tailored for use in such systems. Obviously, I am talking about PCI bus cards, that breed many thought went extinct years ago with the advent of AGP.
As often happens, the many were wrong: not only does the PCI video card live on, there are even DirectX 10.0 cards for the aging bus. These, as of today, are the PowerColor/Diamond/Visiontek HD2400Pro and the Sparkle 8500GT, with Albatron recently announcing the 8400GS, 8500GT and, surprisingly, the 8600GT for the PCI bus. I spent some time (the card was extremely hard to find) and some money (90 USD), and got the Sparkle 8500GT via internet order from Canada. I bought a VB7001 motherboard off eBay and completed the rest with local purchases. While researching the topic of PCI bus video cards, I realized that although many people are interested in the subject (as the techPowerUp forum thread aptly titled "So you only have PCI slots and want to game?" indicates), information is scarce. I decided that a benchmarking session of the VIA C7-D, and most importantly the Sparkle 8500GT, was in order.
I stocked up on beer, snacks, an old CRT monitor (for resolution flexibility) and set to work on what is, to my knowledge, the one and only VIA C7-D + PCI8500GT benchmark on the web.
NOTE: The performance of the PCI bus 8500GT shown here often suffers from a CPU bottleneck. On a more powerful system, such as a Pentium 4, these results are likely to improve, but I would not hazard a guess as to by how much.
The Test Setup:
CPU: VIA C7-D (Esther) at 1.5GHz.
Motherboard: VIA VB7001 mini-ITX Motherboard.
Video Card: Sparkle 8500GT PCI, 256MB DDR2, 128-bit memory bus.
Memory: 1GB Kingston DDR2-667 at 533MHz CL4 (the board does not support 667MHz).
Hard Drive: Seagate ST3500320AS 500GB HDD, 32MB cache.
DVD-RW: Plextor PX-608CU Ultra-Slim external DVD-RW drive.
Wireless: Edimax USB High-Gain wireless adapter.
Wireless Keyboard: Scorpius-P20MT w/Trackball.
Power Supply: HEC 250SRT Flex ATX (250W).
Case: Custom mini-ITX case (Homebuilt).
Case Cooling: Thermaltake Thunderblade 120mm blue LED fan.
Monitor: Daewoo 19" CRT.
Overclocking:
Note: While most reviews leave overclocking for last, a lack of free time meant I could not run every test both at stock and overclocked speeds, so I decided to focus on the overclocked tests.
Overclocking the VIA board is an elusive business. In short, the VB7001 board has no overclocking support whatsoever, and I was loath to get into BIOS editing on this board. Thus, the C7-D was left at its stock clock of 1.5GHz. The Sparkle 8500GT proved far more docile: ATITool allowed me to bring the core clock up to 719MHz (from a stock speed of 459MHz, a 56% increase!) with a maximum load temperature of 70°C. With an hour of artifact testing backing that up, I decided to move on to the memory. The memory was far less generous, with the DDR2 used on this card giving me only a modest 52MHz increase, from 400MHz to 452MHz (a 13% overclock). After another 26 minutes of artifact searching, I decided this was as far as I would attempt to go. With my overclocking options pretty much exhausted, I moved on to the benchmarks.
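For those who like to double-check the math, here is a quick illustrative sketch (in Python, purely for convenience) that recomputes the overclock percentages from the stock and overclocked clocks ATITool reported:

```python
# Recompute the overclock percentages from the stock and overclocked clocks.
def oc_gain(stock_mhz, oc_mhz):
    """Percentage increase of the overclocked speed over the stock speed."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(f"Core:   459 -> 719 MHz = +{oc_gain(459, 719):.1f}%")  # ~56.6%, the ~56% quoted above
print(f"Memory: 400 -> 452 MHz = +{oc_gain(400, 452):.1f}%")  # 13.0%
```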
The overclocked system running Super Pi at a very leisurely pace:
Stress Testing:
More of it:
3DMark06:
As a rule I do not put much stock in synthetic benchmarks like 3DMark and its equivalents, but in this case it is quite interesting to see what the result is. The test was run with the basic, free version of 3DMark06 at a resolution of 1280x1024. The framerate averaged around 4 FPS during the GPU test scenes, while the CPU test was absolutely horrible, with framerates well below 1 frame per second (and closer to one frame per minute!), resulting in a painful-to-watch slideshow. Clearly, 3DMark06 is far more than this system can manage, but that is to be expected. The final result surprised me, however, since the system managed 1,038 marks. While a low score by any measure, this gave me a little hope, since a little research shows a stock Athlon X2 3600+ with an overclocked 8500GT (a PCI-E DDR3 model) scores around 2,700 marks.
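To put that number in perspective, here is a trivial back-of-the-envelope comparison; both scores are the figures mentioned above, and the Athlon number is only an approximate one found during my research:

```python
# Rough comparison of the two 3DMark06 scores mentioned above.
c7_score = 1038       # VIA C7-D 1.5GHz + overclocked PCI 8500GT (this system)
athlon_score = 2700   # approx. stock Athlon X2 3600+ + PCI-E DDR3 8500GT

print(f"This system scores {c7_score / athlon_score:.0%} of the Athlon X2 system's result")
# -> roughly 38%, despite the far slower CPU and the PCI bus
```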
Gaming:
First, I have to note that I ran into major problems trying to run games. Many simply refused to start, a list that includes Homeworld 2, Rome: Total War, Hellgate: London, SpellForce 2 and Doom 3. I do not know why these games would not start, but I can only assume some incompatibility with the CPU, chipset, GPU drivers or the PCI-to-PCI-E bridge used by the PCI8500GT.
Mechwarrior 4: Mercenaries:
This game is old and serves as the starting point of our benchmarks. It was quite playable all the way up to 4xAA at 1280x1024 (forced via the nVidia control panel) on maximum settings, and anything less would have been a disaster, since the game was released in 2002. I played through the Jungle Assault Class Match mission at the different tested resolutions. As the game does not support higher resolutions, those were not tested.
Nexus: The Jupiter Incident:
This game is a bit more "modern", having been released on November 5th, 2004. Despite being marred by a myriad of bugs, the game offered engaging (even if, at times, extremely difficult) space combat and beautiful graphics. It is one of the games that have aged really well: even today, at the tested resolutions, it looked visually appealing. I played through the second mission and recorded the results with FRAPS.
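A side note on methodology: FRAPS can also dump raw per-frame logs, and for readers who want to post-process those themselves, here is a minimal sketch of pulling an average FPS figure out of a FRAPS-style frametimes CSV. The file name and exact column layout below are assumptions, so adjust them to whatever your FRAPS version actually writes:

```python
# Sketch: summarise a FRAPS-style frametimes log (cumulative milliseconds per frame).
import csv

def summarise(path):
    """Return frame count, elapsed seconds and average FPS from a frametimes log."""
    times_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                      # skip the header row
        for row in reader:
            if row:
                times_ms.append(float(row[1]))  # cumulative time of each frame, in ms
    frames = len(times_ms)
    seconds = (times_ms[-1] - times_ms[0]) / 1000.0
    avg_fps = (frames - 1) / seconds if seconds > 0 else 0.0
    return frames, seconds, avg_fps

frames, seconds, avg_fps = summarise("frametimes.csv")  # hypothetical file name
print(f"{frames} frames over {seconds:.1f}s -> {avg_fps:.1f} FPS average")
```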
F.E.A.R:
This first-person shooter, released on October 17th, 2005, was one of the most graphically intensive games of its time. It is the first real challenge for our test setup, since the CPU falls below the system requirements (which call for a 1.7GHz Pentium 4 or an equivalent). The benchmarking was done via the game's built-in test scene. On minimum settings, the game is playable at all tested resolutions, up to 1600x1200, but the visual appeal of running F.E.A.R like that is very low. On medium settings, the game is quite playable at 1024x768. With all settings maxed out (minus the expensive ones such as volumetric lighting and soft shadows), the game is playable at 800x600 and 1024x768, although the latter can get choppy in heavy action scenes. In all cases the CPU-related settings were kept at minimum so as not to stress the CPU, which spent most of the benchmark session at 100% load in any case.
While the results from running F.E.A.R aren't amazing, they are quite impressive for a system way below the minimum requirements for the game. I am sure that with a more powerful CPU, even this PCI video card would be able to pull off more impressive results.
Minimum settings, all extra options off:
No high-end features on, all GPU settings on medium, all CPU settings on low:
Auto-detected settings, GPU settings maxed out but with volumetric lighting and soft shadows off, and with 2x anisotropic filtering:
You can see that the performance differences are next to non-existent, probably due to the extremely weak CPU.
Oblivion:
Released in 2006, Oblivion was a very demanding game graphics-wise, even if one that has aged less than gracefully, looking very dated in 2008. As we shall shortly see, the VIA C7 is completely unable to cope with this game, spending the entire benchmarking session at full load. The GPU itself should have been able to achieve playable framerates with a more powerful CPU, since I played Oblivion on a 3GHz Pentium 4 with a GeForce 6600 (non-GT) back in the day.
To record FPS information, I played repeatedly through the initial dungeon. Since the framerate was extremely low, I decided not to do outdoor benchmarks, as they would have been quite pointless at this point.
Everything maxed out, no soft shadows and no volumetric lighting, HDR enabled, indoors:
Default, auto-detected settings, bloom lighting, all settings at medium, indoors:
Lowest settings, indoors:
Call of Duty 4:
If you are going to ask why I bothered: why not? I had the game, I had the computer and I had a spare 30 minutes. Playing the game was possible, but not much fun. Still, it is impressive that it runs at all on a system this weak. The FPS recording was done on the "Crew Expendable" (cargo ship) mission, which was replayed in its entirety for each FPS measurement.
Bilinear filtering, automatic texture settings, all other settings on normal, all high-performance features off:
Again, you can see that the different graphics settings made next to no difference in the actual frame rate.
Crysis:
If at this point you think I am absolutely insane, you are probably right. Running the world's most demanding game on the performance equivalent of a Pentium III has to border on absolute madness, but I did it anyway. Not only did I do it, but it ran, and it was even playable in the early game, with the slowdowns usually caused by the CPU processing sound. Quite frankly, I was more than amazed; I barely expected the game to start, much less run! I played through the beginning of the game, up to clearing the jamming station on the beach after landing, and repeated a benchmark with a savegame from the Onslaught chapter (the tank battle). The latter was not playable, but that is beside the point: the world's most demanding game runs on the C7 and PCI8500GT combo.
All settings low:
I decided to repeat the measurements, this time at a resolution of 848x480, which puts fewer pixels on the screen than 800x600. The results did not change by any noticeable margin. I also have to add that in the early game, things feel far smoother than the 16-17 FPS average FRAPS records, probably because the slowdowns occur when there is little action and are therefore not felt.
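For reference, the pixel-count arithmetic behind that claim is trivial:

```python
# Pixel counts of the two resolutions compared above.
wide = 848 * 480   # 407,040 pixels
std  = 800 * 600   # 480,000 pixels

print(f"848x480 renders {std - wide:,} fewer pixels ({1 - wide / std:.0%} less) than 800x600")
```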
All settings low, 848x480:
Conclusions:
Well, there isn't much to conclude. Mini-ITX systems based on processors like the VIA C7 are not meant for gaming, but as I have shown today, they can do it, even if poorly. Also, the PCI8500GT is indeed a viable option for those of us stuck with only PCI slots (though I still recommend a motherboard upgrade, since PCI video cards are way too costly for the performance they offer), and its performance should be higher when not held back by a CPU this weak (again, I cannot predict by how much). I wouldn't recommend this setup to anyone, since it makes so little sense, but it does look cool, which was the entire point.
Thank you for reading.