FIRST LOOK: Gigabyte K8NXP-SLI
by Wesley Fink on November 24, 2004 9:00 PM EST- Posted in
- Motherboards
Our Take
Just when it looked like we'd all be waiting breathlessly for the lone Asus SLI board sometime in December, Gigabyte has delivered a very capable SLI board that will likely be available in the market at about the same time. Our experience with the K8NXP-SLI was extremely positive. The board was exceptionally smooth in performance and trouble-free in our benchmarking with both single and SLI nVidia video cards. We had no issues at all with memory, and the BIOS adjustments were exemplary.

The best way to look at the Gigabyte K8NXP-SLI is to see it as the SLI version of the regular, great-performing K8NXP-9. The board offers the same features, the same BIOS options, the same basic layout, the same performance with a single video card, and the same outstanding overclocking that you will find with the K8NXP-9. We were definitely impressed with the Gigabyte K8NXP-9, and the SLI version is just icing on the cake.
The only problem that you will have is trying to decide which flavor of the Gigabyte nForce4 you want, since the only real difference between the 9 and the SLI is the SLI option. Whichever you choose, we think that you will be very pleased with the features, flexibility, and performance you will find. Gigabyte is definitely back in the Athlon 64 market with the K8NXP-9 and K8NXP-SLI. We won't know exactly where these boards stand in the nF4/SLI hierarchy until some dust settles, but we are very impressed with what we see in both of these Gigabyte boards right now. Both boards are faster than the nForce4 Reference board, and both were a real pleasure to use. The overclocking capabilities of either version will keep any enthusiast happy, and when you consider the possibilities of the SLI version, the potential gets very exciting. The only fly in the ointment is the disappointing range of DIMM voltage adjustments. However, even this may be patchable with something like the OCZ DDR Memory Booster to extend the voltage range.
As we were finishing this review, we also learned that MSI will have both their nForce4 and nForce4 SLI boards available for review in the next week to 10 days. That will provide even more choices - and more competition - for your holiday dollars.
We can certainly recommend either Gigabyte nForce4 motherboard for your system. We don't foresee any surprises on the horizon and both these boards are very capable. One of these boards is going into our memory test bed in the near future. But we have the same problem that you will have - we just can't decide which one!
58 Comments
BenSkywalker - Thursday, November 25, 2004 - link
"To clarify test results, benchmarks are reported in separate graphs for standard results at 1024x768 resolution and enhanced results at 1280x1024. Since 1600x1200 normally requires a 20" or larger flat panel monitor, we did not report 1600x1200 results, since most readers will not run at that resolution."

As pointed out by cnq, not having even 1600x1200 makes your review of SLI worthless. Honestly, stopping short of 1600x1200 is a bit of a joke; who is going to spend $1K on video cards and have some POS display that can't handle a resolution worthy of that kind of cash? SLI testing should START at 1600x1200 and go up to 2048x1536; anything else is pointless. Anyone who purchases dual 6800GTs to run 1280x1024 certainly lacks the capacity to be reading this site, as everyone already knows it is nigh pointless.
You may as well test 320x240, it has about the same level of relevance. 1600x1200(for baseline, low end comparisons), 1920x1440 and 2048x1536 are the settings that people who are honestly thinking about ponying up want to see tested.
"Most any decent 19" CRT can support 1600x1200 as you stated, but have you ever tried to play a game at 1600x1200 on a 19" CRT? I tried it just to see for this review and it was pretty ugly. However, 16x12 was OK on the 22" Diamondtron, though I prefer 1280x1024 on the 19" flat panels for most gaming."
Why is playing a game on a 19" @16x12 difficult at all? If you have a decent 22" Diamondtron try having someone set one input up for you running 1280x1024 4xAA and the other input running 2048x1536 without AA and see what you think is better. I can't even comment about how 2048x1536 with AA would look as the only people that can run that type of setup right now are those with SLI parts in their hands and unfortunately they haven't deemed us worthy of obtaining that type of knowledge.
Wesley Fink - Thursday, November 25, 2004 - link
#11 - The K8NXP v1.0 and the K8NXP-SLI both ran perfectly at 5X HT, which was the reported issue with the nF4 "bug". All stock benchmarks were run with 5X enabled.

Live - Thursday, November 25, 2004 - link
I'll have to agree with Wesley on the resolutions. Most people don't run 1600x1200. SLI as it stands now looks limited at lower resolutions, but will it when more demanding games are released? I think not. So the upgrade option lets me run future games at max quality without buying a new top-of-the-line card. Just focusing on 1600x1200 because you own a 2001FP seems rather silly.
Beenthere - Thursday, November 25, 2004 - link
There never was an issue with the nF4 chipset silicon. The A02 runs at 800 MHz HTT as the Socket 754 was designed to run, and the A03 runs at 1000 MHz HTT as the S939 is designed to do.

The original Rev. 1.X A7N8X mobo was designed to run at 133 MHz FSB, as that's what AMD chipsets used for a FSB at the time it was designed. Later, when AMD was planning to release the XP 3200 with a 200 MHz FSB, the Rev. 2.0 A7N8X was released. There never was any design defect in the A7N8X; it's just a matter of CPU/mobo evolution.
PCIe is just an evolution also. There is no performance advantage to PCIe over AGP unless you run dual graphics cards in SLI mode.
VIAN - Thursday, November 25, 2004 - link
Wesley, where is the 8xAA/16xAF benching? I understand your resolution decision, plus you probably didn't have time to bench it all... but no 8xAA/16xAF? Especially when the 6800 Ultra came to a crawl with 8xAA enabled.
That would've kicked ass.
Googer - Thursday, November 25, 2004 - link
Correction: I am wondering, since I have no interest in running SLI but would like to use an LSI x4 SCSI controller in place of the other graphics card. I know you can run an x1 or x2 device (NIC?) in a slot with a higher number of lanes (x8, x16, x32) and it will function fine.
My question is: because these boards use a semi-proprietary PCI-E setup, is it possible to use the second (x16-sized) unused PCI-E x16 slot for something else while running something (i.e. a graphics processor) in the main x16 slot? I have a strong suspicion that it is OK, based on the fact that TOMSHARDWARE.com ran both ATI and nVIDIA (X800 and 6800 Ultra) on the same motherboard! Please let me know, thanks!
Googer - Thursday, November 25, 2004 - link
#9 Agreed, they are less informed since they like to know as little as they possibly can. Ignorance is bliss. As for you and I, we are much higher up on the totem pole than they. I am wondering, since I have no interest in running SLI but would like to use an LSI x4 SCSI controller in place of the other graphics card. I know you can run an x1 or x2 device in a slot with a higher number of lanes (x8, x16, x32). Is it possible to use the unused PCI-E x16 slot for something else while running something in the main x16 slot? I have a strong suspicion that it is OK, based on the fact that TOMSHARDWARE.com ran both ATI and nVIDIA (X800 and 6800 Ultra) on the same board! Please let me know, thanks!
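As an editorial aside on this question: on a Linux system, the width a PCI-E card has actually negotiated can be read from the "LnkSta" lines that `lspci -vv` (from pciutils) prints for each device. A minimal sketch that extracts the width field, using an illustrative sample line rather than output captured from this board:

```shell
#!/bin/sh
# Extract the negotiated "Width xN" field from lspci-style LnkSta lines.
lnksta_width() {
    grep -o 'Width x[0-9]*'
}

# Illustrative sample line (an assumption, not output from the K8NXP-SLI);
# on a real system you would instead pipe in:  lspci -vv | grep LnkSta
sample='LnkSta: Speed 2.5GT/s, Width x4, TrErr- Train- SlotClk+'
echo "$sample" | lnksta_width   # prints: Width x4
```

If an x4 controller in the second x16 slot reports "Width x4" here, the slot has trained the link down to the card's width, which is the behavior the PCI-E spec requires of a wider slot.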
madnod - Thursday, November 25, 2004 - link
It's a good review and it's a good mobo, but what about the reported bug in the nForce4 chipset? Is it fixed yet or not? If not, then this whole bunch of mobos is going to use the customers as beta testers. I had this situation with the ASUS A7N8X Deluxe Rev 1.3 two years ago, and I still hate Asus for that.

j@cko - Thursday, November 25, 2004 - link
I guess that their brains are too small to absorb all that info...

j@cko - Thursday, November 25, 2004 - link
I don't understand why some people would criticize Anand for including too much info... Those people are morons, period.