Final Words

ATI is the performance leader when we're talking about Source Engine performance at the high end. Unfortunately, we didn't have an X800 XT to compare with our 6800 Ultra, nor did we have our 6800 (non-Ultra), which we predict would have fallen between the 9800 XT and the X800 Pro (closer to the latter) in our tests. Apparently, some of our graphics cards decided to go on vacation this week to visit a penguin. When it comes to the upper midrange, NVIDIA's 6800 GT seems to have a leg up on the X800 Pro in most tests (though this might have changed if we could have gotten our Pro to run at 20x15). This is just further proof that, so far, the GT offers some of the best value in NVIDIA's lineup.

Overall, the framerates we saw in these tests were higher than we expected. Doom 3 will bring just about anything to its knees at its highest settings, and 2048x1536 wasn't even an option on its list. We still expect to see very high framerates in Source even after gameplay elements (and the extra CPU load they bring) are introduced into the mix. This follows the traditional view (one that id Software broke from with Doom 3) that higher resolutions and higher framerates are always the better option. Certainly, those qualities have their place, but id has proven they aren't the be-all and end-all of graphics engine design. This fundamental difference in viewpoint helps explain our initial impressions of each game: Source can look incredibly crisp running at a steady framerate at 20x15, and Doom 3 can look incredibly frightening at 10x7 with its intense shadows, atmosphere and lighting effects, and well-executed low-contrast edges between overlapping objects.

We will absolutely still have to wait for Half-Life 2 itself before we can make any further judgment calls about the relative merits of the engine. Obviously, the outcome of our tests reveals that even when Source is pushing as hard as it can against a graphics card, modern hardware doesn't have any major trouble rendering its scenes.

From our brief look at CPU scaling, we can see that none of our tests were really CPU bound. This tells us that we were pushing our graphics hardware as hard as possible. We can also expect Valve to use as much of that CPU headroom as it can for other things in the actual game, which is why we haven't taken as in-depth a look at CPU scaling yet.
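
For readers curious how we draw that line, the reasoning is simple: a test is GPU bound when the framerate falls off as resolution rises, and CPU bound when it stays roughly flat no matter how many pixels we throw at the card. The short Python sketch below illustrates that check; the function name, threshold, and sample numbers are hypothetical illustrations of ours, not part of our benchmark harness.

    # A minimal sketch of the GPU-bound vs. CPU-bound check described above.
    def is_gpu_bound(fps_by_resolution, threshold=0.15):
        """Return True if the framerate scales meaningfully with resolution.

        fps_by_resolution: dict mapping (width, height) tuples to average FPS.
        threshold: relative FPS drop from the lowest to the highest resolution
                   that we treat as evidence of a GPU bottleneck.
        """
        # Sort runs by pixel count so we compare the least and most demanding settings.
        ordered = sorted(fps_by_resolution.items(), key=lambda kv: kv[0][0] * kv[0][1])
        low_res_fps = ordered[0][1]
        high_res_fps = ordered[-1][1]
        relative_drop = (low_res_fps - high_res_fps) / low_res_fps
        return relative_drop >= threshold

    # Hypothetical numbers: FPS falls off sharply at 1600x1200, so the graphics
    # card, not the CPU, is the limiting factor in this imaginary run.
    sample = {(1024, 768): 142.0, (1280, 1024): 118.0, (1600, 1200): 84.0}
    print("GPU bound:", is_gpu_bound(sample))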

We hope our coverage of Valve's latest beta release has been informative, and if there is anything further anyone would like to explore, please feel free to drop us a comment and let us know.

50 Comments

  • Ballistics - Sunday, September 5, 2004 - link

    I guess I was just expecting more from AnandTech, that's all.

    Tired of buyer's guides that are biased towards certain manufacturers. For example, the FX5900XT was released in December 2003 and has proven itself to be an awesome price/performance card. It was a sub-$200 card that ran with the $300+ cards of the time. Yet ATI, ATI, ATI was all that was touted. How come nobody picked up on that little powerhouse of a card that finally gave NVIDIA fans something to get excited about? There weren't even any good articles on that card until February!?!

    I used to come to AnandTech with confidence that what I was reading was the newest "unbiased" hardware analysis. I can't do that anymore. That's all.
  • AtaStrumf - Saturday, August 28, 2004 - link

    False alarm :(
  • suryad - Saturday, August 28, 2004 - link

    Hey guys, forgive my lack of knowledge, but does anyone know if the final product, Half-Life 2 when released that is, will have the capability to use SSE/3DNow! instructions on the CPU? Would it also take advantage of Hyper-Threading? That would probably lead to higher framerates, wouldn't it? I raise this question because I read at the S.T.A.L.K.E.R. game website that the X-Ray engine is supposedly capable of taking advantage of not only the GPU (duh) but also the type of CPU the computer has. Any comments? Thanks guys.

    As for the article, I am kind of leaning towards the people who say that the games should be benched on high-end, midrange, and lower-end spec machines; I agree completely, since it would give a better idea to most people. But since this is just a beta, and I am sure most people are interested in knowing about HL2 instead of CS, IMHO I can't really blame AT for not making the article the way people want it. I am sure when they get the final released copy of HL2, all our questions will be answered. Thanks everyone.
  • AtaStrumf - Saturday, August 28, 2004 - link

    It appears that Gabe said that HL2 will be going gold on Monday, August 30th.

    http://www.hl2fallout.com/forums/index.php?showtop...
  • robbase29a - Saturday, August 28, 2004 - link

    One more thing, if I may... Some of you guys have it all wrong. Yes, AT is a hardware site... that's a given. But something that (some of) you people aren't getting is that hardware doesn't just stand alone. People don't just buy the newest NVIDIA card because of its awesome architecture. Nor do people buy $800 CPUs because of their sweet pipelines, right? Hardware is used to run software.... duh.

    So, what I'm getting at is this: AT is using this game to glean information about all the available hardware there is. CPUs, graphics cards, and maybe RAM too. You need to know how your current card measures up before you upgrade, right? That is why AT is going to write a comprehensive review of ALL (or it may be safer to say most) of the graphics cards out there (CPUs too), not just the new ones. So let's stop this silly griping and wait for them to do their thing. Go AT! - Message posted with good intentions, not to hurt anyone's feelings.
  • Tobyus - Friday, August 27, 2004 - link

    Ok, thanks Derek. That probably explains the difference. I am just amazed that my $170 processor can outperform an overclocked $800 processor. I guess the optimizations really make a big difference.
  • Phiro - Friday, August 27, 2004 - link

    Jalf said it all :)
  • Jalf - Friday, August 27, 2004 - link

    Well, as someone said earlier, AT happens to be a hardware site, not a gaming site. It's a lot more relevant for them to benchmark new cards in an interesting game than 3-year-old ones.

    Presumably, they're more interested in which of the new cards actually performs best than in how well old hardware can handle HL2.

    Makes sense to me, and doesn't bother me the least. I read AT to learn about new hardware, and to know what I should upgrade to, not to find out whether my current system can run games. I use actual game reviews for that.
  • Gugax - Friday, August 27, 2004 - link

    We all need to chill out a little bit. I am sure AT will do a complete review as soon as they have a final HL2 copy to benchmark.
    My suggestion would be a bang-for-the-buck report, better still if it includes combined Doom 3 and HL2 results.
    Most people prefer one of the two games, but sooner or later everybody will play games based on both engines, and if they are using these benchmarks to help them decide the best way to go, that should help them.
    And yes, not everybody is lucky enough to have an $800 CPU... (although you used it to take the CPU out of the equation, I know). :)
  • robbase29a - Friday, August 27, 2004 - link

    I think everybody needs to chill about AT not including the midrange video cards. I would also have liked to see them, but we have to keep in mind that this is NOT a real game. This is just a preliminary test of a test world. I'm sure AT will come out with a full-blown (midrange included) review when the real thing comes out. If everyone just exercised a little bit of patience, we wouldn't have such hotheads floating around.
