Concluding their Gamescom festivities for the newly introduced GeForce RTX 20-series, NVIDIA has revealed a bit more about the hardware, its features, and its expected performance this evening. Tonight NVIDIA is announcing new Ansel RTX features in GeForce Experience, as well as some game performance metrics for the GeForce RTX 2080 measured against the GeForce GTX 1080. Following recent hands-on demos featuring real-time raytracing, NVIDIA is now offering some numbers for out-of-the-box and Deep Learning Super Sampling (DLSS) performance in traditionally rendered games.

NVIDIA RTX Support for Games
As of August 20, 2018
Game | Real-Time Raytracing | Deep Learning Super Sampling (DLSS)
Ark: Survival Evolved | - | Yes
Assetto Corsa Competizione | Yes | -
Atomic Heart | Yes | Yes
Battlefield V | Yes | -
Control | Yes | -
Dauntless | - | Yes
Enlisted | Yes | -
Final Fantasy XV | - | Yes
Fractured Lands | - | Yes
Hitman 2 | - | Yes
Islands of Nyne | - | Yes
Justice | Yes | Yes
JX3 | Yes | Yes
MechWarrior 5: Mercenaries | Yes | Yes
Metro Exodus | Yes | -
PlayerUnknown's Battlegrounds | - | Yes
ProjectDH | Yes | -
Remnant: From the Ashes | - | Yes
Serious Sam 4: Planet Badass | - | Yes
Shadow of the Tomb Raider | Yes | -
The Forge Arena | - | Yes
We Happy Few | - | Yes

Starting with NVIDIA’s DLSS – and real-time raytracing, for that matter – the supported games list is already known; what NVIDIA is disclosing today are some face-value 4K performance comparisons and results. For DLSS, for now we can only say that it uses tensor core-accelerated neural network inferencing to produce what NVIDIA claims will be high-quality, super sampling-like anti-aliasing. As for further technical detail, this is a project NVIDIA has been working on for some time, and the company has published blogs and papers describing some of the processes involved. At any rate, the provided metrics are sparse on settings and details, and notably several of the measured games were rendered in HDR (though HDR shouldn't have a performance impact).
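NVIDIA hasn't detailed the DLSS network itself, but the general shape of inference-based upscaling is straightforward to sketch: render the frame at a lower resolution, then run a trained convolutional network to reconstruct a higher-resolution output, with the matrix math being exactly the kind of work the tensor cores accelerate. The toy PyTorch model below is purely illustrative and hypothetical – the architecture, sizes, and untrained weights are not NVIDIA's – and only shows the class of operation involved.

```python
# Purely illustrative sketch of inference-based upscaling; this is NOT the
# DLSS network. Architecture, sizes, and (untrained) weights are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyUpscaler(nn.Module):
    """Tiny 2x super-resolution CNN standing in for a DLSS-style network."""
    def __init__(self, channels: int = 3, features: int = 16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True),
            # PixelShuffle rearranges channels*4 feature maps into a 2x larger image.
            nn.Conv2d(features, channels * 4, 3, padding=1), nn.PixelShuffle(2),
        )

    def forward(self, low_res_frame: torch.Tensor) -> torch.Tensor:
        # Predict a residual over a plain bilinear upscale, so the network only
        # has to learn the detail that naive interpolation misses.
        base = F.interpolate(low_res_frame, scale_factor=2, mode="bilinear",
                             align_corners=False)
        return base + self.body(low_res_frame)

# Stand-in for a frame rendered at 1920x1080 instead of native 3840x2160 (4K).
low_res = torch.rand(1, 3, 1080, 1920)
model = ToyUpscaler().eval()
with torch.no_grad():
    upscaled = model(low_res)
print(upscaled.shape)  # torch.Size([1, 3, 2160, 3840])
```

In a real implementation the network would be trained against very high quality, heavily supersampled reference frames, which is how the result can approach supersampling-like quality at a fraction of the shading cost.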

Otherwise, NVIDIA presented a non-interactive 4K Epic Infiltrator demo, later shown on the show floor, comparing Temporal Anti-Aliasing (TAA) to DLSS, where the latter provided near-identical or better image quality on average at a lower performance cost, which in this case translates directly into higher framerates. To be perfectly honest, I spent the entire floor time talking with NVIDIA engineers and driver/software developers, so I have no pictures of the floor demo (not that anything less than a direct screenshot would really do it justice). Ultimately, DLSS is a nuanced matter and there isn’t much we can add at the moment.

Overall, the claim is that even in traditionally rasterized games without DLSS, the GeForce RTX 2080 delivers around 50% higher performance than the GeForce GTX 1080 under 4K HDR 60Hz conditions. Because this excludes real-time raytracing and DLSS, it is tantamount to ‘out of the box’ performance. That said, no graphics settings or driver details accompanied the disclosed framerates, so I wouldn't suggest reading into these numbers and bar charts one way or another.
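For a sense of scale, and using purely hypothetical numbers, a 50% uplift is roughly what it would take to lift a title running at around 40 fps in 4K on a GTX 1080 up to the 60 fps mark, which is presumably why NVIDIA frames the comparison around 4K 60Hz:

```python
# Hypothetical illustration of what a ~50% uplift means against a 60 Hz target.
# The 40 fps baseline is an assumed example, not a figure NVIDIA disclosed.
gtx_1080_fps = 40.0
claimed_uplift = 1.5                          # "around 50% higher performance"
rtx_2080_fps = gtx_1080_fps * claimed_uplift  # 60 fps
frame_time_before = 1000.0 / gtx_1080_fps     # 25.0 ms
frame_time_after = 1000.0 / rtx_2080_fps      # ~16.7 ms, the 60 Hz frame budget
print(f"{gtx_1080_fps:.0f} fps ({frame_time_before:.1f} ms) -> "
      f"{rtx_2080_fps:.0f} fps ({frame_time_after:.1f} ms)")
```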

Lastly, NVIDIA announced several new features, filters, and supported games for GeForce Experience’s Ansel screenshot tool. Relating to GeForce RTX, one of the additions is Ansel RT for supported ray-traced games, where a screenshot can be captured with a very high number of rays, unsuitable for real-time rendering but not an issue for a static image.

Ansel RTX also leverages a concept similar to tensor core-accelerated DLSS with its ‘AI Up-Res’ super resolution, which works even for games not integrated with the Ansel SDK.

In terms of GeForce RTX performance, this is more or less a teaser of things to come. But as always with unreleased hardware, judgement should be reserved until objective measurements and further details are available. We will have much more to say when the time comes.


  • Yojimbo - Saturday, August 25, 2018

    "I doubt that. Where did you get 35%?"

    https://s22.q4cdn.com/364334381/files/doc_financia...

    1157/3123 = 37%. What the operating margins are of a particular product, I have no idea. It's probably a bit higher than average for the Ti parts, but it's tough to say because the data center (Tesla) parts probably have margins much higher than average. But these are just rough numbers to show you that what you said is way off base.

    "I don't buy that at $600 2080 would be barely breaking even. I really doubt it."

    That's not what I said. I said that if you take $200 off the price of the 2080 Ti GPU (I wasn't talking about the 2080, although taking $100 off the 2080 would do something similar; it just wasn't the calculation I made), they are probably approaching break even. Let's clarify something: NVIDIA sells GPUs, not graphics cards. When you buy an MSI RTX 2080 Ti for $1000, a cut of that goes to the retailer. I don't know what their margins are, but let's say they bought the card for $900. Now MSI had to buy all the RAM and the PCB and the GPU and all the other components to put it together, plus they need to make a margin as well to make it worth it. Perhaps it costs them $800 to make the card and $100 is their cut. Out of that $800, they need to pay for the assembly line and workers and the components, including the GPU. So I think $600 for the TU102 GPU was a very high estimate, since it doesn't leave very much for all the other stuff. That $600 is what NVIDIA is getting. If you take $200 off of that then you leave them with $400. If NVIDIA is selling the part at about 1/3 operating margin, then 2/3 of the money they receive for it is going toward expenses, and 2/3 of $600 is $400. So that $200 you took off was surely almost all of their operating profit; the arithmetic is sketched out below.

    Now, perhaps you want NVIDIA's AIB partners and the retailers (good luck with the retailers) to share in the margin loss with your price cut. Then you are not just squeezing NVIDIA's profits you are squeezing the others' profits too. Maybe you can get Micron to take less for the GDDR6 DRAM chips...

    My point is this: NVIDIA projects upcoming margins to be lower than the previous quarter's margins. So it doesn't appear like these new cards are priced in a way to give NVIDIA richer margins than the 10 series cards. That suggests that the higher prices are accounted for by greater costs to manufacture.
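
    Spelled out with the hypothetical figures above (retail price, retailer's cut, AIB build cost, GPU price, and a roughly one-third operating margin), the back-of-the-envelope math looks like this; every number is an assumption from the comment, not a disclosed figure.

    ```python
    # Back-of-the-envelope sketch of the hypothetical RTX 2080 Ti cost chain above.
    # All figures are the comment's assumptions, not disclosed numbers.
    retail_price = 1000              # what the buyer pays for an AIB RTX 2080 Ti
    retailer_cost = 900              # assumed price the retailer pays the AIB
    aib_build_cost = 800             # assumed AIB cost: GPU, RAM, PCB, cooler, assembly
    gpu_price = 600                  # assumed price NVIDIA charges the AIB for the GPU
    operating_margin = 1157 / 3123   # ~37% company-wide, per the filing linked above

    retailer_cut = retail_price - retailer_cost       # $100 for the retailer
    aib_cut = retailer_cost - aib_build_cost          # $100 for the AIB partner
    other_components = aib_build_cost - gpu_price     # $200 for everything besides the GPU
    nvidia_cost = gpu_price * (1 - operating_margin)  # ~$378 (the comment rounds to $400)
    nvidia_profit = gpu_price - nvidia_cost           # ~$222 (the comment rounds to $200)

    print(f"Retailer cut: ${retailer_cut}, AIB cut: ${aib_cut}")
    print(f"Non-GPU component and assembly budget: ${other_components}")
    print(f"NVIDIA estimated cost: ${nvidia_cost:.0f}, profit: ${nvidia_profit:.0f} per GPU")
    # Under these assumptions, a $200 cut to the GPU price would consume nearly
    # all of that per-GPU operating profit, which is the comment's point.
    ```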
  • eddman - Saturday, August 25, 2018

    "So it doesn't appear like these new cards are priced in a way to give NVIDIA richer margins than the 10 series cards."

    Sigh, for the 4th or 5th time, I never said that's the case. 2080s are almost certainly making less profit than 1080s, but I do not believe for a second that lower prices would still not have made them a sizable profit.

    You can't calculate a card's profit margin based on the entire company's profit numbers. There is no way each 2080 Ti costs AIBs $800 to make. I very, very much doubt that. As I've mentioned before, businesses always leave enough room for unexpected price cuts so that they'd still make an acceptable profit.
  • Yojimbo - Sunday, August 26, 2018

    "Sigh, for the 4th or 5th time, I never said that's the case. 2080s are almost certainly making less profit than 1080s, but I do not believe for a second that lower prices would still not have made them a sizable profit."

    Then what are we arguing about? The price increases are justified by the greater cost of the cards if they are still making less profit off them even with the increase. You're just making a normative statement that "NVIDIA should be making less profit altogether".

    "You can't calculate a card's profit margin based on the entire company's profit numbers."

    You're right, but it's the best that we have. We can guesstimate.

    "There is no way each 2080 Ti costs AIBs $800 to make."

    If NVIDIA is charging $600 for the GPU, then that leaves $200 for the other stuff. $200 for the RAM, PCB, voltage regulators, heat sink, labor and assembly line costs, etc., seems exceedingly low. I tried to estimate the price NVIDIA was charging AIBs at the high end of the range because that gives the best chance of your $200 price cut not resulting in a loss for NVIDIA, making your case as strong as possible. If we move the price NVIDIA is charging AIBs down to $500, then that leaves more room for a possible $700 cost to make the cards. But that doesn't help your case.

    "As I've mentioned before, businesses always leave enough room for unexpected price cuts so that they'd still make an acceptable profit."

    No they don't. They pretty much maximize their profits for the expected market conditions while avoiding risky situations that could put them in financial distress. But NVIDIA doesn't really have to worry about that latter part at the moment. The planning of what costs are acceptable happens a lot earlier than bringing the product to market. It's the market that sets the price. Companies try to predict the market conditions and then maximize their profits within those conditions. They end up with margins because maintaining margins is the whole point of the game, not in case there are unexpected price cuts.
  • eddman - Sunday, August 26, 2018

    There is no way nvidia is charging AIBs $600 for a GPU. Where do you even get these numbers from? I bet the entire card costs AIBs no more than $500.

    2080s are overpriced.
  • eva02langley - Thursday, August 23, 2018

    You can buy a 1080 Ti AMP on Amazon for $529.
  • milkod2001 - Thursday, August 23, 2018

    BS, it is $679
  • iwod - Thursday, August 23, 2018

    I assume they price this so they can lower it down to the normal range a year later. Imagine next year you get double the performance of the RTX 2080 with 7nm.

    I haven't been following GPUs closely; what happened to dual GPU configs? Is the software still not up to it?
  • yhselp - Thursday, August 23, 2018

    Judging by your hands-on with real-time ray tracing in games from a couple of days ago, an RTX 2080 Ti struggles to maintain 60fps at 1080p. And even though games and drivers aren't final yet, it's still hard to believe they'd be able to gain much performance. If so, what on earth is NVIDIA on about with these 4K60fps stats, and on a less powerful card no less?

    Is NVIDIA advertising two separate features (ray tracing and 4K) that can work on their own but not together? Are we talking about 2080 being capable of 4K rasterization and ray tracing at, what, 900p? Seems about right if 2080 Ti struggles at 1080p... And what of the 2070 then? Would it be able to run ray tracing in games at all?

    It doesn't seem likely NVIDIA would champion a marquee feature, and name their cards after it, that is such a performance hog that it can only run on a $1000 flagship at 1080p in late 2018.

    Something doesn't add up. Please, confirm, deny, or provide more information.
  • Skiddywinks - Thursday, August 23, 2018

    My impression has been that the struggling performance we've seen with the likes of BFV, Metro, and Tomb Raider was with RT enabled, and the recent numbers straight from nVidia have been without RT and with some unspecified AA (to then compare to DLSS). The 4K numbers are almost without a doubt with RT disabled, otherwise they'd be shouting it out from the rooftops.
  • eva02langley - Thursday, August 23, 2018

    Guess what, they're probably comparing against 16x MSAA or even Ubersampling in order to make their new "equivalent" features look good.

    Unless we see benchmarks, this graph proves absolutely nothing, besides suggesting the 2080 is about 10% faster than a 1080 Ti.
