After yesterday’s announcement from NVIDIA, we finally know what’s coming: the GeForce RTX 2080 Ti, GeForce RTX 2080, and GeForce RTX 2070. So naturally, after the keynote in the Palladium venue, NVIDIA provided hands-on demos and gameplay as the main event of their public GeForce Gaming Celebration. The demos in question were all powered by the $1200 GeForce RTX 2080 Ti Founders Edition, with obligatory custom watercooling rigs showing off their new gaming flagship.

While NVIDIA also had a presence at Gamescom 2018 itself, this event was their main venue for showcasing the new GeForce RTX cards. In a separate walled-off area, NVIDIA offered press some gameplay time with two GeForce RTX-supporting titles: Shadow of the Tomb Raider and Battlefield V. They also had a veritable army of RTX 2080 Ti-equipped gaming PCs for the public, likewise demoing Battlefield V and Shadow of the Tomb Raider (without RTX features), along with Hitman 2 and Metro: Exodus. Additionally, there were a few driving simulator rigs for Assetto Corsa Competizione, including one with hydraulic feedback. These games, and more, support real-time ray tracing with RTX, but not necessarily Deep Learning Super Sampling (DLSS), another technology that NVIDIA announced.

NVIDIA RTX Support for Games
As of August 20, 2018
Game | Real-Time Raytracing | Deep Learning Super Sampling (DLSS)
Ark: Survival Evolved | No | Yes
Assetto Corsa Competizione | Yes | No
Atomic Heart | Yes | Yes
Battlefield V | Yes | No
Control | Yes | No
Dauntless | No | Yes
Enlisted | Yes | No
Final Fantasy XV | No | Yes
Fractured Lands | No | Yes
Hitman 2 | No | Yes
Islands of Nyne | No | Yes
Justice | Yes | Yes
JX3 | Yes | Yes
MechWarrior 5: Mercenaries | Yes | Yes
Metro Exodus | Yes | No
PlayerUnknown's Battlegrounds | No | Yes
ProjectDH | Yes | No
Remnant: From the Ashes | No | Yes
Serious Sam 4: Planet Badass | No | Yes
Shadow of the Tomb Raider | Yes | No
The Forge Arena | No | Yes
We Happy Few | No | Yes

GeForce RTX 2080 Ti Hands-on: Shadow of the Tomb Raider

Starting with Shadow of the Tomb Raider, I got to play through a platforming puzzle sequence that was amusingly difficult to navigate. I thought I was just bad, but the neighboring gamer fared just as poorly, and we ended up trading tips on each successive obstacle. Poor skills aside, the game was rendered at 1080p and capped at 60fps with the graphics settings locked, but I could definitely notice framedrops, even though the gameplay was rather slow-paced.

The game was rendering an outdoor scene, but at 1080p on a roughly 24” screen, I couldn’t see much of an overall quality improvement. Unfortunately, I didn’t realize until afterward that we had the option of capturing our footage, though honestly I’m glad no one was subjected to a video recording of my gaming incompetence.

Because we only had a limited allotted time, we didn’t get to finish that puzzle sequence, but from a real-time ray tracing perspective, it was hard for me to distinguish any added effects. It appears this opinion was shared widely enough that the Tomb Raider Twitter account issued a clarification.

GeForce RTX 2080 Ti Hands-on: Battlefield V

For Battlefield V, the situation was similar: a 1080p 144Hz monitor, playing on the Rotterdam map over LAN. There were framedrops during fast-paced scenes, and in general the card didn’t seem able to keep up with the game. Again, there was no FPS info available, but the RTX 2080 Ti was almost surely not cranking out a constant 60fps. Here, the real-time ray tracing was quite noticeable, with vivid dynamic reflections in puddles, windows, and the river. Even at 1080p, those features added to the overall image quality, though the ultimate performance cost was unclear. Trading framerate for image quality isn’t a good deal in a fast-paced shooter, though for the record, I’ve always been terrible at shooters (except maybe Halo 2).

While the real-time ray traced in-game trailer is obviously putting the game and RTX in the best light possible, there is visible merit in explosions and lighting being reflected where they should be. This time around, recorded gameplay footage could not be published until a later date, so words are all we have.

Assetto Corsa Competizione, Custom Models, and GeForce RTX 2080 Ti Photo Ops


Venue-goers try out the racecar rig after my turn is up

I also tried out Assetto Corsa Competizione on the rig with hydraulic suspension feedback, the whole setup apparently being worth over 40,000 euros, only to find out what I already knew: I can’t drive a racecar (or anything without an automatic transmission). The game is less intensive than Battlefield V or Shadow of the Tomb Raider, and on that note, I didn’t notice any framedrops as I was half-racing, half-crashing around the track.

At Gamescom proper, there were a few GeForce RTX 20-series AIB cards on display, including ones from EVGA and Palit/Gainward. The Palit/Gainward representative mentioned their custom cards would be due mid-September and had yet to start shipping, an interesting but unsurprising tidbit considering NVIDIA had just announced a firm September 20 launch date.


With real-time raytracing, games will be able to recreate realistic reflections as seen in bad photos like this one...


...or this one

NVIDIA even had a Gamescom booth with just the GeForce RTX 2080 Ti in a glass display stand, meant for photo ops. People got an NVIDIA RTX T-shirt out of it, but it was somewhat amusing to see people line up to take a picture with a graphics card in the middle of a million public gaming demos.


Somehow, I think it would've been more 'normal' to see people take selfies with a graphics card

In any case, I think there are a few relevant takeaways from the hands-on:

  • RTX in terms of real-time ray tracing is still in development, something confirmed by the developers themselves for Shadow of the Tomb Raider and Battlefield V;
  • As presented thus far, RTX in terms of both real-time ray tracing and deep learning super sampling (DLSS) requires developer support, and implementations may vary between titles;
  • As presented thus far, RTX as a technology or a platform is fairly confusing for gamers, because it includes a few different technologies like real-time ray tracing and deep learning super sampling (DLSS), but also provides the namesake for the “GeForce RTX” 20-series and “GeForce RTX” branded games (we will explain all this in detail when the time comes);
  • The demos didn’t clarify apples-to-apples performance differences between the GTX 1080 Ti and RTX 2080 Ti;
  • September 20 is a long time to go without objective third-party analysis.
103 Comments

  • silverblue - Friday, August 24, 2018 - link

    Seconded; I'm very interested in how you combine RT with TBDR when, 20 years ago, it was heresy to even offer up the possibility that TBDR could be paired with hardware T&L.

    I still don't understand why immediate mode rendering is still the go-to for PC and console gaming if deferred rendering is pretty much an improvement in every single way. *shrugs*
    Reply
  • MrPoletski - Friday, August 24, 2018 - link

    Yes, every single way except that it introduces one frame of latency. The ray tracing on the PowerVR units is born from their purchase and integration of the Caustic ray tracing accelerator with their Rogue architecture in the Wizard. NVIDIA is claiming 10 gigarays in 250 watts. One production PowerVR Wizard, in a mobile form factor, was specced at 10 megarays in 2 watts. Scaled up, that would be 12.5 gigarays at 250 watts. This was also from 2015/16 sorta time. God I wish PowerVR would get off their asses and put out another PC GPU. Reply
  • lucam - Friday, August 24, 2018 - link

    Very good point. I hope that PowerVR gets back too Reply
  • D. Lister - Saturday, August 25, 2018 - link

    Learn to use a calculator; "10 megarays in 2 watts" scales up to 1.25 gigarays, not 12.5. Yeesh! Reply
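
    [Ed: the scaling dispute above is easy to sanity-check with a quick back-of-the-envelope script. Linear scaling with power is of course a simplification, and the figures are simply the ones quoted in the comments:]

    ```python
    # Quoted figures: PowerVR Wizard at 10 megarays in 2 W,
    # scaled linearly up to NVIDIA's quoted 250 W envelope.
    base_rays = 10e6      # rays/sec
    base_power = 2.0      # watts
    target_power = 250.0  # watts

    # Linear scaling: multiply ray throughput by the power ratio (125x).
    scaled_rays = base_rays * (target_power / base_power)
    print(scaled_rays / 1e9)  # prints 1.25 (gigarays), not 12.5
    ```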
  • McD - Thursday, August 30, 2018 - link

    The GR6500 was pushing 300 megarays at 3 W theoretical max, though Otoy only got 100 megarays. An iPad Pro (A10X) was showing 30 megarays under Metal 2 at WWDC 2018 with no special hardware.

    Given the best AR solution needs RT and Apple could have bought ImgTec for the Caustic IP, I would say Apple have their own hardware solution. Not sure it’ll be in the A12/X though they’re more likely to leave it for the AR Glasses reveal.
    Reply
  • Yojimbo - Tuesday, August 21, 2018 - link

    The following is my take, which is different from yours.

    I think RTX is more than just accelerated DXR. RTX is a few things. As far as software it is an implementation of DXR. It maps the DXR API to NVIDIA's hardware, whether that hardware be accelerated for ray tracing or not. So, the RTX software stack is still being used to implement DXR on Pascal GPUs.

    RTX is also a hardware designation for a set of hardware technologies that accelerate NVIDIA's RTX software stack, and hence Microsoft's DXR.

    Finally, from what I remember, part of the RTX software stack is sort of like a GameWorks library, which means that it sits on top of DXR and implements it for various rendering techniques developers can use.

    So, if I understand correctly, RTX is three things.. the software libraries around DXR, the software that implements DXR on NVIDIA's hardware, and a hardware designation for cards with technology to accelerate the software side of RTX.
    Reply
  • MadManMark - Wednesday, August 22, 2018 - link

    Elaborating further on these great points, there is also the DLSS technology. As I understand it this will allow the new (for GeForce) tensor cores to take some of the load off the RT calculations by using the AI to interpolate. I am wondering if tweaking that (how much is actual RT and how much is AI sampling to interpolate rays not traced) is ultimately not the key to making FPS etc with RTX more stable & adjustable. Reply
  • Santoval - Wednesday, August 22, 2018 - link

    "RTX is an implementation of the DXR API by NVIDIA"
    Not quite. Nvidia also developed the VK_NV_raytracing extension, which they offered to the Khronos Group back in early May. This extension is technically a ray-tracing API, quite similar to Microsoft's DXR.

    The point of the extension/API is for it to play along with Nvidia's RTX API, because contrary to how it might have looked in the presentation of the 20xx series, Nvidia does not want to be locked to a single graphics API vendor *and* a single OS.

    Their main concern is not of course non-Windows (largely Linux) gaming, but some game studios which have focused on Vulkan instead of DX12, particularly some (like Id Software) which support *only* Vulkan in their upcoming games (e.g. next year's Doom Eternal), or others which skipped DirectX 12 support in favor of Vulkan (e.g. Valve's Dota 2).

    On the other hand, Vulkan should be at least a few months behind DX12 in ray-tracing support. I have no idea, for instance, if the VK_NV_raytracing extension was officially adopted by the Khronos Group, or if it still remains "off-tree".
    AMD, on the other hand, has the Radeon Rays ray-tracing engine (targeted at content creators, of course, since they lack ray-tracing hardware), but I believe it works only with OpenCL, not Vulkan.

    What's important is that the VK_NV_raytracing extension will be able to be used from Day 1 of the 20xx series release, despite the Khronos Group's currently unclear support for ray-tracing.
    Reply
  • gijames1225 - Tuesday, August 21, 2018 - link

    "The demos didn’t clarify apples-to-apples performance differences between the GTX 1080 Ti and RTX 2080 Ti"

    This is what I am most curious about. I suspect that we're not going to see anything too great for just vanilla GPU performance and the big leap here is solely the addition of ray tracing hardware.
    Reply
  • CaedenV - Tuesday, August 21, 2018 - link

    Really? It has been 2 years since the last major card launch. I am expecting a 20-50% performance gain (especially for higher resolution displays) when using normal conventional graphics rendering methods.
    That said... this sort of confirms what was seen in the keynote: the demos in the keynote (and the previous Quadro keynote) were obviously not running RTX graphics at 1080p/60. More like 720p/30 in real time... and I think that is just where the technology is at today, and it will be ready for 1080p gaming in 2 years when the next gen cards come out, and 4K gaming when the next-next gen cards come out. This will be great for rendering things. Great for AI research 'on the cheap'. But simply not ready for prime time on modern games at modern resolutions. Just like CUDA on the 8800, it is a pretty cool technology, but it is going to be a while before it is useful.
    Reply
