This week, Futuremark unveiled Time Spy Extreme, a 4K version of the DX12 gaming benchmark released last July. Slotting into the 3DMark suite, Time Spy Extreme is available under the Advanced and Professional Editions on Windows, focusing on DX12 functionality on contemporary high-end graphics cards as well as high-core count processors. The original Time Spy was already more intensive than Fire Strike Ultra, and Time Spy Extreme brings that graphical punishment to 4K.

Where the original Time Spy scaled poorly on 10+ threads, Time Spy Extreme’s CPU test was redesigned for CPUs with 8 or more cores, and is additionally able to utilize the AVX2 and AVX-512 instruction sets. While the benchmark does not require a 4K monitor, the typical video memory demands are still present: Time Spy Extreme has a 4GB VRAM minimum, a step up from Fire Strike Ultra’s 3GB.

Otherwise, the sequence remains the same: the Time Spy explores a museum featuring artifacts and terraria from Futuremark’s other benchmarks, past and present. We covered the technical aspects when Time Spy was first released, and the underlying details remain the same. Ultimately, the Time Spy and Time Spy Extreme engine was built for DX12 from the ground up, and incorporates marquee DX12 features: asynchronous compute, explicit multi-adapter, and multi-threading. Like the original, Extreme was developed with input from Futuremark’s Benchmark Development Program, a group that includes AMD, Intel, and NVIDIA.

Time Spy Extreme will be released to the public on October 11. For the time being, an advance release has been made available to the press, and we’ve taken a quick look at the benchmark with a selection of modern cards.

Graphics Benchmarking Results

To preface, Time Spy Extreme’s CPU test is a simulation, and the benchmark measures average simulation time per frame. In other words, that score does not depend on the graphical rendering work at all. With that in mind, we are only reporting graphics subscores and framerates. In general, Time Spy and Time Spy Extreme benchmark scores are not comparable to one another.

Overall, on our 8-core Skylake X GPU test bench, we looked at 9 cards with async compute enabled and disabled: the Titan Xp, GTX 1080 Ti, GTX 1080, GTX 1070, GTX 980 Ti, RX Vega64, RX Vega56, RX 580, and R9 Fury X. We also checked the effect of enabling an 11.6GB HBCC memory segment on the Vega cards. NVIDIA’s 385.69 and AMD’s Radeon Software Crimson ReLive Edition 17.9.3 drivers were used. For the RX Vega cards, the default power profile (primary BIOS and 'Balanced' profile) was used.

3DMark Time Spy Extreme - 3840x2160 - Graphics Score

3DMark Time Spy Extreme - 3840x2160 - Graphics Test 1

3DMark Time Spy Extreme - 3840x2160 - Graphics Test 2

The results are largely unsurprising; as we have noted historically and in our RX Vega review, AMD's graphics performance benefits more from DX12 environments than from DX11. Despite having the bare minimum required 4GB of VRAM, the Fury X is able to hold its own. The RX 580 was never intended for 4K gaming, and so is expectedly unsuitable.

The divergent performance between the two graphics subtests reflects their particular workloads. As in the original Time Spy, Graphics Test 2 heavily features ray-marched volume illumination and has just under 3 times as many tessellation patches as Graphics Test 1.
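As context for how those two subtest framerates roll up into the single graphics score we report, 3DMark aggregates subtest results with a harmonic mean, which penalizes a card more for a weak subtest than an arithmetic average would. The sketch below illustrates the idea only; the scaling constant is hypothetical, not Futuremark's actual coefficient, which is documented in the 3DMark technical guide.

```python
def graphics_score(gt1_fps: float, gt2_fps: float, scale: float = 165.0) -> float:
    """Illustrative graphics-score aggregation.

    Combines the two graphics subtest framerates via a harmonic mean,
    so the slower subtest dominates the result. The `scale` constant
    here is a placeholder, not Futuremark's published coefficient.
    """
    harmonic_mean = 2.0 / (1.0 / gt1_fps + 1.0 / gt2_fps)
    return scale * harmonic_mean
```

The practical consequence: a card that runs Graphics Test 1 quickly but struggles with Graphics Test 2's ray-marched volumetrics scores closer to its worse result than a simple average of the two would suggest.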

Ultimately, Time Spy Extreme proves itself to be quite punishing, and if the benchmark were a game, only the 1080 Ti and Titan Xp would provide marginally playable framerates.

3DMark Time Spy Extreme - 4K - Graphics Score (Async On/Off)

3DMark Time Spy Extreme - 4K - Graphics Test 1 (Async On/Off)

3DMark Time Spy Extreme - 4K - Graphics Test 2 (Async On/Off)

With both Vega and Pascal, enabling asynchronous compute in Time Spy Extreme results in improved performance, and more so for the AMD cards. While not shown on the graph, the RX 580 and Fury X also benefit, while the 980 Ti regresses ever-so-slightly.

We also tested both RX Vega cards with an 11.6GB HBCC memory segment enabled. The scores differed by less than 2% compared to the default Async On scores, in line with previous reports of HBCC providing minimal benefits in games.

For more information on the subtests, the 3DMark technical guide has been updated with the specifics of Time Spy Extreme.

Availability and Purchasing

As mentioned earlier, Time Spy Extreme will be available to the public next Wednesday (10/11/17). It will be a free update for 3DMark Advanced and Professional Edition licenses purchased after July 14, 2016, as well as for anyone with older copies of 3DMark who have already purchased the Time Spy upgrade. For copies purchased before then, Time Spy Extreme will come with the purchase of the Time Spy upgrade.

Comments

  • MattMe - Wednesday, October 4, 2017 - link

    Is it just me or are there a whole lot of jaggies for 4k in that last screenshot? The lighting looks great, but a lot of the textures don't look very real. I know it's only a benchmark suite, just commenting really.
  • Communism - Wednesday, October 4, 2017 - link

    That's what happens when an engine tries to do almost everything in shaders and blur filters instead of actually rendering things properly.
  • MrSpadge - Wednesday, October 4, 2017 - link

    Blame the artists, not the engine for lack of texture detail.
  • Communism - Thursday, October 5, 2017 - link

    If you don't know the subject matter, then don't comment.
  • peterfares - Wednesday, October 4, 2017 - link

    If anti-aliasing is not on then it doesn't matter what resolution is being rendered, there will be jaggies. You'll only notice them if you get up real close at 100%.
  • Cyanara - Sunday, October 8, 2017 - link

    Are you looking at it on a 4k monitor? I'm pretty sure the general idea is that 4k monitors have such small pixels that you won't notice them and hence you don't need AA. But at pixel for pixel on a 1080p monitor, you're gonna notice the jaggies.
  • DanNeely - Wednesday, October 4, 2017 - link

    "Overall, on our 8-core Skylake X GPU test bench, we looked at 9 cards with and without async compute disabled"

    Does this mean you've finally sorted out all the problems that were interfering with doing game benchmarks on Skylake-X?
  • Ian Cutress - Wednesday, October 4, 2017 - link

    Nate doesn't seem to have issues, but he's only testing cards with one CPU. I'm a few thousand miles away from Nate, so it's not as easy as swapping a CPU in the lab. I have new motherboards coming, which might fix my issue. It might be my CPUs too - I sent our official sample to Nate, while I'm running ES samples. Ideally, I'd be debugging this issue, rather than dealing with launches on other platforms that are taking priority right now.
  • lucam - Wednesday, October 4, 2017 - link

    I heard Matrox is coming back with a solution that can compete with latest AMD and Nvidia. Look forward to seeing that.
  • BrokenCrayons - Thursday, October 5, 2017 - link

That's good news. I can finally replace the S3 ViRGE DX in my gaming PC and maybe get better performance. Up to now, even SLI Titans have just been too pedestrian to deliver the FPS necessary to keep up with my ViRGE in bleeding edge games like Descent Freespace. I'm thinking 640x480 is just too many pixels for only a pair of Titans.
