G-SYNC Gaming with QHD at 144Hz

We've talked previously about G-SYNC and how it can provide a better experience for gaming, but one of the big limitations with G-SYNC on most monitors so far has been the maximum refresh rate of 60Hz. With the ASUS ROG Swift PG278Q, that particular limitation goes out the window as it can refresh at up to 144Hz. What this means is that for the vast majority of users, particularly when running at the native 2560x1440 resolution, your frame rates will no longer be limited by the refresh rate. If you have a beefy SLI rig, you could see frame rates of well over 100 FPS without ever having to turn off V-SYNC.

What that means in practice is that while 60 FPS is generally what you need for "smooth" gaming, you can now go well beyond that. There's certainly a case of diminishing returns, so by no means do we think 144Hz is absolutely required, but I've felt for a long time that 60Hz is limiting. Once we get to around 100Hz, though, the difference compared to 60Hz is clearly visible to my eyes. There's also the question of whether the pixel response time is fast enough to keep up with such high refresh rates, but ASUS has used a TN panel with a 1 ms response time and it seems to do the trick.

103 FPS, 103 Hz, No VSYNC, No Tearing

I mentioned in our last review of the Acer XB280HK that 4K gaming in practice tends to be too demanding for most GPUs right now, and with 2.25X as many pixels as QHD it's not hard to see why that's the case. By dropping the resolution to a more reasonable level, frame rates in most games effectively double – and in some cases, particularly if you exceed the amount of VRAM in your GPU, the difference in performance can be even more profound. Given the number of buffers being used in most games, plus post processing, anti-aliasing, high resolution textures, and other effects, I would say that you really need 6GB of VRAM per GPU in order to handle 4K gaming properly – and you also need faster GPUs to push that many pixels. QHD on the other hand tends to be just fine with 4GB VRAM, sometimes less.
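For reference, the 2.25X figure comes straight from the pixel counts; a quick sanity check:

```python
# Pixel counts behind the "2.25X" comparison: 4K UHD vs. QHD.
uhd_pixels = 3840 * 2160   # 8,294,400 pixels
qhd_pixels = 2560 * 1440   # 3,686,400 pixels

ratio = uhd_pixels / qhd_pixels
print(f"4K pushes {ratio:.2f}x the pixels of QHD")  # 2.25x
```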

One of the other issues you run into with 4K gaming and G-SYNC is that demanding games will frequently drop below 40 FPS. At that point the on-screen pixels begin to decay between refreshes and you can see noticeable flicker. That's one more reason to stick with a lower resolution, as staying above 40 FPS isn't as difficult, but there are other potential benefits. With a 144Hz maximum refresh rate, rather than only drawing a frame twice when the frame rate drops below 30 FPS, it's possible for G-SYNC to draw frames twice at anything below 72 FPS, at which point flicker shouldn't be an issue. It's not clear whether ASUS (or NVIDIA's G-SYNC) does this right now; the response when I asked was a cryptic "we are not releasing any implementation details on G-SYNC right now", which means it may be a future feature (and there's likely a bit of overhead in drawing a frame twice). It would be smart to at least draw twice at frame rates below 45 FPS, though, as that's when flicker starts to become a problem, and there's no reason a 144Hz display couldn't refresh twice (effectively 90Hz).
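Since NVIDIA isn't releasing implementation details, here is a purely hypothetical sketch of the repeat-draw policy described above; the 45 FPS flicker threshold and the function name are my own assumptions, not anything NVIDIA has confirmed:

```python
def panel_refresh_interval_ms(fps, panel_max_hz=144, flicker_floor_fps=45):
    """Hypothetical G-SYNC repeat-draw policy (not NVIDIA's published
    algorithm). Below the flicker threshold, draw each frame twice so
    the panel refreshes at 2x the game's frame rate, keeping pixel
    charge from decaying between refreshes."""
    frame_time_ms = 1000.0 / fps
    if fps < flicker_floor_fps and 2 * fps <= panel_max_hz:
        return frame_time_ms / 2  # two panel refreshes per game frame
    return frame_time_ms          # one refresh per frame, no repeat
```

Under this scheme a 40 FPS game would refresh the panel every 12.5 ms (effectively 80Hz), while a 100 FPS game refreshes once per frame at 10 ms intervals.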

If you’re wondering why this isn’t applicable to a 4K display, it’s because it's currently not practical to drive 4K resolutions at refresh rates above 60Hz. 60Hz already requires more bandwidth than a typical HDMI connection can deliver (though HDMI 2.0 would suffice), and even DisplayPort 1.2 with a maximum of 17.28 Gbit/s is pretty much tapped out (4Kp60 requires 15.9 Gbit/s). If you want to have higher refresh rates with 4K, DisplayPort 1.3 is required, which isn’t implemented on most displays yet. Of course there’s still that problem of trying to reach 60+ FPS, but with an 80Hz refresh rate you could potentially double up on redraws when the FPS is below 40 instead of below 30.
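The bandwidth math is easy to sketch. The calculation below counts only raw 24-bit pixel data; blanking intervals and link overhead push the real requirement higher, which is why the quoted 4Kp60 figure of 15.9 Gbit/s exceeds the raw number:

```python
def raw_data_rate_gbps(h, v, hz, bits_per_pixel=24):
    # Uncompressed pixel data only; blanking intervals and link
    # overhead push the real requirement higher than this.
    return h * v * hz * bits_per_pixel / 1e9

DP12_EFFECTIVE_GBPS = 17.28  # DisplayPort 1.2 payload rate after 8b/10b coding

print(raw_data_rate_gbps(3840, 2160, 60))   # ~11.9 Gbit/s: close to DP 1.2's ceiling
print(raw_data_rate_gbps(3840, 2160, 144))  # ~28.7 Gbit/s: would need DP 1.3
print(raw_data_rate_gbps(2560, 1440, 144))  # ~12.7 Gbit/s: QHD at 144Hz fits today
```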

Evolve at maximum quality and QHD resolution is still buttery smooth -- with GTX 970 SLI, of course.

Without belaboring the point, I can say that in the vast majority of circumstances I personally prefer the ASUS QHD 144Hz G-SYNC display over the Acer 4K 60Hz display. You can also reasonably run QHD at native resolution with 100% scaling and not have difficulties in Windows; unless you have eagle eyes, 4K on a 28-inch display will usually require a bit of scaling (125-150% for me and my poor old eyes). But are there any situations where I would actually prefer the 4K display?

In fact there are, but most of them involve multimedia use. Having the actual native resolution available for 4K video editing is always nice, and it goes without saying that watching 4K video content generally means you should have a 4K display – otherwise you just end up downscaling to your native resolution. If you sit close enough to the display (or your vision is good enough), the extra resolution can be useful for general Windows use as well. In Photoshop or other image editing software, you can work with a QHD image and not have scroll bars at 100% zoom, which is pretty cool. I would also say that anti-aliasing becomes less necessary in games at 4K, thanks to the high DPI, though there are still jaggies if you look for them.

One final note on the subject: there was some news last month where at first it appeared that G-SYNC laptops without a G-SYNC module might be possible. The reality ends up being a bit different. As PC Perspective reports, ASUS accidentally let an alpha driver with some G-SYNC support out to the public. While some thought this meant G-SYNC could be done on any notebook, that's not true – only the ASUS G751 line of notebooks seems to have worked with the leaked driver, and those have a display where G-SYNC was an option (also worth noting is that Optimus technology is not used on the G751JY).

Anyway, while G-SYNC did work in many instances using that leaked driver, there were problems when frame rates dropped too low, including the screen blacking out for half a second and other anomalies. If you’re wondering why the G-SYNC module is in desktop displays, that’s a big part of it right there: ensuring the experience actually works properly all of the time. And at least in my testing of the Acer XB280HK and ASUS PG278Q, it does exactly that. G-SYNC will almost certainly end up coming to laptops as well, but it will be in a slightly different form from the current desktop implementation, and the actual ETA is still unknown.

Comments

  • shonferg - Monday, February 16, 2015 - link

    I found the article here on AnandTech that gave me the impression that G-sync can do self-refresh:


    "You can only do so much with VBLANK manipulation though. In present implementations the longest NVIDIA can hold a single frame is 33.3ms (30Hz). If the next frame isn’t ready by then, the G-Sync module will tell the display to redraw the last frame."

    "Game/hardware/settings combinations that result in frame rates below 30 fps will exhibit stuttering since the G-Sync display will be forced to repeat frames"

    Of course, that article was about first gen, pre-release hardware, and I don't know if things have changed since that initial article.

    But if that's still the way it works, it sounds like it will only kick in if the frame rate is below 30 fps, and even then it's kind of dumb in that it waits the full 33 ms before re-showing the previous frame. So if the next frame is ready moments later, it will have to wait for the next refresh, causing a stutter.

    Unfortunately, it sounds like it wasn't doing anything smart like noticing frame rate is falling lower than a certain threshold and then doubling the frame rate to prevent the possibility of flicker and stutter. Seems like it needs the ability for the GPU to send a "redraw the last frame now" command for situations like that so that frame refresh can be doubled without doubling bandwidth requirements.
  • GameLifter - Friday, February 13, 2015 - link

    I got this monitor at launch and I'm still loving it. G-Sync is incredible, ULMB is incredible, the higher refresh rate makes a noticeable difference, and the color quality is very good for a TN panel. Heck, better than any TN panel I've seen.

    However, I did notice a dead pixel towards the top of the screen recently. It's not bad but I hope more don't start to show up. Back light uniformity is sub par but it's not very noticeable to me unless I have my lights off and the screen is black or a darker color.

    Overall I'm very pleased with this monitor and hopefully higher refresh rate panels and VRR technology become the norm.
  • pandemonium - Saturday, February 14, 2015 - link

    May as well remove the Input Lag from the reviews until you can produce some results for that. Every time I see that I get disappointed because that's a key metric for me.
  • cheinonen - Saturday, February 14, 2015 - link

    It's only missing on monitors that are DisplayPort only, which has only been the G-Sync models to this point. If we left the section out without the explanation, it would cause far more comments.
  • wyewye - Saturday, February 14, 2015 - link

    Why are you reviewing an year old stuff?
    What do you have to add compared to the other gazillion reviews of ROG Swift out there?

    Apparently nothing.
    Nothing about latency or input lag on a gaming monitor review.

    Really pathetic.
    Whats going on with you AnandTech? Severe budget cuts?
  • cheinonen - Saturday, February 14, 2015 - link

    Input lag was addressed in the piece. Since the ROG Swift runs at a resolution beyond a CRT, and has no HDMI input for a lag tester, there is no way to generate a reliable number for lag. I've seen numbers for it that indicate under 5ms when using SMTT, but SMTT stopped issuing licenses and ours expired, so I cannot use it to test anymore. If you have a way to measure the input lag that is reliable and accurate and works with DisplayPort, we'd love to know.
  • Slowking - Saturday, February 14, 2015 - link

    "Why are you reviewing an year old stuff?"

    I clicked on the article half hoping it contained more information on a forthcoming cheaper version of the Swift.
  • Achaios - Saturday, February 14, 2015 - link

    Honestly, I cannot see a difference between 60 Hz and 144 Hz, which leads me to assume that: 1. Either my eyes are defective or 2. Those who claim to see a difference between 60Hz and 144 Hz are lying.
  • snuuggles - Saturday, February 14, 2015 - link

    It's not your eyes, it's your brain. I guess it could be like being colorblind or something. In a way, it's an advantage to you because you'll never need to bother spending money on something like this :)
  • Murloc - Saturday, February 14, 2015 - link

    it's like being an audio peasant, you spare lots of money if you're content with desktop speakers.
    I've never tried a 144Hz monitor so the jury is still out for me.
