26 Comments

  • WinterCharm - Thursday, October 23, 2014 - link

    So it's essentially V-sync to 30 fps :P
  • III-V - Thursday, October 23, 2014 - link

    It's a bit more than that. Read the article.
  • spencer_richter - Tuesday, November 25, 2014 - link

    It's not as good as the top laptops on the market (see the rankings at ). For example the ASUS ROG G750JM-DS71 is a lot better for gaming.
  • nathanddrews - Thursday, October 23, 2014 - link

    With regular, old, dumb v-sync, additional frames are still rendered by the GPU, but select frames are only delivered from the frame buffer when ready to be synchronized to the monitor - it's not very efficient. BatteryBoost attempts to render only 30fps (or whatever the target is) to save power, and appears to succeed... somewhat.
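
As a rough illustration of that difference (not NVIDIA's actual implementation), a frame limiter simply refuses to start the next frame until the 1/30 s budget has elapsed, letting the GPU idle and clock down in between, whereas plain v-sync keeps rendering and just holds finished frames for the next refresh. A toy sketch in Python, with a hypothetical render_frame() standing in for the real work:

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def render_frame():
    """Placeholder for the actual CPU/GPU work for one frame."""
    pass

def frame_limited_loop(num_frames=90):
    # BatteryBoost-style pacing: render at most TARGET_FPS frames per second
    # and sleep for the rest of each budget so the GPU can drop to lower clocks.
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # idle time is where the power savings come from

frame_limited_loop()
```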
  • looncraz - Thursday, October 23, 2014 - link

    Not on my system for the games I've tried. VSync reduces my GPU usage while reducing frame rate. Then again, I've only tried a few games...

    But my own rendering engine accumulates (and even merges) changes until the next rendering time window, as directed by either the screen refresh or processing capability (i.e. the render control thread doesn't initiate a frame render until the monitor can show it when VSync is enabled, or immediately once the last frame is completed when it's disabled). There just isn't a logical reason to do it any other way.
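
A minimal sketch of that kind of render-control loop, assuming a hypothetical vsync event and a simple list of pending changes (a real engine would merge overlapping regions):

```python
import threading

class RenderController:
    """Toy model: accumulate changes and only start a frame when the
    display can actually show it (VSync on), or immediately after the
    previous frame completes (VSync off)."""

    def __init__(self, vsync_enabled=True):
        self.vsync_enabled = vsync_enabled
        self.vsync_signal = threading.Event()  # set once per refresh by a timing thread
        self.pending_changes = []              # dirty regions queued since the last frame
        self.lock = threading.Lock()

    def queue_change(self, region):
        with self.lock:
            self.pending_changes.append(region)  # a real engine would merge overlaps here

    def render_loop(self):
        while True:
            if self.vsync_enabled:
                self.vsync_signal.wait()   # don't render frames the monitor can't show
                self.vsync_signal.clear()
            with self.lock:
                changes, self.pending_changes = self.pending_changes, []
            if changes:
                self.render(changes)

    def render(self, changes):
        pass  # placeholder for the actual drawing/compositing work
```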
  • nathanddrews - Thursday, October 23, 2014 - link

    I wonder if power usage is at all related to the "pre-render max frames" setting?
  • OrphanageExplosion - Thursday, October 23, 2014 - link

    Assuming there are battery boost profiles for each game, couldn't it simply be dialling down quality settings where you're not likely to be able to tell the difference between, say, high quality shadows and normal quality shadows?
  • JarredWalton - Thursday, October 23, 2014 - link

    Note that I did not run with the "recommended" BatteryBoost settings for the various games; I ran with specific settings and kept those constant. GeForce Experience does have suggestions that sometimes match my settings...and sometimes not. :-)
  • inighthawki - Thursday, October 23, 2014 - link

    By default, at least on Windows, this is not true. When vsync is enabled, frames are queued to be presented at a particular interval. They are never discarded. This queue has a maximum depth - typically three frames, though it's configurable by the game. After three frames, any present call by the game will block the calling thread until a VBlank occurs and a frame is consumed.

    It is possible to get the behavior you're referring to if the target operating system supports it, and the game uses triple buffering. In this case, you can have a front buffer being displayed while the other two back buffers are used in a ping-pong fashion. At the vblank, the OS can choose to use the most recently fully rendered frame. Windows chooses not to do this for the exact power reasons described above. The advantage of doing it this way is you reduce a minor amount of latency in exchange for keeping your GPU pegged at 100% utilization.
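
A toy model of that queued-present behavior, assuming a hypothetical 60 Hz display thread and a three-deep queue; the real queue lives in the driver/OS, but this shows why the game's present call blocks once the queue is full:

```python
import queue
import threading
import time

MAX_QUEUED_FRAMES = 3          # typical default queue depth
REFRESH_INTERVAL = 1.0 / 60.0  # 60 Hz display in this toy model

present_queue = queue.Queue(maxsize=MAX_QUEUED_FRAMES)

def game_thread():
    frame = 0
    while True:
        frame += 1  # "render" the frame
        # put() blocks once MAX_QUEUED_FRAMES frames are waiting, so the
        # game stalls here until the next vblank consumes one.
        present_queue.put(frame)

def display_thread():
    while True:
        time.sleep(REFRESH_INTERVAL)  # wait for the next vblank
        shown = present_queue.get()   # scan out the oldest queued frame
        # (a "use the newest frame" triple-buffering scheme would instead
        #  drain the queue and keep only the most recent entry)

threading.Thread(target=game_thread, daemon=True).start()
threading.Thread(target=display_thread, daemon=True).start()
time.sleep(1)  # let the toy run briefly
```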
  • nathanddrews - Friday, October 24, 2014 - link

    Since I have v-sync usually set to 96Hz, 120Hz, or 144Hz, I guess I never realize the power-saving benefits.
  • inighthawki - Friday, October 24, 2014 - link

    Yeah, at such high framerates you often won't be at the max queue depth, so you get the illusion that it's always continuously rendering. But in this case you're really just rendering frames ahead. One nice advantage is that if you hit the queue depth, you'll actually get more consistently smooth motion, since the frame rate is more consistent. Having the game wake up consistently every vblank and render one frame provides a more fixed timestep for things like animation, compared to having a variable rate by rendering as fast as you can. Most people will likely never notice, though.

    It's unfortunate that Windows forces games into that model, since sometimes I'd love the triple buffering model instead. I like the lower latency of that mode, while also removing screen tearing. Given that I run a GTX 780 plugged into the wall socket, I'm not too concerned about the power savings - especially since I usually disable vsync anyway, so I'm not really wasting any more than normal.
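
On the fixed-timestep point above: waking once per vblank means every simulation update advances by the same amount, while an uncapped loop feeds the simulation a different delta each frame. A minimal sketch with hypothetical update/draw functions:

```python
import time

FIXED_DT = 1.0 / 60.0  # one refresh at 60 Hz

def update(dt):
    pass  # advance animation/physics by dt seconds

def draw():
    pass  # issue the frame's draw calls

def vsync_paced_loop(wait_for_vblank):
    # Wakes once per refresh; every update uses the same timestep,
    # so animation advances in perfectly even increments.
    while True:
        wait_for_vblank()
        update(FIXED_DT)
        draw()

def unthrottled_loop():
    # Renders as fast as possible; dt varies from frame to frame,
    # which can make motion look slightly less even.
    last = time.perf_counter()
    while True:
        now = time.perf_counter()
        update(now - last)
        last = now
        draw()
```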
  • HiTechObsessed - Friday, October 24, 2014 - link

    If what you're saying is true, battery life would decrease when turning on VSync... Looking at the results here, with BatteryBoost off, turning VSync on increases battery life.
  • thepaleobiker - Thursday, October 23, 2014 - link

    Yes, please read the article good sir.
  • limitedaccess - Thursday, October 23, 2014 - link

    Is there any actual difference in terms of thermal performance? Either lower temps and/or fan speed (fan noise)? I would assume that if the GPU itself is consuming significantly less power, its average heat output should be lower as well, putting less stress on the cooling system.

    As an extension of this, are you able to ask Nvidia to comment on whether or not it is technically possible to extend a variation of this to desktop GPUs, and if there is any plan to? This would enable the flexibility of building a system that is extremely low noise (or even passive) for certain gaming workloads yet still has performance on demand.
  • nevertell - Thursday, October 23, 2014 - link

    As there is less energy consumed, there is less energy dissipated. Ultimately, all energy that is used by any computer that isn't then used to power LEDs or displays will be turned into heat.
  • limitedaccess - Thursday, October 23, 2014 - link

    Yes I'm aware of the theory. However I am curious as to what the actual tested impact would be in this case and how significant (or insignificant) the difference might be.
  • Brett Howse - Thursday, October 23, 2014 - link

    When I tested the Razer Blade, I noticed a significant decrease in temperatures and of course noise when playing with Battery Boost enabled, which is what you would expect since it is working far less.
  • JarredWalton - Thursday, October 23, 2014 - link

    Yup. Running the GPU at lower clocks and reducing power consumed means the fans don't have to work as hard to keep the system cool. Targeting 30FPS, the GT72 is pretty quiet -- not silent, but not loud at all. I didn't take measurements (I'll try that for the final full review), but there's nothing too shocking: lower performance => less heat => less noise.
  • CrazyElf - Thursday, October 23, 2014 - link

    All in all, this new Battery Boost feature seems to deliver a modest incremental improvement in battery life. It's not as good as, say, the leap in performance per watt that Maxwell gave, but it's welcome nonetheless.

    The issue has always been that there's a tradeoff between size, mobility, and battery life, especially for a large hungry gaming GPU.

    Jarred, by any chance, are you aware that there are going to be variants of the GT72 with an IPS panel coming out in the coming months? They're already up for pre-order at many of the laptop sellers. The downside is there's a pretty big price premium.
  • sonicmerlin - Thursday, October 23, 2014 - link

    Don't these laptops have Nvidia Optimus and Haswell processors? Why is their non-gaming runtime so low despite their large batteries?
  • JarredWalton - Thursday, October 23, 2014 - link

    This particular laptop does not have Optimus; you can manually enable/disable the GPU, though it requires a reboot. Since I'm testing games on the GPU, however, I wanted to compare battery life gaming to battery life not gaming (but with the GPU still active). It looks like the 980M uses around 8W idle, give or take, so turning it off and using the HD 4600 will improve battery life into the 6 hour range.
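
As a back-of-the-envelope check (the battery capacity below is an assumption for illustration, not a figure from the article), an extra ~8 W of idle draw is roughly what separates a ~4-hour idle runtime from a ~6-hour one:

```python
# Rough sketch only; the pack capacity is assumed, not measured.
BATTERY_WH = 87.0      # assumed capacity for a GT72-class notebook
IGP_ONLY_HOURS = 6.0   # ballpark runtime with the 980M disabled
DGPU_IDLE_W = 8.0      # approximate idle draw of the GTX 980M

igp_only_draw = BATTERY_WH / IGP_ONLY_HOURS      # ~14.5 W total system draw
with_dgpu_draw = igp_only_draw + DGPU_IDLE_W     # ~22.5 W with the 980M idling
print(BATTERY_WH / with_dgpu_draw)               # ~3.9 hours with the dGPU active
```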
  • sonicmerlin - Tuesday, October 28, 2014 - link

    Given that these things have much larger batteries than ultrabooks, which can last significantly longer than 6 hours, you'd think they would get longer runtimes when using the IGP.
  • Krysto - Friday, October 24, 2014 - link

    Not a bad idea, this feature.
  • Calista - Friday, October 24, 2014 - link

    You can already have a decent gaming experience today with a 4-hour battery life, but you won't get it running a modern game at full tilt. We have the technology already; it's all about how the market works. More efficient components also allow for faster components, but those consume more energy, and we've come full circle. My advice: return to games made five years ago and they will run very well on an Intel GPU while giving long battery life.

    Long battery life/High framerates/Good graphics - feel free to pick two of those. But you will never get all three.
  • RoninX - Friday, October 24, 2014 - link

    Or carry a spare battery.

    I just bought a new MSI GT60 Dominator with the GTX 970M. The main reason I picked this over the smaller, lighter GS60 Ghost is that the GT60 comes with a removable 9-cell battery, whereas the GS60 has a non-removable 6-cell battery.

    I get over 2 hours of runtime with Borderlands: The Pre-Sequel at high settings, 30 fps, and 1920x1080. With a spare battery, that's over 4 hours, which is plenty for my primary use case for battery gaming (gaming while waiting for airline flights).

    I was also impressed with the GT60's full performance plugged into AC, which comes close to my desktop (i7-2700K with GTX 680) in 3DMark. The fan does sound a bit like a hovercraft when the CPU and GPU are running at full tilt, but I can live with that.
  • jann5s - Tuesday, October 28, 2014 - link

    I love these types of articles, thank you AT!

    If I may propose another topic: The visual impact of game quality settings (e.g. FSAA) compared to the cost in performance.
