We previewed the performance of MSI's new GT72 notebook earlier this month, and while we're still running a few additional tests for the full review, one area we wanted to look at in more detail is BatteryBoost. Initially launched with the GTX 800M series earlier this year, the technology made its first appearance in our labs with the MSI GT70 and GTX 880M. Unfortunately, that laptop's battery life wasn't exactly stellar even when not gaming, and powering up the GTX 880M didn't help matters. NVIDIA's stated goal is to get useful gaming battery life above two hours, which so far we haven't been able to reach (and starting with a laptop that only manages 4-5 hours in our light and heavy Internet testing doesn't help).

Without BatteryBoost, the MSI GT70 managed around 50 minutes of battery life while gaming (give or take), while enabling BatteryBoost in some cases could get us up to 80+ minutes of battery life. More recently, we reviewed the updated Razer Blade (2014 edition) with a GTX 870M. We were able to see an improvement from 46 minutes without BatteryBoost to 76 minutes with BatteryBoost in our limited testing. However, if the goal is to get above two hours of gaming battery life, we're still not there.

Basically, gaming battery life largely depends on how high frame rates are without BatteryBoost and how low the target frame rate is set with BatteryBoost. If a game can run at 60FPS on battery power and BatteryBoost puts a 30FPS cap in place, battery life can improve a decent amount; a game that can hit 120FPS would potentially see an even larger benefit from the same 30FPS cap. With the GT72 and GTX 980M, both power efficiency and performance should be better than with the GT70 and GTX 880M, which means BatteryBoost has the potential to stretch its legs a bit more.
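To illustrate the basic idea of a frame rate cap, here's a minimal sleep-based limiter sketch. This is purely illustrative (the function name is ours): BatteryBoost operates at the driver level and can also manage GPU clocks and voltages, which a simple sleep loop doesn't capture, but the principle of idling out the unused portion of each frame interval is the same.

```python
import time

def run_frame_loop(render_frame, target_fps=30, duration_s=1.0):
    """Cap a render loop at target_fps by sleeping out the remainder
    of each frame interval. While the loop sleeps, the CPU/GPU can
    drop to lower power states instead of rendering extra frames."""
    frame_time = 1.0 / target_fps
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        t0 = time.perf_counter()
        render_frame()  # stand-in for the game's actual render work
        elapsed = time.perf_counter() - t0
        if elapsed < frame_time:
            time.sleep(frame_time - elapsed)  # idle = power saved
        frames += 1
    return frames
```

A game that could render 120FPS uncapped spends roughly three quarters of each 33ms interval sleeping under a 30FPS cap, which is where the potential battery savings come from.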

For our testing, we've picked three games and run them at reasonably high settings – but not maxed out settings, as that would generally prevent BatteryBoost from providing much if any benefit. Our goal was to use settings that would allow at least 70+ FPS on battery power. Keep in mind that while the GT72 can hit well over 60 FPS on AC power, on battery power some performance limitations are in effect even without BatteryBoost enabled. In the end, our three games consist of Tomb Raider at High quality, the newly released Borderlands: The Pre-Sequel at nearly maxed out settings (we left PhysX on Low), and GRID Autosport at High settings. Anti-aliasing was not used in any of the games (though FXAA was enabled in Borderlands), and the resolution was set to 1080p. The power profile was set to Balanced, with the LCD running at 200 nits.

One of the interesting things about BatteryBoost is that it allows you to target a variety of frame rates (from 30 to 60 FPS in 5 FPS intervals). NVIDIA has also stated that they're doing more than just frame rate targeting, so we wanted to test that claim by enabling VSYNC and running without BatteryBoost at a steady 60FPS. Since BatteryBoost doesn't inherently enable VSYNC, that was one more variable to test. (In theory, anything rendering between 30 and 60 FPS should drop to a steady 30FPS with VSYNC enabled, but at least in GRID Autosport that doesn't happen, whether due to triple buffering or some other factor.)
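The "should result in 30FPS" expectation above follows from how plain double-buffered VSYNC quantizes frame rates: each frame must occupy a whole number of refresh intervals, so the effective rate snaps down to an integer divisor of the refresh rate (60, 30, 20, 15... on a 60Hz panel). A quick sketch of that math (the function name is ours):

```python
import math

def vsync_effective_fps(refresh_hz, render_fps):
    """Effective frame rate under double-buffered vsync: a frame that
    isn't ready at one refresh waits for the next, so its display time
    rounds up to a whole number of refresh intervals."""
    render_time = 1.0 / render_fps
    refresh_interval = 1.0 / refresh_hz
    intervals = math.ceil(render_time / refresh_interval)
    return refresh_hz / intervals

# A renderer capable of 45 FPS on a 60 Hz panel lands at 30 FPS,
# since each ~22ms frame spans two 16.7ms refresh intervals.
```

Triple buffering sidesteps this quantization by letting the GPU keep rendering into a spare buffer, which is one plausible explanation for GRID Autosport's behavior.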

In the end, for at least one game – GRID Autosport – we tested both with and without VSYNC at 10FPS intervals with BatteryBoost, plus checking performance without BatteryBoost. That's ten different settings to test, and with each cycle requiring at least several hours we've been running BatteryBoost tests almost non-stop since our preview article. This is about the most effort we've ever put into testing gaming battery life on a laptop, and it might be a while before we decide to delve into this subject in such a fashion again. So join us as we thoroughly investigate BatteryBoost on the GTX 980M.

BatteryBoost: Gaming Battery Life x 3

  • WinterCharm - Thursday, October 23, 2014 - link

    So it's essentially V-sync to 30 fps :P
  • III-V - Thursday, October 23, 2014 - link

    It's a bit more than that. Read the article.
  • spencer_richter - Tuesday, November 25, 2014 - link

    It's not as good as the top laptops on the market (see the rankings at ). For example the ASUS ROG G750JM-DS71 is a lot better for gaming. http://www.consumer.com/
  • nathanddrews - Thursday, October 23, 2014 - link

    With regular, old, dumb v-sync, additional frames are still rendered by the GPU, but select frames are only delivered from the frame buffer when ready to be synchronized to the monitor - it's not very efficient. BatteryBoost attempts to render only 30fps (or whatever the target is) to save power, and appears to succeed... somewhat.
  • looncraz - Thursday, October 23, 2014 - link

    Not on my system for the games I've tried. VSync reduces my GPU usage while reducing frame rate. Then again, I've only tried a few games...

    But my own rendering engine accumulates (and even merges) changes until the next rendering time window, as directed by either the screen refresh or processing capability. (i.e. the render control thread doesn't initiate a frame render until the monitor can show it if VSync is enabled, or immediately once the last frame is completed if it is disabled). There just isn't a logical reason to do it any other way.
  • nathanddrews - Thursday, October 23, 2014 - link

    I wonder if power usage is at all related to the "pre-render max frames" setting?
  • OrphanageExplosion - Thursday, October 23, 2014 - link

    Assuming there are BatteryBoost profiles for each game, couldn't it simply be dialling down quality settings where you're not likely to be able to tell the difference between, say, high quality shadows and normal quality shadows?
  • JarredWalton - Thursday, October 23, 2014 - link

    Note that I did not run with the "recommended" BatteryBoost settings for the various games; I ran with specific settings and kept those constant. GeForce Experience does have suggestions that sometimes match my settings...and sometimes not. :-)
  • inighthawki - Thursday, October 23, 2014 - link

    By default, at least on Windows, this is not true. When vsync is enabled, frames are queued to be presented at a particular interval. They are never discarded. This queue has a max height - typically 3 frames, but normally configurable by the game. After three frames, any present calls by the game will be blocked on the thread until a VBlank occurs and a frame is consumed.

    It is possible to get the behavior you're referring to if the target operating system supports it, and the game uses triple buffering. In this case, you can have a front buffer being displayed while the other two back buffers are used in a ping-pong fashion. At the vblank, the OS can choose to use the most recently fully rendered frame. Windows chooses not to do this for the exact power reasons described above. The advantage of doing it this way is you reduce a minor amount of latency in exchange for keeping your GPU pegged at 100% utilization.
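A toy model of the "ping-pong" triple-buffer scheme described in the comment above (all names are hypothetical; real swapchains live in the OS and driver). The key property is that rendering never blocks, so completed frames that miss a vblank simply get discarded, keeping the GPU busy:

```python
from collections import deque

class TripleBufferedSwapchain:
    """Toy model: the renderer ping-pongs between two back buffers
    while a third is on screen; at each vblank the display takes the
    most recently completed frame, and older completed frames are
    dropped. Rendering never blocks, which is why this variant keeps
    the GPU pegged instead of saving power."""
    def __init__(self):
        self.completed = deque(maxlen=1)  # only the newest finished frame survives
        self.on_screen = None
        self.dropped = 0

    def present(self, frame_id):
        """Renderer finished a frame; overwrite any older pending frame."""
        if len(self.completed) == self.completed.maxlen:
            self.dropped += 1  # an already-finished frame is discarded
        self.completed.append(frame_id)

    def vblank(self):
        """Display refresh: scan out the newest completed frame."""
        if self.completed:
            self.on_screen = self.completed.popleft()
        return self.on_screen
```

Rendering two frames per refresh in this model means half the work is thrown away, which is exactly the wasted power the queued (blocking) Windows behavior avoids.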
  • nathanddrews - Friday, October 24, 2014 - link

    Since I have v-sync usually set to 96Hz, 120Hz, or 144Hz, I guess I never realize the power-saving benefits.
