Closing Thoughts

The bottom line is that BatteryBoost certainly improves battery life, though it does so at the cost of frame rates. Considering many console games target 30FPS, it's not a horrible solution, but gamers willing to fork out the money for a notebook with a GTX 980M are likely to pack along their AC adapter so that they can get every ounce of performance possible out of their notebook. At some point, I still want to see a gaming notebook that can deliver a decent gaming experience at 60FPS and high quality for more than two hours – and once we reach that level, I'm sure I'll want to see three or four hours of gaming battery life. It's the great thing about technology: there's always some further milestone to achieve.

The results of our testing also highlight another interesting possibility for BatteryBoost: G-SYNC. While no one has created a G-SYNC enabled notebook display (at least, not that I'm aware of), I personally find that 30FPS is a bit too choppy but 40+ FPS with G-SYNC can work very well. The amount of power needed to reach 60FPS tends to be a lot higher than what's needed for 40FPS, so at some point NVIDIA may want to work on G-SYNC notebooks. Of course, G-SYNC might draw a bit more power as well for the extra circuitry, and for now G-SYNC also means no Optimus Technology (unless NVIDIA can figure out a workaround), but I suspect NVIDIA will cross those bridges when the time is right.

I suppose since I'm here testing the GT72, I should also note that I really like the changes MSI made with this model compared to the previous GT70. The decision to forego Optimus is also proving to be interesting; I like the idea of automatically switching to the Processor Graphics in theory, but there are definitely times when it gets in the way. For instance, I was just testing Civilization: Beyond Earth performance; none of the Optimus-enabled laptops would let me connect an external 4K display over DisplayPort and run it (most likely due to a bug in either the Intel or NVIDIA drivers, though I'd lean towards Intel). What's more, I can't add a custom 2560x1440 resolution through the Intel drivers, because that "exceeds the available bandwidth" – never mind the fact that 3840x2160 @ 60Hz works fine.

The full review of the GT72 will post next week, but if you're looking for a short verdict, I really like the notebook. It's expensive, and the battery is no longer externally accessible (so you can't carry two or three batteries with you, though I don't know many people that ever do that). Overall, however, the design is much better looking, performance is great, and the dual cooling fans are definitely doing their job. When the IPS panels arrive, this will be one awesome notebook.

26 Comments

  • WinterCharm - Thursday, October 23, 2014 - link

    So it's essentially V-sync to 30 fps :P
  • III-V - Thursday, October 23, 2014 - link

    It's a bit more than that. Read the article.
  • nathanddrews - Thursday, October 23, 2014 - link

    With regular, old, dumb v-sync, additional frames are still rendered by the GPU, but select frames are only delivered from the frame buffer when ready to be synchronized to the monitor - it's not very efficient. BatteryBoost attempts to render only 30fps (or whatever the target is) to save power, and appears to succeed... somewhat.
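The render-only-what-you-need approach described above can be sketched as a simple frame-pacing loop (a hypothetical illustration in Python, not NVIDIA's actual implementation): render one frame, then sleep until the next 1/30 s deadline instead of rendering again immediately. The idle time between frames is where the power savings come from.

```python
import time

def run_capped(render_frame, target_fps=30, duration_s=1.0):
    """Render at most target_fps frames per second; the sleep between
    frames is time the GPU/CPU can spend in a lower power state."""
    frame_time = 1.0 / target_fps
    rendered = 0
    start = time.perf_counter()
    next_deadline = start
    while time.perf_counter() - start < duration_s:
        render_frame()
        rendered += 1
        next_deadline += frame_time
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # idle instead of rendering extra frames
    return rendered

# Half a second at a 30FPS cap yields roughly 15 frames, no matter how
# fast render_frame itself is.
frames = run_capped(lambda: None, target_fps=30, duration_s=0.5)
```

By contrast, plain v-sync as described above still lets the GPU render ahead at full tilt; it only gates when frames are shown.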
  • looncraz - Thursday, October 23, 2014 - link

    Not on my system for the games I've tried. VSync reduces my GPU usage while reducing frame rate. Then again, I've only tried a few games...

    But my own rendering engine accumulates (and even merges) changes until the next rendering time window, as directed by either the screen refresh or processing capability. (i.e. the render control thread doesn't initiate a frame render until the monitor can show it if VSync is enabled, or immediately once the last frame is completed if it is disabled). There just isn't a logical reason to do it any other way.
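The accumulate-and-merge pattern described in that comment can be sketched roughly like this (a hypothetical toy model, not the commenter's actual engine): pending changes coalesce until the next render window, so at most one frame is built per refresh.

```python
class CoalescingRenderer:
    """Toy model of accumulate-until-render-window: changes merge while
    waiting, so repeated updates to one region cost only one frame."""
    def __init__(self):
        self.pending = {}        # latest change per region wins (merge)
        self.frames_rendered = 0

    def submit_change(self, region, value):
        self.pending[region] = value   # newer change replaces older one

    def render_window(self):
        """Called once per vblank (VSync on), or as soon as the previous
        frame completes (VSync off)."""
        if not self.pending:
            return None                # nothing changed; skip the frame
        frame = dict(self.pending)     # draw all merged changes at once
        self.pending.clear()
        self.frames_rendered += 1
        return frame

r = CoalescingRenderer()
for i in range(10):
    r.submit_change("status_bar", i)   # ten updates to the same region...
frame = r.render_window()              # ...collapse into a single frame
```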
  • nathanddrews - Thursday, October 23, 2014 - link

    I wonder if power usage is at all related to the "pre-render max frames" setting?
  • OrphanageExplosion - Thursday, October 23, 2014 - link

    Assuming there are BatteryBoost profiles for each game, couldn't it simply be dialling down quality settings where you're not likely to be able to tell the difference between, say, high quality shadows and normal quality shadows?
  • JarredWalton - Thursday, October 23, 2014 - link

    Note that I did not run with the "recommended" BatteryBoost settings for the various games; I ran with specific settings and kept those constant. GeForce Experience does have suggestions that sometimes match my settings...and sometimes not. :-)
  • inighthawki - Thursday, October 23, 2014 - link

    By default, at least on Windows, this is not true. When vsync is enabled, frames are queued to be presented at a particular interval. They are never discarded. This queue has a maximum depth - typically 3 frames, but normally configurable by the game. After three frames, any present call by the game will block the calling thread until a VBlank occurs and a frame is consumed.

    It is possible to get the behavior you're referring to if the target operating system supports it, and the game uses triple buffering. In this case, you can have a front buffer being displayed while the other two back buffers are used in a ping-pong fashion. At the vblank, the OS can choose to use the most recently fully rendered frame. Windows chooses not to do this for the exact power reasons described above. The advantage of doing it this way is you reduce a minor amount of latency in exchange for keeping your GPU pegged at 100% utilization.
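The blocking present queue described in that comment can be modeled with a small deterministic sketch (hypothetical, illustrative only - not the actual Windows presentation code): once the queue holds its maximum of 3 frames, the next present stalls until a vblank consumes one.

```python
from collections import deque

class PresentQueue:
    """Toy model of a blocking present queue: frames accumulate up to
    max_depth; a present against a full queue must wait for a vblank."""
    def __init__(self, max_depth=3):
        self.max_depth = max_depth
        self.queue = deque()
        self.blocked_presents = 0

    def present(self, frame):
        if len(self.queue) == self.max_depth:
            self.blocked_presents += 1   # game thread would stall here...
            self.vblank()                # ...until the display takes a frame
        self.queue.append(frame)

    def vblank(self):
        if self.queue:
            self.queue.popleft()         # display consumes the oldest frame

pq = PresentQueue(max_depth=3)
for f in range(5):      # game renders faster than the display refreshes
    pq.present(f)
# The first 3 presents fill the queue; presents 4 and 5 each had to
# wait for a vblank, which is exactly where the game thread idles.
```

This stall is what lets the GPU rest; the triple-buffered "use the newest frame" variant described above trades that idle time away for slightly lower latency.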
  • nathanddrews - Friday, October 24, 2014 - link

    Since I have v-sync usually set to 96Hz, 120Hz, or 144Hz, I guess I never realize the power-saving benefits.
