18 Comments

  • mmrezaie - Friday, January 11, 2019 - link

    Such an ugly design, but I'm seeing no intake on the bottom? That's good news. If only they could do a better keyboard and less noise on the next XPS 15 (with no bottom intake), then I'd consider trying it again.
  • DanNeely - Friday, January 11, 2019 - link

    Yeah, the blowers appear to pull in from the side and exhaust out the back, or vice versa.
  • Opencg - Friday, January 11, 2019 - link

    Not a fan of thin Max-Q models. Even the "fat" laptops are hit or miss on throttling, and on top of that the default power limits for Max-Q are about half of normal. They all have hidden or undisclosed performance limits. For more than the price of another 2080 gaming laptop you get half the performance? Why? I'm waiting on the fat option. I heard it actually has a socketed CPU this time, though if I know Dell, it's just a stunt to win back the people they lost, and the system will still be about tricking consumers. Clevo for the win, I guess.
  • PeachNCream - Friday, January 11, 2019 - link

    Underside fan intakes suck (pun intended) for on-lap use, so I totally agree that pulling from the side is better for actually using a laptop on a lap. But that thin chassis, packed with all of that hot-running hardware, might still get too warm for on-lap use. We'll have to wait for a review that measures underside temps under load.
  • HStewart - Friday, January 11, 2019 - link

    I think the ugly nature of the device is basically an issue with all gaming notebooks lately, but this one looks a lot thinner than past ones. Alienware is under Dell, and Dell has done a lot of work on making laptops with better cooling. My XPS 13 2-in-1 is virtually quiet all the time, but the XPS 15 2-in-1 is much louder, though it still doesn't generate much heat; I'm not sure if that's an issue with the 4GHz CPU and/or the GPU in it. By this time next year, I'd bet we'll have a notebook on this level inside an XPS 15 2-in-1, and much quieter, especially with Sunny Cove. Hopefully the GPU will be better too.
  • wr3zzz - Friday, January 11, 2019 - link

    Doesn't the XPS 13 2-in-1 use the Y-series CPU, and isn't it fanless? Shouldn't it be quiet all the time, and not just virtually?
  • KateH - Friday, January 11, 2019 - link

    I have to question how well an RTX 2070 / 2080 is going to fare in a chassis like this... it looks to be the same thickness, with an almost identical cooling system, as my MSI GE72, and that struggles to cool an i7-6700HQ and a GTX 960M. I've read about Max-Q in an attempt to wrap my head around it, but every explanation I've seen boils down to "it's magic! trust Nvidia!" It looks to be more than just lower GPU clocks; is there super-strict binning going on there too?

    On a side note, I can't be the only one who would like to see more gamer/power-user-focused 17" Ultrabooks. Not "thin and light" like this or my MSI, but ultrabook as in under 3/4" at the thickest, a good display with full sRGB, an i7-8xxxU or (preferably) R7 3700U in 25W cTDP-up mode with enough cooling to stay at high turbo, and a Thunderbolt port for an eGPU. That would be my ideal notebook, but the only thing I know of that comes close is the LG Gram 17.
  • PeachNCream - Friday, January 11, 2019 - link

    Max-Q does seem like a branding label sitting atop a variety of different methods that reduce thermal load. I haven't seen a lot from NVIDIA on the specifics, but performance does tend to be a bit lower than the non-Max-Q variant of the same dGPU. As an aside, I think Max-Q branding is ultimately going to fade into obscurity since it looks like lots of OEMs are opting for the cooler/lower heat GPUs to the point that they might end up becoming the standard versions.

    As for your side note, you're not the only one. However, Ultrabooks are already rather expensive even without dedicated graphics. Adding gaming hardware and the associated price premium will probably result in niche systems that discourage a large portion of potential buyers. I think that limited appeal is what will ultimately hold OEMs back from doing more than dabbling in that realm.
  • Opencg - Sunday, January 13, 2019 - link

    Max-Q is about half the normal TDP limit, aka half the performance.
  • Spunjji - Monday, January 14, 2019 - link

    Why say something that can be so easily disproven? Have a look at the info on Notebookcheck, particularly for the 1060 and 1070 vs. their Max-Q variants. Going from the 1070 Max-Q to the standard 1070 gives you 18% more performance for a 33% increase in TDP.

    Max-Q designs run closer to the efficiency sweet-spot of the architecture. Personally I think they massively overcharge for that "privilege", but this is Nvidia and they have no competition in that market.
  • wr3zzz - Friday, January 11, 2019 - link

    An ultrabook by definition needs to be "thin and light"; otherwise it's just a regular notebook. The concept is mutually exclusive with the demands of a gaming notebook. I don't think there's any technical limitation stopping others from making something like the LG Gram 17, but historically the big firms haven't seen a big enough market in the thin-and-light 17" form factor to invest the resources into 17" ultrabooks.

    Personally I would like one as well. A 17" ultrabook would be quieter and more practical than a 15" because of easier cooling and more space for ports. I think ultrabooks should be either 13/14" and fanless, or 17" with all the ports.
  • Spunjji - Monday, January 14, 2019 - link

    I wouldn't place too much faith in how the cooling system looks. So far its stablemate, the m15, tests surprisingly well when it comes to cooling, beating out the Razer Blade and even its own thicker Alienware cousin. If this is similar but just larger, it ought to cope very well indeed.

    Max-Q is binning + reduced clock speeds + reduced voltage. The relationship between clock speed, voltage and power is non-linear so (to a point) fairly moderate drops in clocks and voltages can produce a disproportionately large drop in TDP.
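    To put rough numbers on that non-linearity, here's a toy sketch using the classic dynamic-power approximation P ∝ V² × f. The clock and voltage drops below are made-up illustrative values, not Nvidia's actual Max-Q figures:

    ```python
    # Toy illustration of why modest clock/voltage drops cut power disproportionately.
    # Assumes the classic dynamic-power relation P ~ C * V^2 * f; the specific
    # percentages are hypothetical, NOT real GPU specs.

    def relative_power(v, f, v0=1.00, f0=1.00):
        """Power relative to a baseline, assuming P scales with V^2 * f."""
        return (v / v0) ** 2 * (f / f0)

    # Drop voltage 10% and clocks 15% from an arbitrary baseline:
    p = relative_power(v=0.90, f=0.85)
    print(f"relative power: {p:.2f}")  # 0.81 * 0.85 = ~0.69, i.e. ~31% lower power
    ```

    So a roughly 15% performance sacrifice can plausibly buy a ~30% power reduction, which is the kind of trade the Notebookcheck numbers above reflect.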

    The problem is that in practice both Nvidia and OEMs use it as an excuse to price-gouge because these are "premium" products, and yeah, Nvidia's explanation of what they're actually doing is sorely lacking.
  • OP20 - Saturday, January 12, 2019 - link

    Very ugly. Why can't they mature and put these specs in the design language of the XPS 15? Lots of other people feel this way too, but no one is making a gaming laptop for us.
  • timecop1818 - Saturday, January 12, 2019 - link

    Hey Dell, what is QCOM paying you to include their garbage Ethernet and "gaming" WiFi? What was wrong with Intel GbE and WiFi?

    Please stop supporting this Killer snake oil. I'll never buy another Dell until this stuff is removed.
  • Opencg - Sunday, January 13, 2019 - link

    Well, you're probably just one of those people who assigns an ego to a brand and then runs with it. While Killer has had some bad cards for sure, Intel has some bad cards and major issues as well. In fact, when I last did the research and bought a number of WiFi cards to find the best one, it turned out to be the Killer: best signal, most consistent, virtually identical in LatencyMon, installed without the Killer driver suite, no issues. And this isn't only my opinion; in an in-depth look on a forum, the consensus was that the Killer card was best due to signal.
  • Spunjji - Monday, January 14, 2019 - link

    This has been confirmed objectively via various review sites, too. Some people just want to talk shit.
  • FXi - Sunday, January 13, 2019 - link

    I like it: a bit more cooling power, a bit more screen real estate, and not much more weight and size than the M15. I think the design works and brings the AW line closer to the popular designs of the XPS series. I agree cooling suffers, but they've made up a lot of ground by going dimensionally deeper. I'd bet if people looked at the real internal surface area of the coolers, they'd find it's not as different from the older, thicker series as you'd think; those were taller, but the fins were shallower. And when we go to 10nm you may not even notice the thinner design as much of an issue anymore. Look at all the 2-in-1s toting 2-4GHz quad cores.
    What would really do this some good would be a 17" OLED panel to match the 15.6" one going into the M15. OLED has issues, but the benefits in pixel response (better than a blinking backlight) to reduce blurring, along with greatly improved visuals and HDR that no longer needs a 500W backlight driven through an LCD, would more than offset the negatives. Of course you'd need Samsung to supply such an OLED panel (LG hasn't been interested yet). If it had OLED and a 9th-gen CPU, along with a 2070 or 2080, it'd be on my ordering list. For now this is a good design step, and the little visual cues that people do and don't like can be addressed. Cooling will be better than the M15's. I think many end users will be pleased. And it weighs half of what 17" gaming laptops did just a few years ago.
  • Altagon - Tuesday, January 15, 2019 - link

    Nice workstation; I'm concerned about the price tag...
