Intel’s Sandy Bridge i7-2820QM: Upheaval in the Mobile Landscape
by Jarred Walton on January 3, 2011 12:00 AM EST
You’re probably sick of me talking about Sandy Bridge in our notebook reviews, particularly since up to now I’ve been unable to provide any numbers for actual performance. Today, Intel takes the wraps off of Mobile Sandy Bridge and I can finally talk specifics. Notebooks have always been substantially slower than desktops, and prices for a given level of performance have been higher; that’s not going to change with the SNB launch, but the gap just got a lot narrower for a lot of users. The key ingredients are higher core clocks with substantially higher Turbo modes, an integrated graphics chip that more than doubles the performance of the previous generation (also with aggressive Turbo modes), and some additional architectural sauce to liven things up.
If you haven’t already done so, you’ll probably want to begin by reading Anand’s Sandy Bridge Architectural Overview, as well as our Desktop Sandy Bridge coverage. I’m not going to retread ground that he’s already covered, so the focus for this article is going to be solidly on the mobility aspects of Sandy Bridge. With notebooks now outselling desktops by almost two to one, it shouldn’t surprise anyone that a greater emphasis is being placed on the new mobile offerings. For starters, most of the mobile SNB chips are getting the full 12EU graphics core, rather than a trimmed down 6EU variant. Toss in all of the improved power management features and what we end up with is a fast-when-needed, power-friendly, and efficient chip. We’ll get to the benchmarks in a moment, but let’s start with a recap of the mobile Sandy Bridge lineup.
|Intel Mobile Sandy Bridge (Retail)|i7-2920XM|i7-2820QM|i7-2720QM|i7-2620M|i5-2540M|i5-2520M|
|---|---|---|---|---|---|---|
|Max SC Turbo|3.5GHz|3.4GHz|3.3GHz|3.4GHz|3.3GHz|3.2GHz|
|Max DC Turbo|3.4GHz|3.3GHz|3.2GHz|3.2GHz|3.1GHz|3.0GHz|
|Max QC Turbo|3.2GHz|3.1GHz|3.0GHz|N/A|N/A|N/A|
|Base GFX Freq.|650MHz|650MHz|650MHz|650MHz|650MHz|650MHz|
|Max GFX Freq.|1300MHz|1300MHz|1300MHz|1300MHz|1300MHz|1300MHz|
Up first, we have the retail SKUs for the quad-core and dual-core parts. Worth noting is that availability of the quad-core processors should start this week, but the dual-core and LV/ULV parts won’t show up for a few more weeks. The quad-core parts will also use a different BGA package than the dual-core parts. The above will be the most readily available Sandy Bridge parts, as well as the fastest offerings, but there are additional OEM and LV/ULV products as well.
|Intel Mobile Sandy Bridge (OEM)|i7-2635QM|i7-2630QM|i5-2410M|i3-2310M|
|---|---|---|---|---|
|Max SC Turbo|2.9GHz|2.9GHz|2.9GHz|N/A|
|Max DC Turbo|2.8GHz|2.8GHz|2.6GHz|N/A|
|Max QC Turbo|2.6GHz|2.6GHz|N/A|N/A|
|Base GFX Freq.|650MHz|650MHz|650MHz|650MHz|
|Max GFX Freq.|1200MHz|1100MHz|1200MHz|1100MHz|
We might get some of the above in OEM systems sent for review; if so, it will be interesting to see how much of an impact the trimmed clock speeds have on overall performance. The only mobile chip without support for Turbo Boost is the i3-2310M, so we’re curious how it compares with current-generation i3 processors. Sandy Bridge should still be faster clock-for-clock than Arrandale/Clarksfield, and pricing on OEM parts might get these down into some very affordable notebooks. We’ll have to wait and see.
|Intel Mobile Sandy Bridge (LV/ULV)|i7-2649M|i7-2629M|i7-2657M|i7-2617M|i5-2537M|
|---|---|---|---|---|---|
|Max SC Turbo|3.2GHz|3.0GHz|2.7GHz|2.6GHz|2.3GHz|
|Max DC Turbo|2.9GHz|2.7GHz|2.4GHz|2.3GHz|2.0GHz|
|Base GFX Freq.|500MHz|500MHz|350MHz|350MHz|350MHz|
|Max GFX Freq.|1100MHz|1100MHz|1000MHz|950MHz|900MHz|
What’s interesting to note about the ULV parts is that even the slowest i5-2537M (yeah, those model numbers are going to be easy to remember!) is clocked higher than the outgoing i7-640UM, with more aggressive Turbo modes and a 1W lower TDP. Perhaps we’ll see an M11x R3 with 400M (or 500M?) graphics and one of these ULV chips?
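For rough context, here’s a quick sketch of that comparison. The i7-640UM figures (1.2GHz base, 2.26GHz max Turbo, 18W TDP) are quoted from memory rather than from this article, so treat them as assumptions:

```python
# Rough comparison of the outgoing ULV flagship vs. the slowest SNB ULV part.
# i7-640UM specs below are assumed (1.2GHz base, 2.26GHz Turbo, 18W TDP).
old = {"name": "i7-640UM", "base": 1.2, "turbo": 2.26, "tdp": 18}
new = {"name": "i5-2537M", "base": 1.4, "turbo": 2.3, "tdp": 17}

base_gain = (new["base"] / old["base"] - 1) * 100    # percent higher base clock
turbo_gain = (new["turbo"] / old["turbo"] - 1) * 100  # percent higher max Turbo

print(f"{new['name']} vs {old['name']}: "
      f"+{base_gain:.0f}% base clock, +{turbo_gain:.0f}% max Turbo, "
      f"{old['tdp'] - new['tdp']}W lower TDP")
```

Even before counting Sandy Bridge’s clock-for-clock improvements, the new ULV part starts from a meaningfully higher base clock in a smaller power envelope.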
But enough about other products; let’s take a look at the preview system we received and see how this thing stacks up to the current generation notebooks. As this isn’t final hardware, we won’t be focusing all that much on the laptop design and features but will instead concentrate on performance. So, come meet our mobile Sandy Bridge test notebook.
skywalker9952 - Monday, January 3, 2011 - linkFor your CPU specific benchmarks you annotate the CPU and GPU. I believe the HDD or SSD plays a much larger role in those benchmarks than the GPU. Would it not be more appropriate to annotate the storage device used? Were all of the CPUs in the comparison paired with SSDs? If they weren't, how much would that affect the benchmarks?
JarredWalton - Monday, January 3, 2011 - linkThe SSD is a huge benefit to PCMark, and since this is laptop testing I can't just use the same image on each system. Anand covers the desktop side of things, but I include PCMark mostly for the curious. I could try and put which SSD/HDD each notebook used, but then the text gets to be too long and the graph looks silly. Heh.
For the record, the SNB notebook has a 160GB Intel G2 SSD. The desktop uses a 120GB Vertex 2 (SF-1200). W870CU is an 80GB Intel G1 SSD. The remaining laptops all use HDDs, mostly Seagate Momentus 7200.4 I think.
Macpod - Tuesday, January 4, 2011 - linkThe synthetic benchmarks are all run at turbo frequencies. The scores from the 2.3GHz 2820QM are almost the same as the 3.4GHz i7 2600K. This is because the 2820QM is running at 3.1GHz under Cinebench.
No one knows how long this turbo frequency lasts. Maybe just enough to finish Cinebench!
This review should be redone.
Althernai - Tuesday, January 4, 2011 - linkIt probably lasts forever given decent cooling so the review is accurate, but there is something funny going on here: the score for the 2820QM is 20393 while the score in the 2600K review is 22875. This would be consistent with a difference between CPUs running at 3.4GHz and 3.1GHz, but why doesn't the 2600K Turbo up to 3.8GHz? The claim is that it can be effortlessly overclocked to 4.4GHz so we know the thermal headroom is there.
JarredWalton - Tuesday, January 4, 2011 - linkIf you do continual heavy-duty CPU stuff on the 2820QM, the overall score drops about 10% on later runs in Cinebench and x264 encoding. I mentioned this in the text: the CPU starts at 3.1GHz for about 10 seconds, then drops to 3.0GHz for another 20s or so, then 2.9 for a bit and eventually settles in at 2.7GHz after 55 seconds (give or take). If you're in a hotter testing environment, things would get worse; conversely, if you have a notebook with better cooling, it should run closer to the maximum Turbo speeds more often.
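A quick back-of-the-envelope sketch shows why those steps settle out to roughly a 10% drop. The step durations are the approximations from above, and the 120-second run length is an assumption for illustration:

```python
# Rough model of the 2820QM's stepped Turbo throttling under sustained load,
# using the approximate step durations described above.
steps = [          # (clock in GHz, seconds spent at that clock)
    (3.1, 10),     # initial Turbo bin
    (3.0, 20),     # first step down
    (2.9, 25),     # second step down
]
steady_clock = 2.7     # settles here after ~55 seconds
run_length = 120       # assumed benchmark duration in seconds

burst_time = sum(t for _, t in steps)
avg_clock = (sum(c * t for c, t in steps)
             + steady_clock * (run_length - burst_time)) / run_length

print(f"average clock over the run: {avg_clock:.2f} GHz")  # ~2.8 GHz
print(f"drop vs. 3.1GHz peak: {1 - avg_clock / 3.1:.0%}")  # ~9%
```

The averaged clock works out to roughly 9% below the 3.1GHz peak, which lines up with the ~10% score drop on later runs; a longer run would push the average closer to the 2.7GHz floor.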
Macpod, disabling Turbo is the last thing I would do for this sort of chip. What would be the point, other than to show that if you limit clock speeds, performance will go down (along with power use)? But you're right, the whole review should be redone because I didn't mention enough that heavy loads will eventually drop performance about 10%. (Or did you miss page 10: "Performance and Power Investigated"?)
lucinski - Tuesday, January 4, 2011 - linkJust like any other low-end GPU (integrated or otherwise) I believe most users would rely on the HD3000 just for undemanding games in the category of which I would mention Civilization IV and V or FIFA / PES 11. This goes to say that I would very much like to see how the new Intel graphics fares in these games, should they be available in the test lab of course.
I am not necessarily worried about the raw performance, clearly the HD3000 has the capacity to deliver. Instead, the driver maturity may come out as an obstacle. Firstly one has to consider the fact that Intel traditionally has problems with GPU driver design (relative to their competitors). Secondly, should at one point Intel be able to repair (some of) the rendering issues mentioned in this article or elsewhere, notebook producers still take their sweet time before supplying users with new driver versions.
In this context I am genuinely concerned about the HD3000 goodness. The old GMA HD + Radeon 5470 combination still seems tempting. Strictly referring to the gaming aspect I honestly prefer reliability and a few FPS' missing rather than the aforementioned risks.
NestoJR - Tuesday, January 4, 2011 - linkSo, when Apple starts putting these in Macbooks, I'd assume the battery life will easily eclipse 10 hours under light usage, maybe 6 hours under medium usage ??? I'm no fanboy but I'll be in line for that ! My Dell XPS M1530's 9-cell battery just died, I can wait a few months =]
JarredWalton - Tuesday, January 4, 2011 - linkI'm definitely interested in seeing what Apple can do with Sandy Bridge! Of course, they might not use the quad-core chips in anything smaller than the MBP 17, if history holds true. And maybe the MBP 13 will finally make the jump to Arrandale? ;-)
heffeque - Wednesday, January 5, 2011 - linkYeah... Saying that the nVidia 320M is consistently slower than the HD3000 when comparing a CPU from 2008 and a CPU from 2011...
Great job comparing GPUs! (sic)
A more intelligent thing to say would have been: a 2008 CPU (P8600) with an nVidia 320M is consistently slightly slower than a 2011 CPU (i7-2820QM) with HD3000, don't you think?
That would make more sense.
Wolfpup - Wednesday, January 5, 2011 - linkThat's the only thing I care about with these, and as far as I'm aware, the jump isn't anything special. It's FAR from the "tock" it supposedly is, going by earlier AnandTech data. (In fact the "tick/tock" thing seems to have broken down after just one set of products...)
This sounds like it is a big advantage for me...but only because Intel refused to produce quad core CPUs at 32nm, so these by default run quite a bit faster than the last gen chips.
Otherwise it sounds like they're wasting 114 million transistors that I want spent on the CPU, whether it's more cache, more functional units, another core (if that's possible in 114 million transistors), etc.
I absolutely do NOT want Intel's garbage, incompatible graphics. I do NOT want the additional complexity, performance hit, and software headaches of Optimus or the like. I want a real GPU, functioning as a real GPU, with Intel's garbage completely shut off at all times.
I hope we'll see that in mid range and high end notebooks, or I'm going to be very disappointed.