
  • sabot00 - Tuesday, March 6, 2012 - link

    How long will Intel keep up its HD Graphics performance increases?
  • MonkeyPaw - Tuesday, March 6, 2012 - link

    I don't understand the logic of selling a high-end CPU with the best IGP. Seems like anyone running one isn't going to stick with the IGP for games, and if they aren't gaming, then what good is that high-end GPU? Maybe the entire "Core i" line should use the HD 4000.
  • Flunk - Tuesday, March 6, 2012 - link

    Because the low-end chips are just die-harvested high-end chips, it makes sense. No reason to disable it, so they leave it on.

    And some people do actually use high end processors with IGPs. It's fairly easy to get one from a major OEM. It's stupid but most people don't know any better.
  • aahkam - Tuesday, March 27, 2012 - link

    Funny comments I see here.

    What's wrong with a high-end CPU that comes with an IGP?

    Is high-end CPU = gaming machine CPU? If that's your logic, you're a rich but shallow boy!

    I do lots of video editing and transcoding, so I need a high-end CPU, and none of the high-end GPUs beat Quick Sync in transcoding in terms of quality and speed.
  • dqniel - Friday, April 6, 2012 - link

    "and if they aren't gaming, then what good is that high-end GPU?"

    I feel like you missed that part. He's not saying that only gamers use high-end CPUs. He's saying that gamers using a high-end CPU won't care about the high-end iGPU because they won't use it. Also, non-gamers who need a high-end CPU generally won't see the benefits of the included high-end iGPU. So, he proposes that the better niche for the high-end iGPU would be on the more affordable CPUs, because then budget-minded gamers could buy an affordable CPU that has a relatively powerful iGPU integrated into it.
  • defter - Wednesday, March 7, 2012 - link

    This is a mid-range CPU, not a high-end one.

    High-end desktop CPUs (i7 3800-3900 series) don't have an IGP.
  • KoolAidMan1 - Wednesday, March 7, 2012 - link

    It is because laptops continue to get slimmer and slimmer. The point is getting good GPU performance without the chassis compromises that a dedicated GPU would force.
  • Tormeh - Wednesday, March 7, 2012 - link

    This.

    My next laptop will have processor graphics for the sake of battery life and size, and whoever has the best graphics gets my money.
  • bznotins - Wednesday, March 7, 2012 - link

    Seconded.
  • aguilpa1 - Wednesday, March 7, 2012 - link

    If the HD 4000 is on the level of, let's say, a 560M, I would not hesitate to get a laptop with no dedicated graphics, but if it isn't, I'm still going to go for the dedicated.
  • niva - Wednesday, March 7, 2012 - link

    I think it's a long way from approaching 560M performance. If you're going to do any remotely serious gaming on a laptop, it's still best to get a dedicated graphics card.

    I'm still sticking to gaming on a tower, so these CPUs (especially the AMD Llano) make sense for me in laptops. I don't ever see myself gaming on a laptop unless I completely get rid of the towers in my house... which won't happen anytime soon (if ever).
  • pepperoni - Wednesday, March 7, 2012 - link

    I felt the same way when I was shopping recently. I WANTED to buy a Llano-based notebook (inexpensive, better graphics vs. Intel). The problem is there's no such thing as a slim and light Llano. Every OEM sticks you with the same configuration: six pounds and 15.6" turd-768 resolution screen. It's bizarre.

    For the sake of competition, I hope Trinity will get some better design wins.
  • CeriseCogburn - Sunday, January 27, 2013 - link

    If you look at the gaming charts, the resolution may go past x768, but the settings are on LOW, and they don't give us a minimum frame rate, so the answer is:
    low-end, low-res is all that Llano can handle.
    So AMD forces the giant six-pound monster on you as a selling point.
  • poached - Wednesday, April 18, 2012 - link

    so AMD?
  • Demon-Xanth - Wednesday, March 7, 2012 - link

    I agree with you there. To get one of those "$100 mid range GPUs" in a laptop, you need to bump up the cost by around $400 just to reach a model that can even have one. Most laptops currently do not have discrete GPUs.

    I am glad to see that integrated graphics from both Intel and AMD can now be compared with low-end cards like the GT 520 and GT 440 without it becoming a laugh, and that they are actually completing the tests now. That is a rather major step. I remember some reviews of integrated graphics that resulted in a lot of either "could not complete" or "the bar is too small to fit a number on" entries.
  • Azethoth - Wednesday, March 7, 2012 - link

    The IGP provides the QuickSync implementation. It would be insane not to include the silicon for it on the high-end system. In addition, moving forward you can get compute work out of the GPU, so why would you ever not include it?
  • danjw - Wednesday, March 7, 2012 - link

    Quick Sync, Intel's video transcoding feature, is based in the GPU. This is important to a lot of users.
  • sweetspot - Wednesday, March 7, 2012 - link

    Well, these also make for nice office machines, so businesses upgrading their desktop workstations.

    When you have thousands of employees on a desktop refresh, these are a decent option, since they are not gaming at work (right, lol).
  • Taft12 - Sunday, March 18, 2012 - link

    Hardly any large corporations buy desktops anymore. Maybe for the call centre employees, that's about it.
  • AFUMCBill - Wednesday, March 7, 2012 - link

    Because gaming isn't the only thing that uses graphics cards. For instance, more and more video editors use the graphics card for doing video decode/encode/applying effects. So having a high performance graphics engine to go along with the high performance CPU can be a really nice thing.
  • Mithan - Wednesday, March 7, 2012 - link

    Most gamers don't spend money on the i7 lineup, preferring to buy the Core i5 series and invest the extra money into more GPU.
  • just4U - Thursday, March 8, 2012 - link

    It's not only the extra money. Apparently the 2500K does better in a fair number of games than the 2600K (in part, I think, due to Hyper-Threading), and the graphs seem to support that (although maybe not for the reason I mentioned).

    Looking at the 3700 series, though, it beats out both the 2500K and 2600K, so I think that one is going to be of special interest to gamers moving forward.
  • auvrea - Monday, November 19, 2012 - link

    bump
  • nuha_te10 - Tuesday, March 6, 2012 - link

    I'm afraid the next one, Haswell, will be a tock-minus.
  • Arnulf - Wednesday, March 7, 2012 - link

    Why?

    If the statement "the significant gains we're seeing here with Ivy will pale in comparison to what Haswell provides" is true, then I'm looking forward to Haswell very much. I'll finally be able to dump the discrete GPU, as I only use relatively modest display resolutions, and instead pour the money into an even quieter cooling solution. Silence, sweet silence :)
  • Articuno - Tuesday, March 6, 2012 - link

    Nice to see AMD winning where it actually matters for most consumer applications.
  • Exodite - Wednesday, March 7, 2012 - link

    Browsing?
  • Fujikoma - Wednesday, March 7, 2012 - link

    You crack me up... that was truly a funny response.
    Seriously though, Llano isn't that bad for a generic/cheap build. I picked one up to build a machine for my mom. The mobo and the processor were justified by the price. I knew it wouldn't be powerful, but it's fairly energy efficient, has decent graphics, and the money I saved went toward the SSD. Most people I build/fix computers for don't come close to using them to their potential, so price becomes the biggest factor. Would I buy one for myself? No, I'll stick with the i7 I currently have, and when I build my next machine, it looks like it'll be an Intel also.
  • Exodite - Wednesday, March 7, 2012 - link

    Thanks. :)

    I quite agree, Llano is awesome for what it does and provides an excellent platform for most users.

    I'm just tired of the assumption that GPU grunt is more powerful than CPU.

    It's true that most of my friends and family would be perfectly happy with the GPU muscle in a Llano chip. That said, they'd also be perfectly happy with the iGPU in something like an i3 2100.

    As for myself I'm using an i7 2600K, running at stock, but then I have somewhat different requirements.

    I'd not hesitate to recommend either Llano or Intel chips with iGPU solutions; it all depends on the person, really.
  • Exodite - Wednesday, March 7, 2012 - link

    ...'GPU grunt is more important than CPU'... would probably read better, looking back.

    Ah well, you get the point I'm sure.
  • retrospooty - Wednesday, March 7, 2012 - link

    "Nice to see AMD winning where it actually matters for most consumer applications."

    I don't see how you can look at these (or any) benchmarks and call it a win for AMD. Intel is smoking them. A few useless integrated graphics benchmarks and you call it a win? Hey, I hear RIM is looking for a new PR rep; they could really use a guy like you. ;)
  • juampavalverde - Tuesday, March 6, 2012 - link

    IGP performance is nice, but there are no comments about the subjective quality. I have seen HD Graphics 2000 vs. a Radeon IGP side by side, and the graphics quality was night and day, with the Radeon being the day...
    I don't know what's needed to do integrated graphics properly, but it seems Intel still lacks it...
  • nuha_te10 - Tuesday, March 6, 2012 - link

    Yes, people's standards are different. For gamers the Intel IGP might suck, but it's more than enough for me. If I bought Llano, the graphics core would just be wasted silicon because I don't really do gaming. Buy one only if you need one.
  • lowenz - Tuesday, March 6, 2012 - link

    OK, DirectCompute is supported by the GPU; we see the fluid benchmark in the review.

    But is the GPU OpenCL 1.1/1.2 compliant?
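
    For reference, a minimal C sketch of how one could check this once drivers are out, using the standard OpenCL host API (clGetPlatformIDs, clGetDeviceIDs, and clGetDeviceInfo with CL_DEVICE_VERSION); grabbing the first platform/GPU and skipping fuller error handling are simplifying assumptions:

        #include <stdio.h>
        #include <CL/cl.h>

        int main(void) {
            cl_platform_id platform;
            cl_device_id device;
            char version[256];

            /* Take the first platform and its first GPU device. */
            if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS ||
                clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS) {
                fprintf(stderr, "No OpenCL GPU device found\n");
                return 1;
            }

            /* CL_DEVICE_VERSION returns a string like "OpenCL 1.1 <vendor info>",
               which answers the compliance question directly. */
            clGetDeviceInfo(device, CL_DEVICE_VERSION, sizeof(version), version, NULL);
            printf("Device OpenCL version: %s\n", version);
            return 0;
        }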
  • ltcommanderdata - Tuesday, March 6, 2012 - link

    You mentioned in your intro about the Intel-Apple exclusivity agreement being up and Apple constantly pushing Intel for better GPU performance. Do you think Ivy Bridge has made sufficient gains in GPU performance to keep Apple on board? Have you had a chance to test Ivy Bridge's IGP OpenCL performance since that seems like a particular area of interest for Apple?
  • tipoo - Tuesday, March 6, 2012 - link

    I think it's certain that they will. They chose a weaker CPU in favour of a stronger IGP before (the 9400M and Core 2 Duo), but now we're at a point where the HD 4000 would be more than adequate for Mountain Lion and probably onwards. Plus, Intel is way ahead with 22nm and the resulting power draw as well as CPU performance, and I think Apple uses Quick Sync for AirPlay, which is Intel-only.
  • Exodite - Wednesday, March 7, 2012 - link

    It really depends on your workload.

    Personally I need CPU grunt far more than GPU grunt, which I suppose means I wouldn't even strictly need the HD 4000-class GPU.

    But it's 'free' so I'll take it. :)
  • tipoo - Tuesday, March 6, 2012 - link

    The top-end models, which would probably be paired with a discrete card, get decent integrated graphics, while the low-end ones, which will probably be standalone, get cut-down IGPs. Odd. If anything, I think on the top end people would want models with less space used on integrated graphics, with that headroom used for higher clocks or lower prices; even the cut-down IGPs can do Quick Sync.

    Also, a suggestion for the full review: we know pretty much what to expect from the HD 4000 performance-wise, but what about image quality? AMD and Nvidia improved things generation after generation, and I doubt Intel got it right with their first or second serious foray into lower-midrange graphics.
  • lowenz - Tuesday, March 6, 2012 - link

    Apple?
    If the IGP supports OpenCL as well as DirectCompute, there's no more reason for an AMD APU for pro users (as opposed to gamers).
  • tipoo - Tuesday, March 6, 2012 - link

    Seriously, this is too much. It's fine that you have an opinion, and I might not have a problem with it if you posted it once, but you post the same damn thing on every article whether it's related or not, and usually multiple times. Someone please do us all a favour and ban this guy and delete his comments?
  • jjj - Tuesday, March 6, 2012 - link

    CPU perf is pretty much as expected; GPU perf is somewhat disappointing. I thought they'd at least aim to match Llano, but I guess it's OK for 1MP laptop screens if the mobile parts perform close enough (and there are a couple of big ifs when it comes to image quality and drivers).
    Any opinions yet about Quick Sync encoding quality?
  • wifiwolf - Tuesday, March 6, 2012 - link

    And we should note that's comparing the 2600 with the 3700, which have different CPUs too.
    Other benchmarks had significantly better results on the 3700 than the 2600.

    So Anand, how do you know that difference is attributable to a GPU improvement and not to the CPU?
  • IntelUser2000 - Tuesday, March 6, 2012 - link

    You are not being serious, are you? The CPU gained 10% in CPU-sensitive benchmarks and the GPU gained 40-60%. Even taking out that 10%, it's still 30-50% - and even that overstates the CPU's contribution, as games aren't as sensitive to CPU changes as applications are.
  • wifiwolf - Wednesday, March 7, 2012 - link

    Look at the Crysis or Metro benchmarks and tell me where you find that improvement, at least beyond what you'd attribute to the CPU difference.
  • mosu - Wednesday, March 7, 2012 - link

    I've tried it on some HD clips at a local TV station, and on a big screen it really sucked. It's way behind AMD. We used an HP EliteBook 8460p laptop.
  • Articuno - Tuesday, March 6, 2012 - link

    At least AMD's products are HD capable.
  • dr/owned - Thursday, March 8, 2012 - link

    My 5-year-old laptop with a shared-RAM GPU is "HD capable". GTFO, noob.
  • Articuno - Tuesday, March 6, 2012 - link

    Billions in R&D, double the MSRP, half the power draw, and yet it still can't play Crysis better than Llano, which will be replaced by Trinity in a few weeks. What a crying shame.
  • travbrad - Tuesday, March 20, 2012 - link

    Not playing Crysis sounds like a good thing to me.
  • tipoo - Tuesday, March 6, 2012 - link

    Source or gtfo. Apple got the stock HD 3000, why would this be different?
  • tipoo - Wednesday, March 7, 2012 - link

    Thankfully the comments of a certain troll were removed so mine no longer makes sense, for any future readers.
  • Articuno - Tuesday, March 6, 2012 - link

    Just like how overclocking a Pentium 4 resulted in it beating an Athlon 64 and had lower power consumption to boot-- oh wait.
  • SteelCity1981 - Tuesday, March 6, 2012 - link

    That's a stupid comment only a stupid fanboy would make. AMD is way ahead of Intel in the graphics department and is very competitive with Intel in the mobile segment now.
  • tipoo - Tuesday, March 6, 2012 - link

    Your comments would do nothing to inform regular readers of sites like this; we already know more. So please, can it.
  • tipoo - Tuesday, March 6, 2012 - link

    Not what I asked, little troll. Give a source that says Apple will get a special HD 4000 like no other.
  • Operandi - Tuesday, March 6, 2012 - link

    What are you talking about? As long as AMD has a better iGPU, there is plenty of reason for them to be a viable choice today. And if their iGPU gaming performance holds up against Intel, there is more than just hope of them getting back in the game in terms of high-performance compute tomorrow.
  • tipoo - Tuesday, March 6, 2012 - link

    I'm pretty sure even 16x AF has a sub-2% performance hit on even the lowest end of today's GPUs; is it different with the HD Graphics? If not, why not just enable it like most people would? Even on something like a 4670 I max out AF without thinking twice about it; AA still hurts performance, though.
  • IntelUser2000 - Tuesday, March 6, 2012 - link

    AF has a greater performance impact on low-end GPUs. Typically it's about 10-15%. It's less on the HD Graphics 3000 only because its 16x AF really works at much lower levels. It's akin to having the option for 1280x1024 resolution but performing like 1024x768, because it looks like the latter.

    If Ivy Bridge improved AF quality to be on par with AMD/Nvidia, the performance loss should be similar as well.
  • tipoo - Wednesday, March 7, 2012 - link

    Hmm, I did not know that. What component of the GPU is involved in that performance hit (shaders, ROPs, etc.)? My card is fairly low-end and 16x AF performs nearly no differently than 0x.
  • Exophase - Wednesday, March 7, 2012 - link

    AF requires more samples in cases of high anisotropy, so I guess the TMU load increases, which may also increase bandwidth requirements since it could force higher LOD in these cases. You'll only see a performance difference if the AF causes the scene to be TMU/bandwidth limited instead of, say, ALU limited. I'd expect this to happen more as you move up in performance, not down, since the ALU:TEX ratio tends to go up toward the higher end... but APUs can be more bandwidth sensitive, and I think Intel's IGPs never had a lot of TMUs.

    Of course it's also very scene dependent. And maybe an inferior AF implementation could end up sampling more than a better one.
  • Articuno - Tuesday, March 6, 2012 - link

    Except the quality is the same as competing AMD products', if not worse because of driver issues, and you lose 20-30% performance in every scenario versus the last-gen Llano APU. The facts are in this very review.
  • Articuno - Tuesday, March 6, 2012 - link

    Sure sounds like Bulldozer at this point, doesn't it?

    ""It's just a driver issue, AMD/Intel will fix it!"
    "It's just the review units sent out, AMD/Intel will have a BIOS update at the official release that improves performance!"
    "If you overclock it to hell and back, it can almost sort of maybe compete with Intel/AMD!"
    "Oh look, there's a new update out that improves performance! Sure it's only 1% performance, applicable in only certain scenarios, but it's better than nothing!"
  • Articuno - Tuesday, March 6, 2012 - link

    Aside from that NOT being what I said at all... you do realize you justified the reasoning in your post, right? They're bribing Intel. That doesn't mean they did nothing wrong, it's a BRIBE. Besides, Intel is just as guilty as Microsoft of OEM threatening and hand-holding in the 90s.
  • Makaveli - Tuesday, March 6, 2012 - link

    Who the hell is this Sans2212 troll?

    Dude, please do all of us a favour on this site and STFU.

    90% of the people reading this site know more than you.

    Take your bad English and GTFO.

    +1 for ban!
  • Articuno - Tuesday, March 6, 2012 - link

    If you Google his handle you'll find out he's been doing this for a while now (and that he's probably Japanese, which would explain the poor English).

    +2 for ban.
  • m.amitava - Tuesday, March 6, 2012 - link

    I don't think he's serious... reading some of his comments... nobody with a human brain can reason like that...

    If he IS serious, it opens up the possibility of creating an online zoo exhibit out of him... prod him with an AMD logo and he'll roar, shout, roll and snap :)
  • tipoo - Wednesday, March 7, 2012 - link

    Poe's law in full swing. The morons are indistinguishable from the people trying to look like them.
  • silverblue - Wednesday, March 7, 2012 - link

    Awww, I missed it... I usually like reading his rants, especially his obsession with "amd craps". He filled the void SiliconDoc vacated.
  • Jamahl - Tuesday, March 6, 2012 - link

    Wake me up when Intel does something interesting again.
  • MJG79 - Tuesday, March 6, 2012 - link

    2009 - $35B in revenue
    2010 - $44B (1st $40B year)
    2011 - $54B (1st $50B year)
    $20B revenue growth in 2 years

    You wake me up when there is competition again.
  • rpsgc - Wednesday, March 7, 2012 - link

    All that revenue, all that profit, and yet they STILL can't beat AMD in integrated graphics.

    I think that qualifies as a fail.

    Thanks for (kind of) proving his point?
  • dagamer34 - Thursday, March 8, 2012 - link

    They don't really care to. The point of a business is to make money, not have the best products. The latter only gets solved when AMD gets serious in competing with Intel on power/performance again.
  • Operandi - Tuesday, March 6, 2012 - link

    The internet called: "stop wasting my bits".
  • StevoLincolnite - Tuesday, March 6, 2012 - link

    You know what? All you do is bash AMD.
    If you think AMD sucks THAT much and its engineers and everything else are incredibly bad...
    Then I have a challenge.

    Go build your own Processor or GTFO with the bashing.
  • bennyg - Wednesday, March 7, 2012 - link

    Do not feed the troll.
  • StevoLincolnite - Tuesday, March 6, 2012 - link

    Except... Intel's IGP drivers on Windows are bad already. They are a lot worse on the Mac.
    Historically Intel has never supported its IGPs to *any* great length, and it even had to throw up a compatibility list for its IGPs so you know what games they could potentially run.

    Here is a good example:
    http://www.intel.com/support/graphics/intelhdgraph...

    Heck, I recall it taking Intel a good 12 months just to enable TnL and Shader Model 3 on the X3100 chips.

    Historically the support has just not been there.
  • earthrace57 - Tuesday, March 6, 2012 - link

    AMD's CPUs are going to die... sucks to be an AMD fanboy. However, whatever they are doing with their dedicated GPUs, they are doing something right... if they can manage to pull their act together on the driver side, I think AMD would live on as a GPU company...
  • earthrace57 - Tuesday, March 6, 2012 - link

    I'm sorry, but Llano APUs will stay on top for quite a while; Intel's chip is still at heart a CPU, while Llano is part GPU... if AMD can get drivers the quality of Nvidia's, they will most likely do extremely well on that front.
  • zshift - Tuesday, March 6, 2012 - link

    I really enjoyed the added compilation benchmark. This site has the most comprehensive collection of benchmarks that I've seen; it's a one-stop shop for most of my review needs. Keep up the great work!
  • Jamahl - Tuesday, March 6, 2012 - link

    Would be great to see power benchmarks of the IGP, especially vs. Llano and the HD 3000. Let's see if the graphics improvements have come at the price of yet more power consumption, or if Intel has managed to keep that down.
  • Bateluer - Tuesday, March 6, 2012 - link

    Until AMD goes out of business. Then Intel gets lazy again, and the price of even a mid-range CPU creeps back up above 600 dollars. You might be too young to remember the 500-dollar price tags on the first-gen P3s, when Intel had no effective competition from AMD.

    It's not in the consumer's best interests for AMD to die off.

    And, FYI, their GPUs are top-notch and excellent across the entire market. The downside is, they're basically carrying the company right now and that's not sustainable.
  • m.amitava - Tuesday, March 6, 2012 - link

    This sans guy is hilarious!!

    Let's prod him a bit more and really get his fanboi juices flowing :)

    AMD is the best!!!!! yaaay... Intel sucks, they'll go out of business sometime next week :D
  • Azeraph - Thursday, March 8, 2012 - link

    It doesn't really matter if the IGP isn't that great; most people don't buy these for their graphics power. I get the feeling that maybe Intel is just putting them out there to keep its base solid against AMD, not that it needs to, and I'm an AMD fan. I found something the other day that will possibly change how tomorrow's processors work, using light instead of electricity.

    http://scitechdaily.com/penn-researchers-build-a-c...
  • m.amitava - Tuesday, March 6, 2012 - link

    Ain't he cute? :)... I hope he's not a bot... that would break my heart.
  • Galvin - Wednesday, March 7, 2012 - link

    Please
  • mattgmann - Wednesday, March 7, 2012 - link

    It would be cool to see a 4GHz-clocked Nehalem shuffled into the mix. I'm sure I'm not the only one rocking an i7 9xx and wondering how much actual productivity gain is to be had with the new tech. I personally don't like to upgrade until the new gen's retail performance outdoes my previous overclocked performance by a solid 15%.
  • svata - Wednesday, March 7, 2012 - link

    Is the bug with true 23.976 fps playback fixed?
    http://www.anandtech.com/show/4083/the-sandy-bridg...
  • sicofante - Wednesday, March 7, 2012 - link

    I understand that will be part of the new chipsets which haven't been tested here, but I'm also very interested. As a matter of fact, I have a few HTPC customers waiting for Ivy Bridge for this sole reason.
  • vlado08 - Friday, March 9, 2012 - link

    I don't find the silence about 23.976 fps playback very promising. This is the new chipset: "Keep in mind that this is a preview using early drivers and an early Z77 motherboard" ... "Intel Z77 Chipset Based Motherboard"

    I see three possibilities:
    1. They are not going to fix it with Ivy Bridge.
    2. They are not ready with the drivers.
    3. They are ready and everything is fine, but they're keeping silent because they need to sell old chips.

    There isn't much time left. We'll see.
  • Assimilator87 - Wednesday, March 7, 2012 - link

    Intel's always the best, EXCEPT WHEN THEY'RE NOT! Athlon 64. Since AMD's sticking with Bulldozer's base architecture for at least a couple generations, they won't be competitive for a while, but that doesn't mean they'll never be competitive.
  • silverblue - Wednesday, March 7, 2012 - link

    At the very least, AMD need a less power hungry successor to Bulldozer. From the Xeon review, it's mentioned that they should be in a position to do this, and could at least clock the thing a lot higher and still use less power than Bulldozer. Regardless, that IPC deficit is a killer - the following page is so telling of the architecture's current limitations:

    http://www.anandtech.com/show/4955/the-bulldozer-r...
  • abianand2 - Wednesday, March 7, 2012 - link

    1. General curiosity: You stated you did not get sanction or support from Intel for this preview. I believed that sort of thing isn't allowed before the release date. How do exceptions like this work?

    2. Specific: I observed most of the discrete GPU tests were at 1680x1050. Any reason for this? I guess it's because this is just a preview. Am I right? Any other reason?

    Thanks
  • Kjella - Wednesday, March 7, 2012 - link

    1. If you want to officially review the chip, you sign an NDA and Intel provides you with it. Here he got access to it from a partner, who probably broke their agreements but Anand never signed any agreement so he can publish whatever he wants.

    2. I would think so, and in GPU-bound scenarios I wouldn't expect much change at all.
  • InsaneScientist - Wednesday, March 7, 2012 - link

    1) Generally what happens with previews and first looks is that the company producing a product (Intel) will send out press samples to reviewers if the reviewers will sign a Non Disclosure Agreement (NDA). When the NDA expires (generally the same time for everyone), the reviewers can post their findings to the public.

    This is done (I assume) to give reviewers enough time to thoroughly review a product without having (theoretically) to worry about having information leak until the company wants it to get out.

    If, on the other hand, a reviewer acquires a product via other means so there is no NDA that they have to sign in order to get the product... well, they're not under NDA, so they're free to disclose whatever they want.
  • sld - Wednesday, March 7, 2012 - link

    What a troll.

    Start crying when Intel is able to jack up prices by 2x-3x once AMD is gone.

    You don't even realise that his fear of a tock-minus means that since Ivy Bridge has more features, presumably pulled forward from Haswell, Haswell itself will bring fewer features to the table.
  • Hrel - Wednesday, March 7, 2012 - link

    WHY!!!!? Does Intel HAVE to disable Hyper-Threading on the sub-300-dollar CPUs? It's not like having it ENABLED costs them anything more at all. It would just be providing their customers with a better product. This shit is infuriating. It's there on the chip no matter what; HT should just be on every single Intel chip, no matter what. That shit pisses me off SOOOO much.
  • Exodite - Wednesday, March 7, 2012 - link

    I would imagine there are going to be a fair few sub-300 USD dual-cores with HT down the line, though I suppose you meant specifically the quads?

    The reason seems obvious enough to me: if you need the extra performance in applications that stand to gain from HT, you'll have to pay for it.

    Frankly I don't see the added cost as anything major, considering the gains.

    It's just differentiation really.

    Sure, we'd all want more stuff cheaper (or for free!) but lacking HT doesn't in any way cripple a chip.
  • sicofante - Wednesday, March 7, 2012 - link

    It's called market segmentation.
  • Hector2 - Wednesday, March 7, 2012 - link

    Chill out. You'll pop a blood vessel. With all that's happening in the world, THAT's what's pissing you off? LMAO
  • sld - Wednesday, March 7, 2012 - link

    Intel's products get cheaper with smaller dies and with competition. Without competition, their dies cost the same to make, but they rob and loot your pockets and make obscene profits off you because your hated AMD no longer exists as an alternative supplier of good chips.
  • BoFox - Wednesday, March 7, 2012 - link

    There are three different versions of HD 5570 (DDR2, DDR3, and GDDR5 - with the GDDR5 having FIVE times as much bandwidth as the DDR2 version).

    There are also two different versions of HD 5450 (DDR2 and DDR3).

    It would be appreciated if you could let us know which versions were used in the benchmarks in this article. Thanks
  • BoFox - Wednesday, March 7, 2012 - link

    Just let us know which GPU was used for the discrete GPU tests! LOL..
  • KZ0 - Wednesday, March 7, 2012 - link

    "ATI Radeon HD 5870 (Windows 7)", page 4
    :)
  • WiZARD7 - Wednesday, March 7, 2012 - link

    There should be a comparison at the same clock speed - Nehalem vs. Sandy Bridge vs. Ivy Bridge (@4GHz).
  • Breach1337 - Wednesday, March 7, 2012 - link

    On one page, shouldn't:

    " Doing so won't give you access to some of the newer 7-series chipset features like PCIe Gen 3 (some 6-series boards are claiming 3.0 support), native USB 3.0 (many 6-series boards have 3rd party USB 3.0 controllers) and Intel's Rapid Start Technology."

    say

    " Doing so will give you access to some of the newer 7-series chipset features like PCIe Gen 3 (some 6-series boards are claiming 3.0 support), native USB 3.0 (many 6-series boards have 3rd party USB 3.0 controllers) and Intel's Rapid Start Technology."
  • iwod - Wednesday, March 7, 2012 - link

    There are many things not mentioned.

    Intel is strong in software everywhere except the graphics driver department. No wonder others call Anand a pro-Intel site; I don't want to believe it, but the articles continue to say Intel is hard at work on graphics drivers when they are clearly not. The drivers are better than they used to be, but still far from good.

    Graphics quality on Intel IGPs is not even close to what AMD offers.

    Even if Haswell doubles the performance of Ivy, they will still be one generation behind AMD.

    I continue to wonder why they use their own GPU on the desktop/laptop and not on the mobile SoC. They could have used PowerVR on the desktop as well; developing drivers for one piece of hardware would simplify things and hopefully give a bigger incentive to increase software R&D.
  • meloz - Wednesday, March 7, 2012 - link

    >>No wonder others call Anand a pro-Intel site
    What should he do, fake the benchmark results to make AMD look better than they are? Anand can only report his findings, and he does this truthfully. Some people do not want to accept reality and prefer to shoot the messenger. Direct your frustrations towards AMD, not websites which report benchmark results.

    From past benchmarks you can see the results at Anandtech are that different from other websites, AMD is getting destroyed on CPU performance and the performance/watt metric.

    >>I continue to wonder why they use their own GPU on the desktop/laptop and not on the mobile SoC. They could have used PowerVR on the desktop as well,

    FYI, they are dumping PowerVR in the near future as well. Already covered on many websites; google it. PowerVR was a temporary fix, or rather an attempt at a fix, which was more of a hassle and didn't work in the marketplace anyway.

    They are now committed to improving their own iGPU and drivers. This will take time for sure; Intel marches to its own beat.

    The simple fact is that with the much weaker Sandy Bridge iGPU they outsold AMD 15 to 1, so even though the Ivy Bridge iGPU has not surpassed AMD yet, Intel should continue to do really well.

    >>I don't want to believe it, but the articles continue to say Intel is hard at work on graphics drivers when they are clearly not.

    You can believe whatever you want to believe, but this is not about beliefs; it's about facts. As a user of Sandy Bridge and Linux, I know better than most just how much Intel drivers suck. In fact, their Linux iGPU drivers suck much worse than the Windows version (hard to imagine, but true) and weren't truly ready until Mesa 8.0, more than a year after release of the hardware.

    But I also know they are working on things like SNA, which in early tests already offers a ~20% performance boost.

    No word on when it will be consumer-ready, but Intel is working on and steadily improving the driver side as well, perhaps not at the pace you want. You do not have to accept reality if it is so difficult for you; don't blame websites for reporting it, however.

    I am almost grateful Intel is not 'good enough' on the GPU side as yet. It keeps AMD alive another year. Hopefully.
  • meloz - Wednesday, March 7, 2012 - link

    >>From past benchmarks you can see the results at Anandtech are that different from other websites

    Should read: From past benchmarks you can see the results at Anandtech are NOT that different from other websites.

    Sigh, allow us to edit posts, if only for 10 minutes or so after making the initial post.
  • ET - Wednesday, March 7, 2012 - link

    PowerVR has lower performance and fewer features, so it would not be a good PC solution. I'm also sure that Intel would rather have its own solution; it's just that it can't yet compete with PowerVR in the low-power arena. I imagine that if Intel succeeds in the mobile space it will try to create its own low-power 3D core.

    As for graphics drivers, I'm sure Intel is hard at work on them, but probably has fewer people than AMD on that. As far as I can see, it's no longer the case that reviews with Intel graphics keep talking about what didn't run correctly, which means that things are getting better.
  • Belard - Wednesday, March 7, 2012 - link

    Anyone notice that in the Compile Chromium test, in which CORE count actually matters...

    AMD's "8-core" FX-8150 doesn't come close to the 3770K, much less the 2500K (4-core/4-thread) CPU.

    But give it to AMD for Llano easily outperforming Intel in built-in graphics; handy for notebooks. AMD should have put GPUs into the FX line.

    The odd thing about Intel's HD Graphics is that the lower end really needs the HD 4000 more than the higher end does.
  • fic2 - Wednesday, March 7, 2012 - link

    I totally agree. Intel is again going to hobble the lower end with HD 2500 graphics, so that people who don't need the i7 CPU have to buy a discrete video card. I really wish review sites would hammer Intel for this and pressure them to include the better integrated graphics. It's not like the HD 4000 is so good that people will buy an i7 just for the graphics.
  • Jamahl - Thursday, March 8, 2012 - link

    The HD 4000 takes up more die space, which means it costs them more. That's all Intel cares about; they don't give a shit about what people need at the lower end.

    They were forced to start using HD 3000 graphics in all their lower-end chips because of Llano. The 2105 basically replaced the 2100 at the same money so they would be less embarrassed by Llano. That's what competition does.
  • Death666Angel - Wednesday, March 7, 2012 - link

    I like this tick. The CPU performance goes up by as much as I expected and the iGPU side goes up significantly.

    If I had the spare change to throw around, I'd upgrade from my 3.8GHz i7 860. But as it is now, an upgraded CPU wouldn't do much for me in terms of gaming performance and I rarely do CPU intensive tasks these days. The chipset and native USB 3.0 are nice, but I'll wait for Haswell next year and get a good GPU or two instead.
  • tiro_uspsss - Wednesday, March 7, 2012 - link

    I'm a little confused :/

    the 3770K consistently beat the 3820 (by a very small margin)

    *wait*

    oh.. I found out why.. the specs of the 3820 as listed in the 'line up' are incorrect - the 3820 'only' turbos to 3.8 not 3.9.. is this why the 3770K did a little better?

    aside from the small extra turbo that the 3770K has, the 3820 has more L3, more memory channels & a higher core clock (that's if the core clock listed for the 3770K is correct)

    soooo.. the extra turbo.. is that why the 3770K is slightly better all-round?
  • Death666Angel - Wednesday, March 7, 2012 - link

    You know that they are different CPU generations, right? One is SNB-E on a 32nm process node and the other is IVB on a 22nm node. The review said that IVB has a 5-15% higher IPC.
  • tiro_uspsss - Wednesday, March 7, 2012 - link

    *slaps own forehead* DUH! thats right! I forgot! :D I knew I was forgetting something! :P :D thanks! makes sense now! :)
  • BSMonitor - Wednesday, March 7, 2012 - link

    The numbering scheme is misleading.

    3820 and up are SNB-E.

    3770K is Ivy Bridge.

    An IVB core will perform better than a SNB core clocked at the same speed.

    New architecture wins over cache, memory channels, clock speed.
  • Shadowmaster625 - Wednesday, March 7, 2012 - link

    "Generational performance improvements on the CPU side generally fall in the 20 - 40% range. As you've just seen, Ivy Bridge offers a 7 - 15% increase in CPU performance over Sandy Bridge - making it a bonafide tick from a CPU perspective. The 20 - 40% increase on the graphics side is what blurs the line between a conventional tick and what we have with Ivy Bridge."

    "Being able to play brand new titles at reasonable frame rates as realistic resolutions is a bar that Intel has safely met."
  • hansmuff - Wednesday, March 7, 2012 - link

    The review is good, I really like that you added the compilation benchmark for chromium -- good job!

    I'm a little disappointed in the lack of overclocking information. What is the point of reviewing the K edition of this chip without even doing a simple overclock with a comparison to 2600K in terms of power draw and heat?
  • Silenus - Wednesday, March 7, 2012 - link

    That is because this is NOT a review...it's just a preview. I'm sure they will do some overclocking testing in the full review later. Those results would be more meaningful then anyway as this is still early hardware/drivers.
  • Zoomer - Wednesday, March 7, 2012 - link

    It would have been interesting to see. Personally, I don't care for IGP, as they sit disabled anyway. Right now, it seems like it's a 7% clock for clock perf increase, which is very poor for one process node. Knowing where the clocks can be will let everyone know exactly how much faster the CPU can be over SB.
  • NeBlackCat - Wednesday, March 7, 2012 - link

    For me, the most interesting things about IVB are improved multi-monitor support, and power savings not just at stock, but also undervolted (stock clock) and overclocked.

    Because I want to know if I'm finally going to get that laptop or mini-itx system that can drive several monitors while remaining cool and sipping power, even under load.

    Not covered at all. Shame.
  • beck2050 - Wednesday, March 7, 2012 - link

    Intel marches on. Their domination of 80+% of all CPU markets will continue.
  • silverblue - Wednesday, March 7, 2012 - link

    PC and especially server market, sure, but not smartphone/tablet. Not yet, anyway.
  • fvbounty - Wednesday, March 7, 2012 - link

    Should have had a SB 2700K to run clock for clock against the 3770K and see if there's much difference!
  • ellarpc - Wednesday, March 7, 2012 - link

    Agreed! I was just about to post that same comment. It doesn't make much sense to compare it to a lower-clocked SB product - well, unless you wanted to make IB look better. Now I'm going to sift through Anand's past reviews to see what kind of gains the 2700K has over the 2600K.
  • ellarpc - Wednesday, March 7, 2012 - link

    Doesn't look like Anand has a 2700K for testing.
  • ueharaf - Wednesday, March 7, 2012 - link

    I was thinking that the difference in GPU performance between the HD 3000 and HD 4000, about a 20% to 40% increase, will carry over to the Ivy Bridge mobile chips!!! I hope so!!!
  • lilmoe - Wednesday, March 7, 2012 - link

    Great review. You guys know your stuff. I've been waiting for a review like this since IvyBridge was announced.

    However, I'll still "cling to my Core 2" since it does the job now, and I'll postpone my upgrade till next year. You make it seem like Haswell is a good reason to wait. I bought the system in early 2010, and I usually upgrade every 2-4 years. 3 years sounds just right. I'll be investing in SSDs since you talked me into it though, it seems a better upgrade at the moment.
  • Breach1337 - Wednesday, March 7, 2012 - link

    Did Intel specifically ask not to include overclocking tests in ES previews?
  • mrSmigs - Wednesday, March 7, 2012 - link

    The Ivy Bridge 3770K is a direct replacement for the Sandy Bridge 2700K, which is only a small upgrade from the 2600K, yet the 2700K is still missing from the benchmarks that would allow a direct architectural comparison.

    Intel badly needs PowerVR in its graphics core... will they finally use a multicore Rogue Series 6 core in the next generation (Haswell???) for some decent performance in their IGP???? PowerVR developed easily the fastest graphics core in the ARM SoC tablets/phones inside the iPad 2/iPhone 4S; now it's time to save Intel (one of ImgTec's biggest shareholders, along with Apple). Intel needs to ditch this old, weak IGP architecture and get with the times...

    The AMD Llano, even with its terribly weak CPU core, still clearly outpaces this new improved Intel HD 4000 core in these non-GPU-limited tests. If AMD had a faster CPU they would be even further ahead in graphics capability, which appears CPU-limited in many cases too (see the discrete GPU tables to get an idea of Intel's CPU advantages).

    Where are the in-game checks on Intel's notoriously poor image quality, much like when Radeons are compared to GeForces, to ensure these are even producing an acceptable image for the performance they give and not cutting corners???

    I'm happy with the lower power and the CPU performance gains of Ivy Bridge. Disappointed in the weak old graphics once again, which fail to match Llano even with a far stronger CPU dragging them along...
  • hasseb64 - Wednesday, March 7, 2012 - link

    How about OpenGL support?
  • numberoneoppa - Wednesday, March 7, 2012 - link

    Perhaps because not everybody who needs a lot of CPU power also needs to game or do other GPU heavy activities.

    Come on, mate. Think.
  • Conficio - Wednesday, March 7, 2012 - link

    "You guys asked for it and finally I have something I feel is a good software build test."

    I just wanted to say thank you for this. Maybe we can add a Maven-based Java test as well, which should give some idea of javac performance (or a large Eclipse build-all).
  • Conficio - Wednesday, March 7, 2012 - link

    Uhh, this comment renders funny on Chrome.
  • piesquared - Wednesday, March 7, 2012 - link

    Is this some kind of joke? It may be comical, but it sure ain't funny. Intel themselves had slides circulating showing at least a 2x performance increase over the last generation. Now they show up with not even half that, and Anand falls to his knees in praise. Seems a little fishy to me; where have I seen this before... Right, the primary elections in the US! Same shit: the elite give the mainstream media their marching orders, and the mainstream media sets out to brainwash the mass population with that message. And you continue to lead the charge on downplaying image quality and functionality, ever since you became Intel's mouthpiece. Where are the days of proper image quality comparisons and feature benefits to consumers? That's all dropped off the radar because Intel has abysmal and atrocious graphics capability and know-how. They're the WORST in the industry, and yet here we have good ol' Anand patting his buddy on the bum, ensuring that Intel will never have a need to actually compete. They can just hand off money to the pieces of shit in the world and have them manipulate the perception.
  • Hector2 - Wednesday, March 7, 2012 - link

    Sounds like you have some issues. Maybe you should see a therapist
  • awg0681 - Wednesday, March 7, 2012 - link

    Maybe I misread the article or read a different one. It came across to me that Anand was mainly comparing the HD 4000 to the HD 3000, in which case there is generally a notable increase in performance. It's not 2x the HD 3000, but a quick search trying to find these slides you mention showing such an increase came up with nothing. I only found one on Tom's, which was a leaked slide comparing the HD 2000 to the HD 4000. If you could link some of those, that would be great. Also, in just about every case where the HD 4000 was (almost inevitably) beaten by AMD in graphics performance, it was pointed out.
  • geddarkstorm - Wednesday, March 7, 2012 - link

    I wonder how much of the improvement in the performance-to-power ratio is due to the tri-gate technology. In some ways I was expecting a bigger jump, around 20%, but since they also dropped the power by 30W, that says a lot. Looking at this from the perf/power perspective makes it a bigger deal than it sounds from a 5-15% CPU gain.

    Still... for some reason I feel a little disappointed. I thought tri-gate would change things even more in conjunction with the 22nm process.

    So I can't wait to see what Haswell will do.
  • Exodite - Wednesday, March 7, 2012 - link

    Does it matter though?

    After all that argument cuts both ways.

    Any iGPU today is good enough for 2D use, browsing and mainstream gaming - which means stuff like The Sims 3 rather than Crysis.

    The same is true for CPU power.

    Heck, most users would be perfectly happy with using their smartphones as a desktop.
  • krumme - Wednesday, March 7, 2012 - link

    Well, the dilemma for Anand is apparent. If he stops writing these previews that are nice to Intel, someone else will get the opportunity and all the info. He can write two bad previews and the info and early chips just stop coming. Intel and Anand have a business to run, and there is a reason Intel gives Anand the chips (indirectly).

    He has a "deal" with Intel, the same way we have a deal with Anand when we read the review. We get the info - bent/biased - and then we can think for ourselves. I think it's a fair deal :) - we get a lot of good info from this preview. The uninformed get taken advantage of, but it's always like that. Someone has to pay for the show.
  • chemist1 - Wednesday, March 7, 2012 - link

    The Macbook Pro, for instance, has a discrete GPU, yet can switch to the chip-based GPU to save power when on battery. So having a better chip-based GPU makes sense in this context.
  • Sabresiberian - Wednesday, March 7, 2012 - link

    I'd like to see the discrete graphics card industry make the kind of progress, relatively speaking, Intel has made in the last 2 years.

    Ivy Bridge is a ways from competing with a high-end discrete solution, but if the relative rates of progress don't change, Intel will catch up soon.
  • sixtyfivedays - Wednesday, March 7, 2012 - link

    I use the iGPU on my build for my second monitor and it is quite nice.

    I can watch HD videos on it and it doesn't take away from my dedicated GPU at all.
  • mlkmade - Thursday, March 8, 2012 - link

    Is that even possible? Special hack or software?

    When you install a discrete graphics card, the integrated GPU gets disabled.

    Would love to know how you accomplished this... Is it a desktop or laptop?
  • mathew7 - Thursday, March 8, 2012 - link

    "When you install a discrete graphics card, the integrated gpu gets disabled."

    It was exclusive in northbridge-IGP units (Core2Duo/Quad and older). With Core-i, it's by default disabled but can be enabled through BIOS (of course if you don't have a P5x/6x chipset).
  • AnnonymousCoward - Wednesday, March 7, 2012 - link

    1. How much faster is Ivy Bridge at single thread versus my Conroe@3GHz?
    2. How much faster is my GTX560Ti than HD4000?
  • dr/owned - Thursday, March 8, 2012 - link

    1) Your 65nm CPU would get the shit blown out of it by IB at the same clock speed in single-threaded applications. Assuming 15% improvements in each of the tick-tocks since Conroe, a 1.8GHz IB would probably be about the same as a 3GHz Conroe (rough math below).
    2) Discrete graphics vs. integrated graphics. Intel isn't trying to compete here, so it's a stupid comparison.
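
    (As a rough sanity check on the compounding in point 1 - and the step count is itself a loose assumption - counting Conroe to Ivy Bridge as anywhere from ~2.5 tick-tock pairs to 5 individual generations gives:

        1.15^2.5 ≈ 1.4    and    1.15^5 ≈ 2.0

    so a 3GHz Conroe would land somewhere between a 1.5GHz and a 2.1GHz IB, which brackets the 1.8GHz figure.)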
  • AnnonymousCoward - Friday, March 9, 2012 - link

    1. Your "get the shit blown out" is worthless. All I'm looking for is a number, and your effective answer is +67%.

    2. It's not a stupid comparison, because:
    a) I'm interested.
    b) HD4000 is designed for games.
    c) They benchmarked with modern games.
    d) Games are designed around people's performance.
  • AnnonymousCoward - Friday, March 9, 2012 - link

    1. Another website shows the i7 3770K scored 2643 on the Fritz Chess Benchmark with 1 processor. My machine does 2093. That's only 26% different.

    2. I very roughly estimate the GTX560Ti might be 5-6x faster than the HD4000.

    It'd be useful to see a real comparison of these though.
  • The0ne - Wednesday, March 7, 2012 - link

    "There's not enough of an improvement to make existing SNB owners want to upgrade, but if you're still clinging to an old Core 2 (or earlier) system, Ivy will be a great step forward."

    Basically all the laptops in the last few years for business have been bought with C2D. I think with Ivy, it's a great time to upgrade them all and see a good improvement. Same for family members too. I can't wait to try them out! Thanks for the review Anand.
  • benjaminbldp - Thursday, March 8, 2012 - link

    Maybe Intel should drop the graphics altogether; I don't like it. Let the pros take care of it. It's just too much.
  • dr/owned - Thursday, March 8, 2012 - link

    This article blows because there are no overclocking results. We're not looking for a fine-tuned overclock; just give us the rough and dirty! My money is on 5GHz with minimal effort using an air cooler.
  • dagamer34 - Thursday, March 8, 2012 - link

    It's a preview, not a review.
  • dr/owned - Thursday, March 8, 2012 - link

    It seems like Intel has the tick-tocks backwards. The i7-920 is arguably the greatest cpu to come out in recent years and it was "just" a tick.
  • Wardrop - Thursday, March 8, 2012 - link

    I'm afraid you have it backwards.

    The i7-920 is a Nehalem processor (it's a 45nm chip). It's a tock. Why is this concept so hard to grasp?
  • bigboxes - Thursday, March 8, 2012 - link

    Look at the chart. Nehalem (i920) was a tock.
  • just4U - Thursday, March 8, 2012 - link

    I have to disagree with you on the i7-920 being such a huge leap. As someone who goes through virtually every CPU lineup from AMD/Intel, I'd have to say the C2D (or Quad) 6x series was the biggest leap forward in the past decade. Before that it was the A64 and X2 variants (although we didn't get a lot of use out of those secondary cores).
  • IntelUser2000 - Thursday, March 8, 2012 - link

    LOL, this must be the most hilarious argument I've heard in a while.

    How do you attribute a 30+% graphics gain as being ALL CPU? Don't be ridiculous, and that's an understatement.
  • Silma - Thursday, March 8, 2012 - link

    Is low-res testing really relevant for graphics?

    Most players play at 1920x1080 or higher.
    1366x768 or 1680x1050 does not seem relevant to me at all for most people, especially those purchasing a computer with this processor.
  • dagamer34 - Thursday, March 8, 2012 - link

    Most players who game at 1920x1080 also have graphics cards that cost more than $100. That's not what this was testing.
  • kensiko - Thursday, March 8, 2012 - link

    Man you won't believe the difference :)

    Get an SSD with that.
  • dagamer34 - Thursday, March 8, 2012 - link

    You're going to seriously start wondering why you didn't upgrade sooner. Just don't hurt yourself too much when you slap your own face. Tech has advanced astronomically in the last 10 years.

    Heck, I'm pretty sure the iPad 2 is faster than your Northwood Pentium 4.
  • Yojimbo - Thursday, March 8, 2012 - link

    Uhh... I don't think that's true. Graphics-intensive applications are not the only ones that benefit from fast CPUs.
  • krumme - Thursday, March 8, 2012 - link

    Where does Charlie claim to be biased?

    But I agree, this sacred aura of "this is not sanctioned by Intel" is a pain to read. It makes these articles a little bit difficult to start reading :)

    But how profitable, and how good a business, do you have if you don't have "good connections"? Charlie uses his for underhanded information; Anand uses his to get info before the others. It's easy for us to interpret Anand's articles because we know the obvious: it has to be profitable for both Anand and Intel. But what about Charlie? What are the motives of the people leaking info to him? It's not quite so obvious and transparent.
  • awg0681 - Thursday, March 8, 2012 - link

    "Sure, he was comparing Intel graphics to Intel graphics, except he wasn't, because he himself threw Llano in there to compare."

    By the same token, if he had not included Llano results, people would be wondering where they were and complaining that they weren't included. That puts Anand in a catch-22 when deciding whether or not to include Llano.

    There is validity to the complaint about the numbers being incorrect; those should be looked at and corrected. As for glossing over the results and not mentioning that Llano is more capable: again, this was mainly to compare Intel vs. Intel in a preview of their new chip and the improvements they've made since last gen. Sure, he could have been more thorough on the AMD vs. Intel side, but that's not really what this article was about. We could also go to a steakhouse and complain there's not a large vegetarian meal selection.
  • arno - Friday, March 9, 2012 - link

    ... Hi everybody.

    I'm an electrical engineer doing intensive SPICE simulations.
    As this requires a lot of floating-point calculations, I want to know whether it's worth waiting for Ivy Bridge instead of buying a laptop with a quad-core Sandy Bridge right now. I expected Ivy Bridge in March and I've been waiting since last December :(.
    To buy now would be very comfortable, as I'm in the simulation phase of my project. To buy later, I believe, would make more sense in terms of pure performance. But how much more sense is the question...

    Thanks for sharing

    PS: Another consideration is the use of 1600 memory instead of 1333, which might make a difference for another piece of software I use.
  • arno - Friday, March 9, 2012 - link

    I wonder how Ivy Bridge performs in terms of floating-point calculations, as I do intensive electrical simulations.
    I urgently need an upgrade and would definitely go for Ivy Bridge. But I've been waiting a long time now, and Ivy Bridge may be delayed yet again.
    Does anyone have any advice?

    Thanks for sharing.
  • Nomorehero - Friday, March 9, 2012 - link

    How about OC? Info, please?
    It's hard to decide whether to wait for IB or get SB now without knowing how well IB can OC.
  • arno - Friday, March 9, 2012 - link

    No, overclocking is just out of the question for me. I want to buy a professional laptop (Lenovo W520), so there's no way to tweak it.
    The fact is the memory will be 1600 MHz and the processor a bit stronger, with maybe a better memory controller.
    One month from release, it's worth waiting for.
    I just want to make sure that in my particular case it's really worth it, because I'm tired of my heavy old laptop. I'm buying this machine just for work, after all. At home, my E8400 is still up to date for what I do with it.
  • DDR4 - Friday, March 9, 2012 - link

    I want to see some increase in performance and actual processing power. For now, I can leave the graphics to a discrete GPU.
  • Nexing - Friday, March 9, 2012 - link

    @Arno
    I'd consider a few aspects:
    -Do you need to use precision external gear, like we audio people do with soundcards, and hence need ExpressCard or Thunderbolt connectors? Then I'd expect the May-June launches to bring those professional laptops and Ultrabooks.
    -If portability is important: real-world Sandy Bridge battery life is near 4 hours, while Ivy Bridge may extend real usage to around eight hours for similar performance.
    -Furthermore, USB 3.0 will be native, which matters since most Renesas boards have been far from perfect, and only their recent (Feb/March 2012) releases seem to have finally nailed efficiency. Problems with USB 3.0-equipped Sandy Bridge laptops abound in forums, and that is on professional brands.
    -If you were asking about Sandy Bridge vs. Ivy Bridge desktops, you could buy the former now and upgrade to the latter CPU later. But on the mobile platform, Intel has stated that upgrading their current chipset platform (HM67, also named Cougar Point) is not going to be feasible, despite it being easily possible technically.
    Therefore, there are many reasons pointing toward waiting. Since sales are very low, any are choosing this route.
  • arno - Saturday, March 10, 2012 - link

    Thanks, Nexing, for your answer. Actually, I totally agree with you on:
    Portability => IB is a shrink and should be more power efficient for an equivalent task load; the tests seem to prove it. Moreover, I will work a lot on trains and outdoors (visiting customers), so it is definitely a plus.
    USB 3 => your feedback is very interesting. I also think that native USB 3 should be better than an add-on controller, and that was another reason to wait back in December, when I was already thinking of buying something new. Now I'm quite sure that waiting was the right thing to do.

    For the rest, more than external gear, I need a processor that is good at floating-point calculations. I do intensive electrical simulations, so I definitely need it.

    I've made my decision and I will wait. This laptop will replace both my desktop and my laptop for work (and work only, because for internet and the usual office tasks I definitely think a Core 2 Duo can handle it), so better to catch the best. I will manage my present emergency, praying for Lenovo (or Samsung?) to offer new Ivy Bridge laptops as soon as possible. Let's make a bet: Lenovo has one ready to release and is just waiting for the official launch date...

    Thanks for sharing ;)
  • Nexing - Friday, March 9, 2012 - link

    Should say:
    "many are taking this waiting route"
  • arno - Saturday, March 10, 2012 - link

    "FP/integer divider delivers 2x throughput compared to Sandy Bridge"

    I should read more carefully. That is an answer to my question. Maybe not a spectacular improvement, but still one.
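
    To make concrete why divider throughput matters to me: a circuit simulator's inner loops are full of floating-point divides. Here is a toy C sketch of my own (not from any real simulator, just the flavour of the hot loop):

        /* Jacobi-style node-voltage update, the kind of loop a
           SPICE-like solver spends its time in. The per-node FP
           divide is why doubled divider throughput helps. */
        #include <stddef.h>

        void update_nodes(double *v, const double *g_diag,
                          const double *i_net, size_t n) {
            for (size_t k = 0; k < n; ++k) {
                v[k] = i_net[k] / g_diag[k];  /* V = I / G, one divide per node */
            }
        }

    With millions of node updates per transient step, a faster divider should show up directly in run time.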
  • DrWattsOn - Tuesday, March 13, 2012 - link

    @arno I'm GLAD you didn't read more carefully, because you posted the question, and Nexing's answer focused me on something I still wasn't considering as a major factor in my decision: USB3. Between your question and the response, I also got a better picture of how specific use is affected by the tech. So, I'm a waiter (tho I don't serve food 8^D ).
  • stephenbrooks - Saturday, March 10, 2012 - link

    Intel released in 2006, 2007, 2008, 2010, 2011, 2012.

    In base 9 they're on schedule.
  • bhima - Saturday, March 10, 2012 - link

    Basically no 2D graphic designer or web designer needs a discrete GPU for their work. The IGPs handle that workload fine, mainly because most of the processing for Photoshop, InDesign, Illustrator, or Dreamweaver is CPU-based. A discrete GPU gives you better performance in the very limited 3D features Photoshop offers, which are situational at best for the vast majority of graphic designers.

    3D artists, and those who pile a ton of effects into video editing, would benefit from discrete graphics.
  • shadow king - Monday, March 12, 2012 - link

    ^ =)
  • taltamir - Monday, March 12, 2012 - link

    Rarson is correct.
    He isn't suggesting no IGP at all. He is saying put a good IGP on the lower end.

    While there ARE people who need a powerful CPU and will not get a video card because they don't play games, those people do not in any way benefit from a higher-end IGP.

    High-end gamers = discrete GPU + powerful CPU
    Budget gamers = IGP + mid-to-low-range CPU
    Non-gamers with money = high-end CPU + IGP (underused)
    Non-gamers on a budget = mid-to-low-range CPU + IGP (underused)

    The only people who need a more powerful IGP are the budget gamers, and thus it makes sense for the lower-end CPUs to have the more powerful IGP.
  • Urillusion17 - Monday, March 12, 2012 - link

    Great article, but... where are the temps??? The few benches I have seen either don't mention overclocking or, if they do, don't mention temps. I am hearing this chip can boil water! I would think that would be as important as anything else...
  • DrWattsOn - Tuesday, March 13, 2012 - link

    +1 (very much in agreement)
  • boogerlad - Wednesday, March 14, 2012 - link

    Is it possible to fully load the IGP with an OpenCL application without affecting CPU performance at all? From what I've read, the IGP shares the last-level cache with the CPU, so will that hurt performance?
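
    For reference, here's the kind of minimal host-side sketch I mean, in plain C against the standard OpenCL API (assuming Intel's OpenCL runtime is installed and exposes the HD 4000 as a GPU device):

        /* Enumerate OpenCL platforms and print any GPU compute device;
           on Ivy Bridge with Intel's runtime, the HD 4000 should appear. */
        #include <stdio.h>
        #include <CL/cl.h>

        int main(void) {
            cl_platform_id platforms[8];
            cl_uint np = 0;
            clGetPlatformIDs(8, platforms, &np);

            for (cl_uint i = 0; i < np; ++i) {
                cl_device_id dev;
                cl_uint nd = 0;
                if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU,
                                   1, &dev, &nd) != CL_SUCCESS || nd == 0)
                    continue;
                char name[256];
                clGetDeviceInfo(dev, CL_DEVICE_NAME, sizeof(name), name, NULL);
                printf("GPU compute device: %s\n", name);
            }
            return 0;
        }

    Even if every kernel is dispatched to the IGP that way, I'd guess the shared last-level cache and memory bandwidth would still cost the CPU something, which is really what I'm asking.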
  • rocker123 - Monday, March 19, 2012 - link

    "Generational performance improvements on the CPU side generally fall in the 20 - 40% range. As you've just seen, Ivy Bridge offers a 7 - 15% increase in CPU performance over Sandy Bridge - making it a bonafide tick from a CPU perspective."

    Should be: "Generational performance improvements on the GPU side generally fall in the 20 - 40% range."
  • tipoo - Monday, March 19, 2012 - link

    They give the drivers their own tweaks and bug fixes, but I doubt they could do something like add T&L without the manufacturer's support. In fact, they didn't, unless they have bigger driver teams now.
  • ClagMaster - Wednesday, March 21, 2012 - link

    "Personally, I want more and I suspect that Haswell will deliver much of that. It is worth pointing out that Intel is progressing at a faster rate than the discrete GPU industry at this point. Admittedly the gap is downright huge, but from what I've heard even the significant gains we're seeing here with Ivy will pale in comparison to what Haswell provides."

    Personally, I believe on-board graphics will never be on par with a dedicated graphics part, and it is obsessive-compulsive and ridiculous to compare the performance of the HD 4000 with discrete graphics and complain that it's not as good.

    The HD 4000 is meant to provide graphics for business and multimedia computers, and for that purpose it is outstanding.

    If you want gaming or engineering-workstation performance, get a discrete graphics card. And stop agonizing over how far onboard graphics trails discrete graphics.
  • pottermd - Thursday, March 22, 2012 - link

    "Today's desktop processors are more than fast enough to do professional level 3D rendering at home."

    The article contained this statement. It's not really true. I've had a long nap and the render I'm doing is still running. :)
  • Dracusis - Friday, April 6, 2012 - link

    "The people who need integrated graphics"

    No one *needs* integrated graphics, but not everyone needs discrete graphics either. The higher an IGP's performance, the fewer people overall will *need* a discrete GPU.

    Not all games need dedicated graphics cards, just the multi-million-dollar rehashed CODs that choke retail stores. There are literally thousands of other games that only require a small amount of graphics processing power. Flash now has 3D-accelerated content, and almost every developer using it will target IGP performance levels. Almost all casual game developers target IGPs as well; they're not selling to COD players. Sure, most of those games won't need a high-end CPU either, but people don't buy computers just to play casual games. They buy them for a massive range of tasks, the vast majority of which will be CPU-bound, so faster is better.

    Also, as an indie game developer I hit performance walls with CPUs more often than with GPUs. You can always scale back geometry/triangle counts and trim or cut certain visual effects, but cutting back on CPU-related overheads generally means cutting out gameplay.
  • Valitri - Saturday, April 14, 2012 - link

    "there's also the question of which one (CPU or GPU) approaches "good enough" first."

    I was worried that my A6-3420M laptop would feel sluggish in Windows and general tasks, especially compared to my 2500K desktop system. However, I've been pleasantly surprised; it works just fine in Windows.

    I was also very impressed that the iGPU lets me play most newer games comfortably. I was able to OC the A6-3420M in my Samsung Series 3 to 2.0 GHz. It runs Crysis 2 on low at 1366x768 in the 25-30 fps range. To me that is not really playable, but I was surprised it could run it at all. Other games like SC2, Arkham Asylum, CSS, and WoW have all run like a champ, most of them even on medium settings!

    So I think if you want a cheap laptop (mine was $399) with the ability to play some games while still doing general tasks well, we have already hit that "good enough" stage on the CPU side. It will be interesting to see if Windows 8/Metro does anything to change this.
  • p05esto - Monday, April 23, 2012 - link

    You are dead wrong. I need a fast CPU for my work but don't need or care about the GPU. You realize people do more than game, right?
  • SquirrelPunch - Monday, April 23, 2012 - link

    Could not disagree more.

    In fact, the majority of power users do not need a powerful GPU, just lots of RAM and a fast CPU.

    Graphic designers: mostly 2D work; no powerful GPU needed.

    Video editors: same as above.

    Software developers (not games): same as above.

    Standard CAD: no intensive 3D models involved.

    Most also don't care about multi-monitor setups beyond the two displays that the HD series will let you use.
  • klmccaughey - Monday, April 23, 2012 - link

    Intel needs another Larrabee. It keeps cobbling together these graphics cores, which always fall well short of the mark. Either a Larrabee 2 or a licence from Nvidia, but something has to be done about it in the long (possibly mid) term. It makes perfect sense and, to me anyway, has an air of inevitability about it.

    Why not take the plunge?
  • MarkJohnson - Tuesday, August 21, 2012 - link

    I find it odd that the AMD A8-3870K was left out of the power consumption section but appears in the others.

    I ran a quick test, and my Kill A Watt meter read 126 W max during the x264 HD v5.0.1 benchmark, which bests all of them.

    It also idles at 34.5 W, which blows them all away by a very large margin; the best of the rest idles at double what the A8-3870K does.
