The 9700K can pump out the most frames per second, but it is not the best by any means; its utilization is typically more than 80%. Just like a few years ago, when all those quad cores were doing so great compared to AMD's more-cores, more-threads approach. Now those quad cores that put out all those frames are struggling to keep up in modern titles, while those AMD processors are still putting out decent frame rates! Another example of AMD's fine wine technology.
With that said, are frames per second really a good metric to determine the longevity of a processor? Or should we be looking at CPU utilization as well?
This article is old but "fine wine" about AMD's old processors is pure delusion. 2600k-age AMD looks horrible. Bulldozer was always horrible, and Piledriver has looked worse with age. Even Excavator gets absolutely smoked by most old Intel CPUs. While obviously not identical and much higher power, an Intel 3960X still went even with nearly every Ryzen 1 CPU. Fine wine my ass.
Actually, this is a pretty fair summary. The 9700K, 9 years later, offers about a 40% advantage over the 2600 (except in gaming, where more cores don't matter today), which is quite abysmal.
Obviously, I was referring to the article. "More cores" meant going from the 4 of the 2600 to the 8 of the 9700. And no, they don't matter, unless you see a benefit in running at 300fps instead of 250fps. At high res, when the fps start coming close to 60fps, the 2600 and the 9700K are basically equivalent. A different story would be going from 2 to 4, but that would have nothing to do with the article... Is it clear now?
In most tests here it's around a 100% or more increase in perf; I don't see where it's 40%.
Also, when you increase the graphics settings/resolution in gaming, the FPS stay the same because the GPU becomes the bottleneck. You could put in any futuristic CPU and the FPS would be the same. So why is that an argument about disappointing/abysmal performance?
After so many decades of being wrong, you guys still claim CPU power doesn't matter much in games. You're wrong. Again. The common bottleneck today in games is the CPU, especially because GPU advancement has been very slow.
GPU advancement slowing down *makes the CPU less relevant, not more*. The CPU is only relevant to performance when it can't meet the bare minimum requirements to serve the GPU fast enough. If the GPU is your limit, no amount of CPU power increase will help.
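A toy frame-time model makes this concrete (the per-frame costs below are invented for illustration):

```python
# Frame rate is set by the slower stage: FPS = 1000 / max(CPU ms, GPU ms).
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=8.0, gpu_ms=20.0))  # 50.0 - GPU-bound, e.g. at 4K
print(fps(cpu_ms=4.0, gpu_ms=20.0))  # 50.0 - a 2x faster CPU changes nothing
print(fps(cpu_ms=8.0, gpu_ms=5.0))   # 125.0 - drop to 1080p and the CPU is the limit
```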
Is it abysmal because of the CPU though, or because of the software?
Lots of software isn't written to take advantage of more than four cores tops, aside from the heavy hitters, and to an extent we've hit a ceiling with clock speeds for a while now, with 5GHz being (not exactly, but a fair representation of) the ceiling. AMD has caught up in a big way, and for server apps and rendering, it's an awesome value and a great CPU. Even with that, it still doesn't match up with a 9700K in games, all other things being equal, unless a game is dependent on the GPU alone. I think most mainstream software isn't optimized beyond a certain point for any of our current great CPUs, largely because until recently, CPU development and growth had stagnated. I'm really hoping real competition drives improved software. Note also that it hasn't been like the 90s in some time, when we were doubling CPU performance every 16 months. Some of that is because there are too many limitations to achieving that doubling, both software and hardware.
I'm finding considerable speed boosts over my i7-4790K that was running at 4.4GHz (going to an i9-9900K running constantly at 4.7GHz on all cores) in regular apps and gaming (at 1920x1200 with two GTX 1070 cards in SLI), and I got a deal on the CPU, so I'm perfectly happy with my first mainboard/CPU upgrade in five years (my first board was a 386DX back in '93).
Same here. i7-2600K from May 2011, with the same OCZ Vertex 3. 8 years, twice the cores, not even twice the performance in the real world. Just essentially overclocked to the max from the factory.
Remember when real-life performance more than doubled every 2 years? On the same single core, in all apps, not just heavily multithreaded ones? Good thing AMD at least forced Intel to go from 4 to 6 to 8 cores in 2 years. Now they need to double their memory controllers; it's been the same 128 bits since, what, the Pentium Pro?
Same here. Over the years I've stuffed it full of RAM and SSD and been pleased with the performance. I'm thinking it's time for it to go though.
In 2016 I put a 1060 in the machine and was mildly disappointed by the random framerate drops in games (at 1200p). Assuming it was the GPU's fault, I upgraded further in 2018 to a 1070 Ti some bitcoin miner was selling for cheap when the market crashed. The average framerates went up, but all of the lows are just as low as they ever were. So either Fallout 4 runs like absolute garbage in certain areas, or the CPU was choking up both GPUs.
When something that isn't PCIe 3 comes out I suppose I can try again and see.
For whatever it's worth, in my experience Fallout 4 (and Skyrim/Skyrim SE/maybe all Bethesda titles) is poorly optimized. It seems their engine is highly dependent on IPC, but even in spite of running an overclocked 6700K/1080 Ti, I get frame drops in certain parts of the map. I think it's likely at least partially dependent on where your character is facing at any given point in time. There can be long draw distances or lots of NPCs nearby taxing the CPU (i.e. Diamond City).
Yeah, that makes sense. F4's drops are definitely dependent on location and where the character is facing for me too.
The countryside, building interiors, and winding city streets you can't see very far down are just fine. Even Diamond City is okay. It's when I stand at an intersection of one of the roads that runs arrow-straight through Boston, or get up on rooftops with a view over the city, that frame rates die. If the engine wants pure CPU grunt for that, then the 2600 just isn't up to it.
Strangely, Skyrim SE has been fine. The world is pretty sparse compared to F4 though.
Fallout 4 is simply a game of asset overload. That happens especially in the urban areas. It shows us that the engine is past its expiry date and unable to keep up with the demands of a game of this era. The game needs all those assets to at least look somewhat bearable. And it's not efficient about it at all; a big part of all those little items also needs to be fully interactive objects.
So it's not 'strange' at all, really. More objects = more CPU load, and none of them can be 'cooked' beforehand. They are literally placed in the world as you move around in it.
This is also part of the reason why the engine has trouble with anything over 60 fps, and why you can sometimes see objects falling from the sky as you zone in.
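For anyone wondering what that looks like in engine terms, here's the generic game-loop pattern involved; an illustrative sketch, not actual Creation Engine code:

```python
PHYSICS_DT = 1.0 / 60.0  # fixed step the simulation was tuned for

def frame_tied_update(world, frame_dt):
    # Step physics once per rendered frame: at 144 fps, frame_dt is ~6.9 ms,
    # so anything tuned for 16.7 ms steps (forces, item settling) misbehaves.
    world.step(frame_dt)

def fixed_timestep_update(world, frame_dt, accumulator):
    # The robust alternative: accumulate real time and step in fixed
    # increments, decoupling the simulation from the render rate.
    accumulator += frame_dt
    while accumulator >= PHYSICS_DT:
        world.step(PHYSICS_DT)
        accumulator -= PHYSICS_DT
    return accumulator
```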
Wait for AMD then. Apparently (according to AMD) they're going to quadruple floating-point performance (at least for Rome, which uses the same Zen 2 architecture), and only half of that comes from core count.
What many are expecting from Ryzen 3rd generation at this point: a significant IPC boost (anywhere from 10-15 percent), and potentially 5GHz on 8 or even 12 cores. Not enough information to know if the 16-core version will be able to hit 5GHz on all cores or not right now. Considering that the Ryzen 2700X is hitting 4.3GHz on 8 cores, 12 cores @ 5GHz will be a significant boost combined with the IPC improvements as well.
May 27th is soon enough to get the official clocks and core counts, and then we get to wait for independent benchmarks on overclocking on X370, X470, and then X570.
I see I purchased my SB pricematched to MC in 2011 (thanks NCIX! and RIP). Maybe it'll make it a decade. Will give time for DDR5 to mature. Don't want to be stuck on a platform with obsolete DDR4.
Haven't come across it yet. When that day comes... I imagine it will be the same as when I dragged my feet while CPUs with SSE, SSE2, SSE3 and so on came out... I will upgrade when the need arises.
This. They're a dime a dozen because enterprises are dumping them and consumers are too scared to buy them. Mine is 8 cores / 16 threads with quad-channel DDR3.
I'm trying to stay more focused on work and learning this year, so I stopped using my i5-2500K @ 4GHz and re-activated an old laptop with an i7-2620M (max 3.1GHz), 12GB of RAM, and an average SSD.
As today's world is heavily web-based for office-like productivity (basically reading emails, accessing online systems, and creating some documents), I'm actually amazed that this laptop is serving me so well. I use a newer i5-8350U at work, which obviously is faster, but the difference is not that much.
For users that want to stay at the top of the game, upgrading makes sense. For users that just want to use the device, it does not (unless your work actually depends on such performance increases).
I still have some CAD users rocking it on a 2600 (non-K) and an SSD just fine too.
I left the PC world when the 2600K was king (and the glorious Q6600 before it) and came back when the i7-6xxx series was mid-life, and man was I disappointed by the lack of the performance jumps we were so accustomed to, from the Athlon 64 -> Core 2 Duo/Quad -> i7-2600.
I'm still running one of these too, and they were great, like my Northwood before it. I am soon getting a 9900K R0 stepping if I hear good things, and will relegate this PC to a home Ubuntu server.
I do wish I could afford to upgrade more regularly though. 8 years is too many.
Still running my 3770 as I have not seen that large a difference to upgrade. But Zen+ had me itching and Zen2 is what will finally replace my 3770/Z77 system.
That, and it's not just about the CPU but also the upgrades in chipset/USB/etc... parts.
I originally wanted a 3770K, but missed the window to get a good deal when they were newer. My 3570K+1080Ti still scratches most of my itches, but it's the MMO-style games that really tank my CPU performance and starve my GPU.
I had a 2500K and had to admit that VR kept all 4 threads full, so I found a brand new 3770K for $80, which gave me 4 extra threads for the system. This was enough for me to pull off most games with my GTX 970, as I rarely play MMOs.
...but rendering has me keeping a keen eye on a Threadripper...
Olde94.... I am desperately looking for an excuse to buy a Threadripper. I just can't find one.
I suspect I'm just going to invest the money in a really sweet gun for target shooting instead but the nerd part of me still wants to cheap out on the gun and get a Threadripper....
I just built an AMD Rig with a Ryzen 7 2700X, ASRock X470 Taichi Ultimate, Sapphire Nitro+ RX 590, 32gb G.SKILL RIPJAWS Series V & 2x 1TB M.2 drives (1 for OS and other for Gaming). Boots to Win 10 Pro in 8 seconds. Blazing fast in games.
I just bought a Smith & Wesson 686 Plus 357 Magnum so I know what it's like to want a gun as well. I'm looking at getting a LMT Valkyrie 224.
I'm in almost exactly the same boat. I have a 3770K on Z77 running at 4.2GHz. That's all that I could get out of my chip without thermal throttling under heavy load with a 212 EVO... already running the lowest possible voltage that it is stable. Remounted the cooler several times, upgraded the fan, and switched out the paste to Thermal Grizzly but it didn't help enough to get me to 4.3GHz. I considered throwing a bigger cooler at it but decided to save that money for my next build instead.
Running 1440p 75Hz Freesync (only 48-75Hz range) display that I picked up before Vega launched with the intention of buying Vega when it released -- but I missed buying it at launch, then it was unavailable, then it was expensive, then the crypto boom meant you couldn't get one... so I bought a 1080Ti instead. Even with the newly added Freesync compatibility I'm getting a reasonable bit of stutter that frustrates me.
Strongly considering Zen2 when it comes out. I never seriously considered upgrading to anything else so far, not Haswell through KBL due to lack of performance increase for the price, and not CFL or CFL-R due to high cost. 2700X just doesn't quite have enough single-thread performance increase, but based on the swirling rumors I think Zen2 will get there.
Nope, the cap is for the non-K chips. There you have a 42x multiplier cap with a 100 MHz clock, so you are limited to 4.2... unless you also change the base clock, but that causes other issues that are not worth the effort to address. If you have a K chip, the only limits are your RAM, and cooling. Almost all Sandy chips can hit 4.5GHz, with a majority capable of going above 4.8!
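The math behind those limits is just base clock × multiplier; a quick sketch of the ceilings described above:

```python
def core_clock_ghz(bclk_mhz, multiplier):
    return bclk_mhz * multiplier / 1000.0

print(core_clock_ghz(100, 42))  # 4.2 GHz - the non-K multiplier ceiling
print(core_clock_ghz(103, 42))  # ~4.33 GHz - raised BCLK, but other buses
                                # derive from BCLK too, hence the issues
print(core_clock_ghz(100, 48))  # 4.8 GHz - territory most good K chips reach
```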
I have a non-K 3770 running at 4.2GHz all-core, 4.4 single. It's also undervolted to 1.08V and hits a MAX temp of 55-56C after months of use on a Corsair AIO and liquid metal. Usually runs in the high 40s under load. Before de-lidding it, it ran in the high 60s at 4.2GHz on a Corsair air cooler and Arctic MX-4 paste. Why are your temperatures so high?
Yeah. In many ways it is very sad when you look at this article. It has effectively taken a decade to finally get to the point where there is a worthwhile upgrade in CPU performance. Prior to this, we were seeing CPU performance double every couple of years. A case in point: look at an article from 2015 that did a comparison of CPUs over the preceding decade (i.e. ~2005 - 2015); over that timeframe you saw a 6x increase in memory bandwidth and an 8x - 10x CPU computational increase. But looking from 2011 to 2019 we barely see a doubling in performance (and then only in select use cases), while at the same time the price of said CPU is 25% more. It is no wonder people have not been upgrading. Why spend $1000 on a new CPU, motherboard, and RAM to only gain 25-40% performance? We are just now hitting the point where people start to consider it worth that price.
That all being said, it would have been nice to have included at least one AMD CPU in these benchmarks for comparison. Sure, we can go to the review bench to get it, but having it here for some easy comparison would have been nice, especially given how Intel seems to have decided to stop innovating and purposely take a dive (almost as if they feared regulatory action from the USA/EU for effectively being a "monopoly", and to avoid such action decided to simply stop releasing anything really competitive until AMD was able to get their act together again and have a competitive CPU...).
Still have a 2600 (not even the K model) running in a living room PC, paired with a GTX 1050 Ti and an SSD. It runs everything without any issues; I've been playing Sekiro and Division 2 on it at a locked 1080p@60fps. Progress is all good and fine, but these "old" CPUs have loads of life in them still.
Me too. I haven't had much time for video games the last couple of years to justify the $$$, but putting a 1050 Ti in an old i7-2600 office PC has kept me happy the last 18 months or so (e.g. 55-ish fps in Far Cry 5 ND at medium/1080p, 70+ fps in Forza 7/FH4 at high/1080p). I'm about to try a S/H RX 580, which will probably be a bridge too far, but at least I'll get FreeSync.
Dare I look at the Civ 6 benchmarks, knowing they are pointless? What sort of idiot tests CPU performance in Civ 6 using FPS rather than turn times? I don't know who specifically, but they write for AnandTech.
I made a similar comment, Civ6 added a new benchmark with Gathering Storm as well that is even more resource intensive. Turn length will show what your CPU can do, without GPU issues getting in the way.
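Turn length is trivial to measure, too; a minimal sketch, assuming a hypothetical end_turn() hook standing in for whatever the game's benchmark mode exposes:

```python
import statistics
import time

def benchmark_turns(end_turn, turns=30):
    # end_turn() is a hypothetical stand-in for the game's AI-turn hook.
    samples = []
    for _ in range(turns):
        start = time.perf_counter()
        end_turn()  # every AI player computes its moves - pure CPU work
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples), max(samples)

# Lower mean/max turn time = faster CPU, with the GPU out of the picture.
```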
I'm one of those who bought the 2600K back in the day. A few months ago I made the move to the 9900K. Cores and price don't matter so much as feeling it will be a chip that will offer great bang for the buck for years. I think it is the spiritual successor to the 2600K and that it was a mistake to omit it.
Not even close, it's near double the price. The Ryzen 2700 at $300 would be a way better "successor" as it's within a lot of people's budgets, offers good gaming performance and with 8 cores is probably going to last quite a while as we move to higher threading.
The Ryzen 2 chips moving to 7nm will probably have the largest leap in a while, so whichever one comes in around the $300 mark will probably be the "true" successor of the 2600K.
The issue that some will have with the 2700X is that the clock speeds are not up there at the 5GHz mark, which is what many Intel systems have been able to hit for over four years now. Third generation Ryzen should get to the 5GHz mark or possibly beyond, so there wouldn't be any compromises. Remember, extra cores will only result in better performance in some areas, but single threaded and many older programs benefit more from higher clock speeds(with similar IPC).
Don't get me wrong, I have a Ryzen 7 1800X in this machine and wouldn't step down to a quad-core chip again on the desktop, but I do appreciate that some things just want higher clock speeds. I expect a 40 percent boost in overall performance by switching from this 1800X to the 16 core Ryzen if it hits 5GHz, and that doesn't even count the increase in core count. I may end up paying $600 or more for the CPU though, but that will keep me happy for at least another five years.
Same boat. I used a 2400K and 2500K for my two main PCs for years and years. Just replaced the 2500K with a Ryzen 5 1600 (they were $80 at Microcenter for some blessed reason). Tripling the thread count has done wonders for my compile times, but it's just amazing how strong and long-lasting the IPC was on the 2nd-generation Core i processors.
You've convinced me. Staying with my Sandy Bridge for another year. At 1600p the difference in CPU is not that big (definitely not worth 1000+ USD for a completely new system), and for day-to-day work it is plenty fast. Up to four threads there's very little to gain, and only when more threads are at play is there a large enough difference (same goes for Ryzen, only there I would gain almost nothing up to four threads). Perhaps Zen 2 will change that, or maybe 10nm CPUs from Intel, when they finally arrive with a new CPU architecture and not a rehash of the 4-year-old Skylake.
The only reason why I upgraded from a Sandy Bridge laptop to a Haswell-U laptop was because it was $30 cheaper to get a refurb PC with Windows 10 preloaded than it was to just buy a Windows 10 Pro license for my Sandy Bridge system so I could finally get off Windows 7. Oddly enough, I spend more time on my Sandy Bridge laptop after moving it to Linux Mint than I do on the newer Windows 10 laptop. The Haswell-U is simply here for a handful of things that I can't do in Linux, which are mainly a few games lacking a Linux version that are iffy or uncooperative in WINE. It really had nothing at all to do with a lack of compute power and more to do with EOL on 7. I'd argue that these days, pretty much any Sandy or newer system is adequate from a compute power perspective for most mundane chores and a number of heavy-lift tasks.
You can still take the free upgrade from Win 7 to Windows 10, Microsoft never stopped this from working. Do one upgrade the dirty way, get activated and future clean installs will activate too.
Thanks for the article - it is really interesting.
I think it shows very well why the PC market was stagnant for a long time. Depending on one's use case, the only upgrade that seems worthwhile is going from the top early-2011 4C CPU to the top late-2018 8C consumer CPU.
I would love to see a similar article comparing the top of the line GPU with the 2600k in this time frame to see what performance difference a GPU upgrade made and contrast this with a CPU upgrade.
I am running a 2600K at stock with 16GB of DDR3-1333 RAM and don't plan to upgrade until mobos with DDR5 and PCI Express 4 arrive. I only play at 1080p anyway, so that's enough for me, I guess.
I am not sure what the goal of this is. Is it to say that Sandy Bridge is still relevant, that Intel's IPC is bad, or that game developers are lazy?
One thing for sure, it is time to move on from GTA V. You cannot get anything from those numbers.
Time to have games that are from 2018 and 2019 only. You cannot just bench old games so that your database can be built upon; it doesn't represent the consumer reality.
I'd argue that hardly anyone ever played PC games at that resolution. 720p is 1280x720. Computer screens went from 4:3 resolutions to 16:10, and when that was the case, the lower-resolution panels were most commonly 1280x800. When 16:9 ended up taking over, the most common lower resolution was 1366x768. Very few PC monitors ever actually hit 720p. Even most of the low-res cheap TVs out there were 1366 or 1360x768.
My threshold for a CPU upgrade has always been 2x performance increase. It's sad that it took this many generations of CPUs to get near that point. Almost all of the systems in my upgrade chain (friends and family) are Sandy Bridge based. I guess that it's finally time to start spending money again.
Indeed, it's sad that it took ~8 years to roughly double performance, while in the '90s we got that every 2-3 years. And look at the office tests: we're not there yet, and we will probably never be, as single-thread perf. increases are basically dead. The Chromium compile suggests that it makes sense to upgrade at all -- for developers; but for office users it's nonsense if you consider just the CPU itself.
Such great innovation and progress and cost-effectiveness advances from Intel between 2011 and 2017. /s
Yes, AMD didn't do much here either, but it wasn't for lack of trying. Intel deliberately stagnated the market to bleed consumers of every single cent, and then Ryzen turned up and you got the 6- and now 8-core mainstream CPUs.
Would have liked to see the 2600K versus Ryzen, honestly. Ryzen 1st gen is around Ivy/Haswell performance per core in most games, and second gen is Haswell/Broadwell. But as more games get more threaded, Ryzen's advantage will only increase.
I owned a 2600K and it was the last product from Intel that I ever owned that I truly felt was worth its price. Even now I just can't justify spending 350-400 quid on a hexa-core, or an octa with HT disabled, when the competition has unlocked 16 threads for less money.
They're saying AMD didn't do much to push the price/performance envelope between 2011 and 2017. Which they didn't, since their architecture until Zen was terrible.
I don't think AMD would have sold as many of the 8350s and 9590s as they did had people known that i3s and i5s outperformed them in pretty much all games, and at lower clock speeds, no less. Many people probably bought the FX-8350 because it 'sounded faster' at 4.7 GHz than the 2600K at 'only' 3.8 GHz did, or so I speculate, anyway... (sort of like the Florida Broward county votes in 2000!)
Not everyone looks at games as the primary use of a computer. The AMD FX chips were not great when it came to IPC, in the same way that the Pentium 4 was terrible from an IPC basis. Still, the 8350 was a lot faster than the Phenom 2 processors, that's for sure.
I got my FX-8320 because I preferred threads over single-core performance. I was much more likely to notice a lack of computing resources and multitasking ability than how long something took to open or run. The funny part is that even though people shit all over them, they were, and honestly still are, valid chips for certain use cases. They'll still game, they can be small cheap vhosts, NAS servers, you name it. The biggest problem recently is finding a decent AM3+ board to put them in.
Is there any way you can do a similar comparison with the i5 CPUs? I have a 3570K OC'd to 4.2 GHz and it's starting to struggle in some games. E.g., I can get over 60 fps in AC Odyssey for the most part, but there are all sorts of annoying spikes where the min FPS will tank for whatever reason. I'm running a GTX 970 that's OC'ed pretty close to a 980, and I don't know if it would be worth upgrading that, or if my CPU would strangle anything faster. Also, what's the performance difference between an OC'd 3570K and an OC'd 3770K in modern games?
This is mostly due to it being 4 threads; that's also why I wouldn't go with anything <8 threads, as you'll see it happen more and more as we all move to higher core counts. Plus, Ubisoft has probably got the buggiest/worst-optimized games; the last one I can think of that was all right was Black Flag, mostly because they didn't change the engine and just changed the storyline/map.
I owned pretty much every iteration of Intel and AMD since the 80286. I pushed them all on relatives and friends to make space for the next iteration.
But everything since Sandy Bridge stuck around, both because there was no reason to move them out and because I had kids to serve. Mine was a 2600 non-K, because I actually wanted to test VT-d, and for that you needed to use a Q-chipset and -K was not supported.
It still drives the gaming rig of one of my sons, while another has the Ivy Bridge (K this time, but not delivering beyond 4 GHz). Got Haswell Xeons, 4- and 18-core, a Broadwell as an 8-core Xeon-D, Skylake in notebooks, Kaby Lake i7-7700Ks in workstations, and an i7-7700T in a pfSense box.
Those newer i7s were really just replacing AMDs and Core 2 systems being phased out over time, not because I was hoping for extra performance: AT made it very clear for years that that simply won't happen anymore with silicon physics.
What I really wanted from Intel (more cores instead of a useless iGPU, more PCIe lanes, more memory channels) I eventually got from the E5-2696 v3 I scored for less than $700 on eBay.
Zen simply came a little too late; a couple of Phenom II X4s and X6s and three generations of APUs taught me not to expect great performance or efficiency from AMD, but at least they were budget parts and had become reliable (unlike the K6-2/K6-III+s).
With the family all settled and plenty of systems in all sizes and shapes, the only reason to buy a CPU any time soon would be to replace failed parts. And fail they just don't, at least not the CPUs.
And then I must have 100GB or so of DDR3, which I really don't want to buy again as DDR4 or DDR5. DDR3-2400 is really just fine with Kaby Lakes.
I overclocked a bit here and there, mostly out of curiosity. But I got bitten far too often by reliability issues when I was actually working on the machines and not playing around, so I've kept them very close to stock for years now. And then it's simply not worth the trouble, because the GPU/SSD/RAM is far more important, or nothing will help anyway (Windows updates…).
Nice write-up, Ian, much appreciated and not just because it confirms my own impressions.
The Zen chips actually have pretty good efficiency, I was expecting way worse before it came out since AMD hadn't been competitive in years. Zen 2 will be quite interesting, mostly due to the node shrinkage hopefully bringing way lower power envelopes and maybe cheaper CPUs, since we all need that saving for the mess that the GPU market has become.
Don't discount the significant IPC improvements that are expected from the third generation Ryzen processors(not the APUs which are Zen+ based from what I have read).
Still have a 2600k at 4.6 GHz with proper turbo support (slows down when idle). Went from GTX 680s in SLI to a single GTX 1080 and it plays most games just fine.
That being said I'd love to throw in a Ryzen 7 2700X but only if one of you pays for it... 😁
Nice flashback review, thank you. I am still on an i7 2600K @ 5.1GHz with 32GB DDR3 @ 2400MHz and very tight timings. It took a while to dial in the memory since Sandy does not really support this speed as gracefully as its newer brothers & sisters do. I have 2 Samsung 512GB SSDs in RAID 0, plenty fast for the Windows drive and some games installed, as well as 2 4TB 7200RPM hard drives.
I think some of the issues you were having with the 4.7GHz OC were probably due to either memory that was not 100% stable, or the CPU sitting right at the edge of stability because it wanted just a tad more voltage. On my system I had random problems when it was new due to memory timings and finding just the right voltage for the CPU. After getting all of that dialed in, my system is pretty much 100% stable at 5.1GHz and DDR3 @ 2400MHz, and has been running this way since 2011.
So going from these charts for the gaming results, mine at 5.1GHz would place my system faster than a stock i7 7700K, and a slightly overclocked one as well. Though I am 100% sure an i7 7700K fully overclocked would get better FPS, since its IPC is what, 10%-12% better than Sandy clock-for-clock, and then if you throw in AVX2 my Sandy would get hammered.
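To put rough numbers on that (the ~12% IPC figure is my estimate above, not a measurement):

```python
# Per-core speed ~ clock x relative IPC.
sandy_2600k = 5.1 * 1.00   # 5.1 GHz OC, IPC as the baseline
kaby_7700k  = 4.5 * 1.12   # ~4.5 GHz stock boost, ~12% higher IPC
print(sandy_2600k, round(kaby_7700k, 2))  # 5.1 vs 5.04 - roughly a wash, pre-AVX2
```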
I am going to be upgrading my system this summer, not because I feel my system is slow, but because I know that due to its age something could fail, such as the main board or CPU, and it would be costly to try to replace either of those, so it's time for the big upgrade soon. I will probably move this system to secondary duties and keep it as a backup gaming system, or there for my friends to use when we get together for a gaming session. I have not fully decided which way to go, but am leaning towards AMD Ryzen with Zen 2 and at least an 8/16 CPU, and maybe a 12/24 CPU if they release more than 8 cores on mainstream desktops.
I would love to see a 6-core i7 980X overclocked to 4.3GHz with 12 gigs of 2GHz triple-channel RAM vs all these quad cores. < my rig. Playing all games at max settings; for example, Shadow of the Tomb Raider at max settings at 3440x1440 getting 60fps (G-Sync helps with frame variance smoothness), and Metro Exodus at extreme settings plus tessellation, PhysX and HairWorks getting an average of 60fps at the same resolution with a 1080 Ti FTW3.
"there is only one or two reasons to stick to that old system, even when overclocked. The obvious reason is cost"
I have to disagree with that statement. My reason for my trusty 2600K still running is that it's a wonderful "hand-me-down" system. I was running my 2600K as my primary system right up until I went Ryzen. At that point, my old system became my wife's new system. I toned down the overclock to 4.2 GHz so I could slap a cheap but quiet cooler on it, and for her uses (MS Office, email, web browsing, etc.) it is a great system and plenty fast enough. My old Samsung 850 EVO SSD went along with it, since in my newer system I've got a 960 EVO, but other than gaining that SSD along the way, it's had no significant upgrades since 2011.
For someone who could easily get by on something like an i3-8100 or i5-7xxx, the 2600K hand-me-down is a great option.
Personally I have not owned or cared for a desktop since my dual Xeon 5150; it's 12 years old, and for a while, until later i7s came out, it was the fastest machine around. Back then I was into 3D rendering and even built a render farm; I was also seriously into games, with the latest NVidia graphics cards.
But since then I went mobile, with less graphics, and I try to play fewer games, though I still like to get out Command & Conquer and Company of Heroes; I was never much of a first-person-shooter guy. So for me a higher-end laptop does fine. For the longest time the Lenovo Y50 was good, but Lenovo for me had build issues... When the Dell XPS 13 2-in-1 came out it was great for some things; portability was great, and I still use it because it's nice to travel with for documents and such. But I wanted a faster machine, so when the Dell XPS 15 2-in-1 was announced I jumped onto the bandwagon, almost fully loaded. The 4K screen is probably a waste on it because I am getting older; the graphics are slightly better than the 3-year-old Y50, but the CPU is far faster than the Lenovo's. Some older games have trouble with the GPU, and professional graphics apps like Vue 2016 have trouble with the GPU as well.
But I will be 60 in a couple of years and need to grow up from games.
I think my next computer is going to be something different. I want a portable, always-online cellular device. I thought about an iPad with cellular, but I think I am going to wait for a Lakefield device: something small with long battery life, always connected. My experience with iOS and Android over time is always the same thing: great when first starting out, but later the battery and performance drop with OS upgrades, which if you think about it is no different than with Windows. Even though I am a technical person, I was never a Linux person; it just does not fit with me, even when I try it.
Thanks Ian! The most disappointing aspect of the newer Intel i7s vs. Sandy Bridge is the underwhelming progress on performance/Wh. How much more efficiency did the multiple changes in manufacturing and design really gain? Judging by the numbers, not that much. The amazing thing about Sandy Bridge was that it did boost performance, and did so at significantly improved perf/Wh. At this moment, we seem to be back to Athlon vs. P4 days: the progress is most noticeable with the chips that say "AMD" on them.
I think one needs to look at more than just the basic benchmarks, and especially at multithreading. Single-thread performance has almost tripled in new machines, especially with AVX.
I think it would be nice to see what quad-core CPUs from Sandy Bridge and the new ones do without hyperthreading. It would be nice to see the effects of hyperthreading on and off in different benchmarks.
How many AVX2 workloads do you have? Adobe's suite has AVX, FF as well; past that I can't think of anything that needs AVX2 support where it would be noticeable in my day-to-day stuff. Pretty much nothing in games depends on it, and even in those cases where it is used, it's not worth the effort of implementation for a tiny gain.
AVX512 is pretty much in the ML space, wouldn't be running most of that stuff on my home machine.
AMD isn't sitting still, and IPC improvements from Ryzen 3rd generation are expected to be in the 13-15 percent range compared to the second generation. Clock speeds are also expected to be significantly higher, though a lot of Intel fans seem to really be pushing that AMD won't have faster than a 4.7GHz clock speed from the new generation. That IPC improvement is all about architecture improvements, clock speed is from the fab process jump.
I've got mine paired with an RX Vega 56 in a Hackintosh. Still gets it done when compiling games for iOS. I had to move to more cores on my main PC, though. Thank you Amazon for that $250 1920X when Threadripper 2 dropped last year! :)
I'm using a 3920xm at 4.4ghz, which is the mobile equivalent of the 3770K. This review just reaffirms that for 4K there is no benefit to something new.
With how much the 9700K leads the 7700K at lower resolutions, though, this makes me think that old quads without HT are really suffering now. I am curious how the 2500K and 3570K are faring. Probably not well.
I upgraded from a 2500K (also a legend!) to a 4790K; it was an OK upgrade, and I said next time I'm gonna upgrade only when it's 8 cores. So, I guess that time has come, but I'm waiting for 10nm. So... from what I'm reading about 10nm for desktops, I'll be waiting until 2021...
I'd wait and see how Zen2 clocks later this year if you're itching to upgrade. If they can manage some 12 and 16 thread parts with base speeds over 4.5Ghz around the $300 mark it's going to get rather interesting ;)
Mind you, I don't honestly know if that will happen. If it's more "4Ghz base, 5.xGhz turbo" then that's... process node disappointment again. More notebook chips that can't crunch full thread loads (again).
Worst case though, the 4790K you have now isn't a bad chip at all to be "stuck" with. Unless you're paranoid about power, overclock that to 4.5GHz and you'll probably be good to wait another year or two if Zen 2 doesn't end up being compelling to you.
What's the base speed on Intel chips? What do you see for the base speed on second generation Ryzen chips? If AMD has a BASE speed of 4.0GHz with boost to 4.8-5.0GHz, that is going to be a lot better than anything Intel has on the desktop.
Very fun article to read, since most of us here very likely went through a 2600K phase or still have one. But why did you not include super popular newer games that are known to scale better with memory bandwidth, thread count, etc.? Overwatch and BF5, to name a couple. OW especially has gotten very big in the competitive scene, and there is so much out there on every type of setting and scaling to compare; it scales almost infinitely with memory bandwidth, so just the change from DDR3 to DDR4 is drastic.
The 8700K and 9900K are missing from the comparison. Anyway, I did upgrade from my 2600K @ 4.8 to a 9900K @ 5.2. Doubling cores AND threads makes a huge difference. The SB platform is really old.
Great article. Still running a 2500K @ 4.8GHz- talk about good value! Holding out for Zen 2 / Ryzen 3000 to replace it with what will hopefully be another long lasting winner...
"Intel also launched its first overclockable dual core with hyperthreading, the Core i3-7350K". If I remember correctly , the 655k was multiplier unlocked and the entire westmere line was bclk overclockable making this statement not quite true. It should say, "their first overclockable dual core with hyperthreading in almost 9 years", or "the first modern dual core with hyperthreading that is truly overclockable/unlocked."
Higher core counts are long overdue. I thought the i3-8100 I bought at the end of 2017 was decent. Turns out, it is entry level for doing anything creative on a PC. A few months after, AMD arrived with the higher-core-count Ryzens. I hope to get an 8-core this year or next.
Great article. Would love to see more of this kind. I commented along this line on some previous articles. Not everybody upgrades from one gen to the next (in fact, who does?), so incremental reviews are useful only from a technical perspective (which is already quite a bit), but somebody with a 4-5 year old PC would struggle to find a reference.
Personally, I would have loved to see Ryzen 2*** added to the picture (not sure if there's something that costs as much as the i7-9700K, but a 2800 seems relevant). Thanks again.
Still running an i7-2700K on my main PC and not looking to replace it any time soon; at 4.7GHz most games run over 100fps. My other machine is still on a Xeon W3690, OC'd to a comfortable 4GHz on all 6 cores, and again it's crushing every game. Why upgrade?
My OC’d 3770k is still going strong, moved it to my wife and I run a 8700k now but I only upgraded because her 3rd gen i5 died not because I felt a strong performance need. I’m primarily a gamer. Where I did see an uplift (and needed it desperately) is in my HEVC transcoding.
One thing I want to point out is that modern games are far less demanding on the CPU than games in the 90s were. If anyone thinks their 8-year-old Sandy Bridge quad is having it sort of rough today, they probably weren't around to remember that running Half-Life comfortably above 60 FPS needed a CPU released at least 2 years after the game.
There is a point in every Windows user's computing endeavors when they start playing fewer and fewer games, and at about the same time start foregoing upgrades to their CPU. They keep adding RAM and hard disk space, and maybe a new graphics card after a couple of years. The only reason such a person, who by now has completely stopped playing games, may upgrade to a new CPU and motherboard is the maximum amount of RAM that can be installed on their motherboard. And with that comes the final PC that such a person may have for a long, long time. Kids get the latest CPU and will soon realize the law of diminishing returns, which by now is gradually approaching "no return", much faster than their parents did. So, in perhaps ten years there will be no more "Tick" or "Tock" or cadence or Moore's law. There will be computers, barring the possibility that dumb terminals have replaced PCs, that everybody knows what to expect from. No serendipity there, for certain.
The fact that you don't see really interesting games showing up all that often is why many people stopped playing games in the first place. Many people enjoyed the old adventure games with puzzles, and while action appeals to younger players, being more strategic and needing to come up with different approaches in how you play has largely died. Interplay is gone, Bullfrog, Lionhead....On occasion something will come out, but few and far between.
Games for adults(and not just adult age children who want to play soldier on the computer) are not all that common. I blame EA for much of the decline in the industry.
I still have an i7-2600 in an old Dell based on an H67 chipset. I was thinking about updating the board to get newer connectivity and using it as a server. The Z77 chipset would seem to be the way to go, although getting a new board with this chipset seems expensive unless I go used. Anyone have any thoughts on this - whether it's worthwhile, or a cost-effective way to do it?
Oh wow this is insane timing, I'm actually upgrading from one of these and have had a hard time figuring out what sort of performance upgrade I'd be getting. Much appreciated!
I feel like I can chip in a perspective re: gaming. While your benchmarks show solid average FPS and all that, they don't show the quality of life that you lose by having an underpowered CPU. I game at 4K, 2700K (4.6GHz for heat & noise reasons), 1080 Ti, and regularly can't get 60fps no matter the settings, or have constant frame blips and dips. This is in comparison to a friend who has the same card but a Ryzen 1700X.
Newer games like Division 2, Assassin's Creed Odyssey, and, as shown here, Shadow of the Tomb Raider all severely limit your performance if you have an older CPU, to the point where getting a constant 60fps is a real struggle; and benchmarks aside, that's the only benchmark the average user is aiming for.
I also have 1333mhz RAM, which is just a whole other pain! As more and more games move into giant open world games and texture streaming and loading is happening in game rather than on loading screens, having slow RAM really affects your enjoyment.
I'm incredibly grateful for this piece btw, I'm actually moving to Zen2 when it comes out, and I gotta say, I've not been this excited since..well, Sandy Bridge.
"I don’t think I purchased a monitor bigger than 1080p until 2012." Wow, really? So you were a CRT guy before that? How could you work on those low res screens all the time?! :D I got myself a 1200p 24" monitor once they became affordable in early 2008 (W2408hH). Had a 1280x1024 19" before that and it was night and day, sooo much better.
Still running 1366x768 on my two non-Windows laptops (HP Stream 11 and Dell Latitude E6320) and it's okay. My latest, far less used Windows gaming system has a 14-inch panel running 1600x900. It's a slight improvement, but I could live without it. The old Latitude does all my video production work, so though I could use a few more pixels, it isn't the end of the world as is. The laptop my office issued is an HP ProBook 640 G3, so it has a 14-inch 1080p panel which I have to scale at 125% to actually use, so the resolution is pretty much pointless.
Ugh, phone auto correct...I really need to look over anything I type on a phone more closely. I feel like I'm reading comment by a non-native English speaker, but its me. How depressing.
I've done some horrendous posts when I used my phone to make a comment somewhere. Mostly because my phone is trained to my German texting habits and not my English commenting habits. And trying to mix them leads to sub par results in both areas, so I mostly stick to using my phone for texting and my PC and laptop for commenting. But sometimes I have to write something via my phone and it makes a beautiful mess if I'm not careful.
Well, laptops and desktops (with monitors) are in a different category anyway, at least that's how I see it. :-) I work with a 13.3" laptop with a 1440p resolution and 150% scaling. It's not fun, but it does the job. The advantage of the larger screen real estate with a 15" or 17" laptop is outweighed by the size and weight increase. I've also done work on 1024x768 monitors, and it does the job in a pinch. But I've tried to upgrade as soon as the new technology was established, cheap, and good enough to make it worth it without having to pay the early adopter fee or fiddle around to get it to work. Even before Win7 made it a breeze to have multiple windows in an orderly grid, I took full advantage of a multi-window and multi-program workflow for research, paper/presentation writing, editing, and media consumption. So it is a bit surprising to see someone like Ian, a tech enthusiast with a university doctorate, be so late to great tech that can really make life easier. :D
Great article. Was hoping to see all the CPUs tested (my 4770K), but I think it shows enough. This isn't the 1st article showing that lesser CPUs can run close to the best CPUs when it comes to 4K gaming. Does that look to change any time soon? I was thinking I should upgrade this year, but would like to know if I should be shooting for an 8-core, or if a 6 will be a decent enough upgrade. Consoles run slower 8-core procs that are utilized more efficiently. At some point won't PC games do the same?
There is always the question about what you do on your computer, but I wouldn't go less than 8 cores(since 4-core has become the base on the desktop, and even laptops should never be sold with only 2 cores IMO). If you look at the history, when AMD wasn't competitive and Intel stopped trying to actually innovate, quad-core was all you saw on the desktop, so game developers didn't see a reason to support more threads(even though it would have made sense). Once Ryzen came out with 8 cores, and Intel finally responded, you have to expect that every game developer will design with the potential that players will have 8+ core processors, so why not design with that in mind?
Remember, a program that is properly multi-threaded in design will work on lower-core processors, but will scale up well when processors with more cores are being used. So going forward, quad-core would work, but 8 or more threads WILL feel a lot better, even for overall use.
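Amdahl's law puts rough numbers on that; a quick sketch assuming a 90% parallel workload (the fraction is an assumption, not a measurement):

```python
def speedup(p, n):
    # Amdahl's law: p = parallel fraction of the work, n = core count.
    return 1.0 / ((1.0 - p) + p / n)

for cores in (4, 8, 16):
    print(cores, round(speedup(p=0.90, n=cores), 2))
# 4 -> 3.08x, 8 -> 4.71x, 16 -> 6.4x: same binary, scales with what it's given
```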
This was a fascinating article! And what I am seeing in the real world seems to reflect this. For the most part, the IPC for general use has improved, but not by a whole lot. But if doing anything that hits the on-chip GPU, or requiring any kind of decrypt/encrypt, then the dedicated hardware in newer chips really makes a big difference. But at the end of the day, in real-world scenarios, the CPU is simply not the bottleneck for most people. I do a lot of video ripping (all legally purchased, and only for personal use), and the bottleneck is squarely on the Blu-ray drive. I recently upgraded from a 4x to a 10x drive, and the performance bump was exactly what was expected. Getting a faster CPU or GPU will not help there. I do a bit of video editing, and the bottleneck there is still almost always in storage: the 1Gbps connection to the NAS, and the 1GBps connection to my RAID 0 of SSDs. I do a bit of gaming at 4K, and again the bottleneck there is squarely on the GPU (GTX 1080), and as your tests show, at lower resolution my chip will be slower than a new chip... but still faster than the 60-120Hz refresh of the monitor.
The real reason for an upgrade simply isn't the CPU for most people. The upgrade is the chipset. Faster/more RAM, M.2 SSDs, more available throughput for expansion cards, faster USB/USB-C ports, and soon(ish) 10gig Ethernet. These are the things that make life better for the enthusiast and the normal user; and the newer CPUs are simply more capable of taking advantage of all the extra throughput, where Sandy Bridge would perhaps choke when dealing with these newer and faster interfaces that are not available to it. All that said; I am still not convinced to upgrade. Every previous computer was simply broken, or could not do something after 2-3 years, so an upgrade was literally necessary. But now... my computer is some 8 years old now, and I am amazed at the fact that it still does it all, and does it relatively quickly. Without it being 'broken' it is hard to justify dropping $1000+ into a new build. I mean... I want to upgrade. But I also want to do some house projects, and replace a car, and do stuff with the kids... *sigh* priorities. Part of me wishes that it would break to give me proper motivation to replace it.
Great timing. I've been using the same chip for 7 or 8 years now and never felt the need to upgrade until this year, but I will upgrade at the end of this year. DDR4 finally dropped in price, and my GTX 1070 Ti I think is getting throttled when the CPU ain't overclocked.
Gaming at 4K with a i7 3930K @ 4.2ghz (4.6ghz capable when needed) with 2 GTX 1080s...I was planning a new build this year but after reading this I may hold off even longer.
I've got a 3930K as well. I was planning on upgrading to Threadripper 3 when that comes out, but if it gets delayed I may wait a bit longer for a 5nm Threadripper.
I think the conclusion is slightly off for gaming. From what I could see, it's not that the newer processors were only better at lower resolutions; it's that the newer systems were better able to keep the GPU fed with data, resulting in a higher maximum frame rate.
So at lower resolutions/quality settings, when the GPUs could let loose they could achieve much higher FPS.
My conclusion from the results wouldn't be to keep it for higher res gaming, but to keep it for gaming if you're still using a 60Hz display (which I am). I bet if you tuned quality settings for all of the GPUs to run at 60 FPS your results would sit pretty close at any resolution.
I'm currently running an E5-2670 for my gaming machine with quad channel DDR3 (4x8GB) and a 1070. That's the budget upgrade path I'd probably recommend at 60Hz.
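To put the 60Hz point in code terms: with vsync, the perceived frame rate is min(rendered FPS, refresh), so any CPU that clears the bar looks identical (the uncapped numbers below are invented for illustration):

```python
REFRESH_HZ = 60

def perceived_fps(uncapped_fps, vsync=True):
    return min(uncapped_fps, REFRESH_HZ) if vsync else uncapped_fps

for cpu, uncapped in [("2600K", 85.0), ("9700K", 160.0)]:
    print(cpu, perceived_fps(uncapped))  # both read 60 on a 60 Hz panel
```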
Just upgraded to a Core i7 4790 (from an i5 4460) late last year. At first I was thinking about upgrading to the shiny Ryzen 7, but the overall cost is pretty high considering I have my H97 mainboard with 16GB of memory. I didn't want to shell out that much money and get stuck on an older platform, again.
It does work OK, with performance around the current-gen Core i5, I guess (with less power efficiency). Considering what I paid, I think it's not too bad.
An interesting read there, Ian. I started to notice a slowdown on 2600K-class systems a few years ago when I worked on them (I hadn't used one since 2014). For me, if I can notice those slowdowns in real time, then it's time to move away from that CPU. The 4790K appears to still be holding up OK, but the older 3000/2000 chips not so well.
Best quote out of the entire article: "In 2019, the landscape has changed: gamers gonna stream, designers gonna design, scientists gonna simulate, and emulators gonna emulate" :-)
But seriously though, for me, when I upgraded from a Core2Duo E6750 with 4GB of RAM to an i7-6700 (non-K) with 16GB of RAM, it was simply amazing. I was fully expecting that going from an i7-2600K to an i7-9700K would be similar - and it is for things like compiling but not for things like gaming.
Thanks for the article, Ian! Dig the LAN setup. :-)
Why would you test a CPU and use a framerate test from Civilization 6, rather than the turn length benchmark which is a true test of the CPU rather than the GPU? Turn based games SHOULD be there as CPU tests, and only caring about the framerates seems to be wrong.
Interesting article to read. I've only recently upgraded from my 2600k to the 9700k, even that was begrudgingly as the 2600k itself still works fine, however the motherboard simply decided to give up on me.
I've got to say though, the difference in the subsystems (NVMe vs SSD makes for some great load times for pretty much everything) as well as other tangible benefits (gaming at higher frame rates) is quite apparent now I have upgraded.
I would have upgraded far sooner had Intel not chosen to keep changing the sockets, swapping out just a CPU is far simpler than rebuilding the entire system.
I am still with a 2500K overclocked to 4.8GHz, 8GB of DDR3-1600 RAM, an 850 Evo SSD, and an Nvidia 1070. I honestly see no reason to upgrade. IAN: All your testing basically demonstrated that there is no real reason that justifies spending 400 bucks on a new CPU, 200 bucks on a new motherboard, and 100 bucks on new DDR4 RAM - that totals 700 dollars. But your conclusion is that we should upgrade?! I don't get it.
Go ahead and re-read his "Bottom Line" conclusion: it gives a few specific recommendations on where it may and may not be to your advantage. And if you aren't desiring/needing all of the other new bells and whistles that go along with newer boards and architecture, then you are set (he says). Seems pretty clear.
I think the biggest thing I noticed moving to a 8700k from a 2600k was the same thing I noticed moving from a core 2 duo to a 2600k. Less weird pauses. The 2600k would get weird hitches in games. System processes would pop up and tank the frame rate for an instant, or just an explosion would trigger a physics event that would make it stutter. I see that a lot less with a couple extra cores and some performance overhead.
I agree, the user experience is definitely improved in those ways. Granted, many of us think our time is a bit more important than it likely really is. (does waiting 3 seconds really ruin my day?)
You get about 3X performance when going from an overclocked 2600K @ 4.5GHz to an 8700K @ 4.5GHz when working in DAWs (Digital Audio Workstations), i.e. running dozens and dozens of virtual instruments and plugins when making music. The thing is that it is a combination of applications that: 1. Use SSE/AVX and all the streaming extensions that make parallel floating-point calculations go much faster - DAW work is all about floating-point calculations. 2. Are extremely real-time dependent, needing ultra-low latency (milliseconds in single digits).
This makes even the 7700K about double the performance in some scenarios when compared to an equally clocked 2600K.
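A rough sketch of the two pressures at play, the hard per-buffer deadline and the wide streaming float math (the buffer size is a typical low-latency setting; NumPy stands in for hand-vectorized SSE/AVX DSP code):

```python
import numpy as np

SAMPLE_RATE = 48_000
BUFFER = 128  # samples per audio callback
print(f"budget per callback: {BUFFER / SAMPLE_RATE * 1000:.2f} ms")  # ~2.67 ms

def apply_gain(block, gain):
    # One vectorized multiply over the whole block; this is the kind of
    # streaming float work where AVX-capable cores pull far ahead.
    return block * gain

block = np.zeros(BUFFER, dtype=np.float32)
processed = apply_gain(block, 0.8)  # must finish well inside the budget
```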
"and Intel’s final quad-core with HyperThreading chip for desktop, the 7700K" "the Core i7-7700K, Intel’s final quad-core with HyperThreading processor"
"... the best chips managed 5.0 GHz or 5.1 GHz in a daily system."
Worth noting that with the refined 2700K, *all* of them run fine at 5GHz in a daily system, sensible temps, a TRUE and one fan is plenty for cooling. Threaded performance is identical to a stock 6700K, IPC is identical to a stock 2700X (880 and 177 for CB R15 Nt/1t resp.)
Also, various P67/Z68 boards support NVMe boot via modded BIOS files. The ROG forum has a selection for ASUS; search for "ASUS bolts4breakfast". He's added support for the M4E and M4EZ, and I think others asked the same for the Pro Gen3, etc. I'm sure there are equivalent BIOS mod threads for Gigabyte, MSI, etc. My 5GHz 2700K on an M4E has a 1TB SM961 and a 1TB 970 EVO Plus (photo/video archive), though the C drive is still a venerable Vector 256GB, which holds up well even today.
Also, RAM support runs fine with 2133 CL9 on the M4E, which is pretty good (16GB GSkill TridentX, two modules).
However, after using this for a great many years, I do find myself wanting better performance for processing images & video, so I'll likely be stepping up to a Ryzen 3000 system, at least 8 cores.
Forgot to mention, something else interesting about SB is the low cost of the sibling SB-E. It would be a laugh to see how all those tests pan out with a 3930K, stock and OC'd, thrown into the mix. It's a pity good X79 boards are hard to find now, given how cheaply one can get 3930Ks these days. If stock performance is OK though, there are some cheap Chinese boards which work pretty well, and some of them do support NVMe boot.
I am still running a 3930K; prices for it are still very high, ~$500, not much cheaper than what I paid for it in 2011. I have yet to really test my GTX 680s in SLI. Kind of a waste, but they are driving many displays throughout my house. There was an article where some Australian bloke ran an 8-core Sandy Bridge-E (server chip) against all the modern Intel 8-core chips. It actually had the lowest latency, so it was best for pro gamers; it lagged a little behind on everything else, but was definitely good enough.
I run 3960X at ~ 4 GHz on X79 ASUS P9X79 and have nvme boot drive with modified BIOS. So it is really interesting to compare 2011/2012 6c/12t to 8700K or 9900K. I guess it's about 7700K stock, so modern 4c/8t is like old 6c/12t. Per core perf is about 20-30% up on average and this includes higher frequency ... So IPC is only about 15% up: not impressive. Of course in some loads like AVX2 heavy apps IPC could be 50% up, but such case is not common.
Oh man... I just upgraded my 2600K to a 9900K and a couple days later this article drops... The timing is impeccable!
If I ever had a shred of buyer's remorse, the article conclusion eradicated it thoroughly. Give me more FPS.
I saw a screenshot of StarCraft 2. On a mission which I, again, coincidentally (this is uncanny) played today. I can now report that the 9900K can FINALLY feed my graphics card in SC2 properly. With the 2600K I'd be around 20-60 FPS depending on load and intensity of the action. With the new processors, it barely ever drops below 60 and usually hovers around 90FPS. Ingame cinematics also finally run above the "cinematic" 30 FPS I saw on my trusty old 2600K.
Using a 4790K for years and increasingly disillusioned with Intel's shady practices and lack of progress. My last AMD processor was an Athlon 64 3400 from the glory days of Intel being decimated by the competition. My next processor will be 7nm Zen, and I look forward to Intel being under the cosh for as long as AMD can manage it. Thanks for a great nostalgic read... I liked the lean and mean Cutress LAN machine :)
In less than 5 months my i7-860 will celebrate its 10th birthday. I've been keeping an eye on Ryzen 3 and Navi but never feel the need to upgrade (unless something goes wrong). It doesn't feel any slower than my work-issued i7-6700.
About 5 years ago I went backwards and downgraded(?) my Core i7 2600K to a Gulftown Core i7 990X when they became affordable. The Core i7 990X on my Asus Rampage Formula is running at 4.66GHz and is really quite a bit faster in all benchmarks than the Core i7 2600K. Those Gulftown processors were ahead of their time. Sure, a Core i7 7700K is 18% faster in single-core work, but the 990X destroys it in multi-threaded work. As long as it keeps running, I'm going to keep using it with my current GTX 1080 Ti.
"Sandy Bridge as a whole was a much more dynamic of a beast than anything that's come before it." Excess "of a": "Sandy Bridge as a whole was a much more dynamic beast than anything that's come before it."
"They also have AVX2, which draw a lot of power in our power test." Missing "s": "They also have AVX2, which draws a lot of power in our power test."
This was a fun article to read through. A great look into the CPU that defined the decade and a wonderful send-off (or not!?!) to the greatest processor since the Core 2 Duo.
Up until last year, I had the younger cousin: the i5 2500K, which, lacking Hyper-Threading, had a much harder time keeping up with increasingly CPU-intensive tasks (even for a gamer) in 2018, so I made the switch to team orange.
Ryzen is here now, promising longevity not just of its CPUs but, more importantly, of the AM4 platform - something that Intel did not accomplish with any of its processors.
With the Ryzen 3000 series, it's time to jump on board.
You don't need to buy a new computer every year, and with an intelligently made upfront investment you can potentially keep your desktop, with minimal or zero hardware upgrades, for a *very* long time.
/news at 11
If there is any argument that supports this, it's Intel's consumer/prosumer HEDT platforms.
The X99 was compelling over X58. The X299 is not even remotely compelling. I still have my old X99/i7-5930K (6 cores, 40 PCIe 3.0 lanes). It's still fantastic, but that's at least partially because I bit the bullet and invested in a good motherboard and GPU at the time. All modern games still play fantastically and it can handle absolutely anything I throw at it.
More a statement of "future proofing" than inherent performance.
It's always disappointing to see heavily GPU-bottlenecked benchmarks in articles like these, without a clear warning that they are totally irrelevant to the question at hand.
It also feeds into the false narrative that what resolution you play at matters for CPU benchmarks. What matters a lot more is what GAME you're playing, and these tests never benchmark the actually CPU-bound multiplayer games that people are playing, because benchmarking those is Hard.
Now that there is so much hype about the Ryzen 3, is that my best option if I wanted to upgrade? I guess I would need a new mobo and memory in addition to the CPU. Otherwise I can use the same SSD etc.
I was running my X1950XT AIW at wonder level overclocks with a Pentium M overclocked, and crushing Athlon 64 users.
It would have been really interesting to see that 7700K with DDR3. I run my 7700K @ 5GHz with DDR3-2100 CL10 on a GA-Z170-HD3. Sadly the power delivery system on my board is at its limits. :-(
But still a massive upgrade from the FX-8320e and MSI 970 mobo that I had before.
I forgot to add that it's 32GB (8GB x 4) of G.Skill CL9 1866 1.5V that runs at 2100 CL10 at 1.5V, but I have to give up the 1T command rate.
The GPU that I carried over is the Fury X. BIOS modded, of course, so it's undervolted, underclocked, and the HBM timings tightened. Whips the stock config.
The GPU is next up for upgrading, but I'm holding out for Navi with hardware RT and hopefully HBM. Once you get a taste of the low latency it's hard to go back.
OpenCL memory bandwidth for my Fury X punches over 320GB/s with single-digit latency. The iGPU in my 7700K is around 12-14GB/s, and the latency is... -_-
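For context, device-side bandwidth figures like that are usually obtained by timing large buffer-to-buffer copies. A rough sketch of the idea, assuming PyOpenCL is available (buffer size and repeat count are arbitrary choices):

```python
import time
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

NBYTES = 256 * 1024 * 1024  # 256 MiB per buffer
src = cl.Buffer(ctx, cl.mem_flags.READ_WRITE, size=NBYTES)
dst = cl.Buffer(ctx, cl.mem_flags.READ_WRITE, size=NBYTES)

cl.enqueue_copy(queue, dst, src)  # warm-up copy
queue.finish()

REPS = 10
start = time.perf_counter()
for _ in range(REPS):
    cl.enqueue_copy(queue, dst, src)
queue.finish()
elapsed = time.perf_counter() - start

# Each copy both reads and writes NBYTES, hence the factor of 2.
print(f"~{2 * REPS * NBYTES / elapsed / 1e9:.1f} GB/s device bandwidth")
```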
There are several things about this article I don't like.
1. In the game tests, I actually don't care if one CPU is 50 percent better when one shows 10 FPS and the other 15. I also don't care if it is 200 or 300 FPS. So I would change the scale into a simple metric, and that is: is it fun to play or not?
2. Development is not mentioned: the core wars have just started and Intel's monopoly is over. Why should we invest in new processors when competition has just begun? I predict price per performance will fall faster in the next few years than it did in the previous 10. So buying now is buying into an overpriced and fast-developing market.
3. There is no discussion of whether one should buy a used 2600K system today. I bought one a few weeks ago. It was 170 USD, has 16GB of RAM and a GTX 760. It plays all the games I throw at it and does the encoding of some videos I take in classes every week. I also modified its cooler so that it runs very, very silent. Using this system is a dream! Of course one could invest several times as much for a new system that is twice as fast in benchmarks, but for now I'd rather save a few hundred bucks and invest when the competition becomes stagnant again or when some software I use really demands it because of new instructions.
Great write-up! Love my 2600k still to this day and solid at 4.6GHz on air the whole time! I do see an upgrade this year though. She's been a beast!! Never thought the 300A Celeron OC to 450 would get beat! haha
Ironchef3500 - Friday, May 10, 2019 - link
Still running one of these...
warreo - Friday, May 10, 2019 - link
same here, it's still running great
Netmsm - Friday, May 10, 2019 - link
No! It does not run great; it is the 9700K that runs very disappointingly.
flyingpants265 - Saturday, May 11, 2019 - link
Hah, I get your point. But as of this moment, the 9700k is one of the best desktop CPUs out there.
Netmsm - Saturday, May 11, 2019 - link
:) It'd be better to say the 9700K is one of the best of Intel's desktop CPUs, blah, blah, blah.
jgraham11 - Monday, May 13, 2019 - link
The 9700K can pump out the most frames per second, but it is not the best by any means; its utilization is typically more than 80%. Just like a few years ago, when all those quad cores were doing so great compared to AMD's more-cores, more-threads approach. Now those quad cores that put out all those frames are struggling to keep up in modern titles, while those AMD processors are still putting out decent frame rates! Another example of AMD's fine wine technology.
With that said, is frames per second really a good metric to determine the longevity of a processor? Or should we be looking at CPU utilization as well?
lmcd - Thursday, January 21, 2021 - link
This article is old but "fine wine" about AMD's old processors is pure delusion. 2600k-age AMD looks horrible. Bulldozer was always horrible, and Piledriver has looked worse with age. Even Excavator gets absolutely smoked by most old Intel CPUs. While obviously not identical and much higher power, an Intel 3960X still went even with nearly every Ryzen 1 CPU. Fine wine my ass.
yankeeDDL - Sunday, May 12, 2019 - link
Actually, this is a pretty fair summary. The 9700K, 9 years later, offers about 40% advantage over the 2600 (except in gaming, where more cores don't matter, today), which is quite abysmal.
Vayra - Monday, May 13, 2019 - link
More cores don't matter? What results have you been looking at for gaming? 4K ultra?
yankeeDDL - Monday, May 13, 2019 - link
Obviously, I was referring to the article. "More cores" meant going from the 4 of the 2600 to the 8 of the 9700. And no, they don't matter, unless you see a benefit in running at 300fps instead of 250fps. At high res, when the fps start coming close to 60fps, the 2600 and the 9700K are basically equivalent.
A different story would be going from 2 to 4, but this would have nothing to do with the article...
Is it clear now?
MxClood - Saturday, May 18, 2019 - link
In most tests here it's around a 100% or more increase in perf; I don't see where it's 40%.
Also, when you increase the graphics/resolution in gaming, the FPS are the same because the GPU becomes the bottleneck. You could put in any futuristic CPU and the fps would be the same.
So why is it an argument about disappointing/abysmal performance?
Beaver M. - Wednesday, May 22, 2019 - link
After so many decades of being wrong, you guys still claim CPU power doesn't matter much in games.
You're wrong. Again. The common bottleneck today in games is the CPU, especially because GPU advancement has been very slow.
Spunjji - Wednesday, May 22, 2019 - link
GPU advancement slowing down *makes the CPU less relevant, not more*. The CPU is only relevant to performance when it can't meet the bare minimum requirements to serve the GPU fast enough. If the GPU is your limit, no amount of CPU power increase will help.
LoneWolf15 - Friday, May 17, 2019 - link
Is it abysmal because of the CPU though, or because of the software?
Lots of software isn't written to take advantage of more than four cores tops, aside from the heavy hitters, and to an extent, we've hit a ceiling with clock speeds for a while, with 5GHz being (not exactly, but a fair representation of) the ceiling.
AMD has caught up in a big way, and for server apps and rendering, it's an awesome value and a great CPU. Even with that, it still doesn't match up with a 9700K in games, all other things being equal, unless a game is dependent on GPU alone.
I think most mainstream software isn't optimized beyond a certain point for any of our current great CPUs, largely because until recently, CPU development and growth has stagnated. I'm really hoping real competition drives improved software.
Note also that it hasn't been like the 90s in some time, where we were doubling CPU performance every 16 months. Some of that is because there are too many limitations to achieving that doubling, both software and hardware.
I'm finding considerable speed boosts over my i7-4790K that was running at 4.4GHz (going to an i9-9900K running constantly at 4.7GHz on all cores) in regular apps and gaming (at 1920x1200 with two GTX 1070 cards in SLI), and I got a deal on the CPU, so I'm perfectly happy with my first mainboard/CPU upgrade in five years (my first board was a 386DX back in '93).
peevee - Tuesday, May 14, 2019 - link
Same here: an i7-2600K from May 2011, with the same OCZ Vertex 3. Eight years, twice the cores, and not even twice the performance in the real world. Just essentially overclocked to the max from the factory.
Remember when real-life performance more than doubled every 2 years? On the same 1 core, in all apps, not just heavily multithreaded ones? Good thing AMD at least forced Intel to go from 4 to 6 to 8 cores in 2 years. Now they need to double their memory controllers; it's been the same 128 bits since what, Pentium Pro?
Mr Perfect - Friday, May 10, 2019 - link
Same here. Over the years I've stuffed it full of RAM and SSDs and been pleased with the performance. I'm thinking it's time for it to go though.
In 2016 I put a 1060 in the machine and was mildly disappointed in the random framerate drops in games (at 1200p). Assuming it was the GPU's fault, I upgraded further in 2018 to a 1070 Ti some bitcoin miner was selling for cheap when the market crashed. The average framerates went up, but all of the lows are just as low as they ever were. So either Fallout 4 runs like absolute garbage in certain areas, or the CPU was choking up both GPUs.
When something that isn't PCIe 3 comes out I suppose I can try again and see.
ImOnMy116 - Friday, May 10, 2019 - link
For whatever it's worth, in my experience Fallout 4 (and Skyrim/Skyrim SE/maybe all Bethesda titles) are poorly optimized. It seems their engine is highly dependent on IPC, but even in spite of running an overclocked 6700K/1080 Ti, I get frame drops in certain parts of the map. I think it's likely at least partially dependent on where your character is facing at any given point in time. There can be long draw distances or lots of NPCs nearby taxing the CPU (e.g. Diamond City).
Mr Perfect - Friday, May 10, 2019 - link
Yeah, that makes sense. F4's drops are definitely dependent on location and where the character is facing for me too.
The countryside, building interiors and winding city streets you can't see very far down are just fine. Even Diamond City is okay. It's when I stand at an intersection of one of the roads that runs arrow-straight through Boston, or get up on rooftops with a view over the city, that frame rates die. If the engine wants pure CPU grunt for that, then the 2600 just isn't up to it.
Strangely, Skyrim SE has been fine. The world is pretty sparse compared to F4 though.
Vayra - Monday, May 13, 2019 - link
Fallout 4 is simply a game of asset overload. That happens especially in the urban areas. It shows us that the engine is past its expiry date and unable to keep up with the game's demands in this day and age. The game needs all those assets to look at least somewhat bearable. And it's not efficient about it at all; a big part of all those little items also needs to be fully interactive objects.
So it's not 'strange' at all, really. More objects = more CPU load, and none of them can be 'cooked' beforehand. They are literally placed in the world as you move around in it.
Vayra - Monday, May 13, 2019 - link
This is also part of the reason why the engine has trouble with anything over 60 fps, and why you can sometimes see objects falling from the sky as you zone in.
amrs - Saturday, May 11, 2019 - link
But what speed of RAM did you stuff it with? Fallout 4 has been shown to benefit from RAM faster than DDR3-1600.
Mr Perfect - Monday, May 13, 2019 - link
DDR3-1600, as luck would have it. Do you have a link handy for those benchmarks?
Hyper72 - Friday, May 10, 2019 - link
I'm sitting here with an aging Ivy Bridge 3630QM that can't be overclocked. I'm really dreaming of an upgrade!
eek2121 - Saturday, May 11, 2019 - link
Wait for AMD then. Apparently (according to AMD) performance is going to quadruple (at least for Rome, which uses the same Zen 2 architecture), and only half of that is core count.
Targon - Sunday, May 12, 2019 - link
What many are expecting from Ryzen 3rd generation at this point: a significant IPC boost (anywhere from 10-15 percent), and potentially 5GHz on 8 or even 12 cores. There is not enough information to know if the 16-core version will be able to hit 5GHz on all cores or not right now. Considering that the Ryzen 2700X is hitting 4.3GHz on 8 cores, 12 cores @ 5GHz will be a significant boost combined with the IPC improvements as well.
May 27th is soon enough to get the official clocks and core counts, and then we get to wait for independent benchmarks on overclocking on X370, X470, and then X570.
Zoomer - Thursday, June 13, 2019 - link
I see I purchased my SB price-matched to MC in 2011 (thanks NCIX! and RIP). Maybe it'll make it a decade. That will give time for DDR5 to mature; I don't want to be stuck on a platform with obsolete DDR4.
StevoLincolnite - Friday, May 10, 2019 - link
I am running Sandy Bridge-E... So even less of a need to upgrade... 6 cores, PCIe 3.0, quad-channel DDR3... Overclocks to 5GHz if I need...
I could upgrade, but I haven't reached a point where it's holding me back yet in gaming.
mode_13h - Saturday, May 11, 2019 - link
But if something wants AVX2, you're SOL.
StevoLincolnite - Saturday, May 11, 2019 - link
Haven't come across it yet. When that day comes... I imagine it will be the same as when I dragged my feet as CPUs with SSE, SSE2, SSE3 and so on came out... I will upgrade when the need arises.
mode_13h - Tuesday, May 14, 2019 - link
I think Oculus requires it, as they were fairly explicit in their platform requirements of >= Haswell, which is the first gen with AVX2.
Danvelopment - Sunday, May 12, 2019 - link
This. They're a dime a dozen because enterprises are dumping them and consumers are too scared to buy them. Mine is 8 cores and 16 threads with quad-channel DDR3.
StevoLincolnite - Friday, May 10, 2019 - link
Running a Sandy Bridge-E setup... So even less of a need to rush out and upgrade... 6 cores, PCIe 3.0, quad-channel DDR3... Overclocks to 5GHz.
Haven't found anything that I can't run yet. Been an amazing rig.
marc1000 - Sunday, May 12, 2019 - link
I'm trying to stay more focused on work and learning this year, so I stopped using my i5-2500K@4GHz and re-activated an old laptop with an i7-2620M (max 3.1GHz), 12GB of RAM and an average SSD.
As today's world is heavily web-based for office-like productivity (basically reading emails, accessing online systems, and creating some documents), I'm actually amazed that this laptop is serving me so well. I use a newer i5-8350U at work, which obviously is faster, but the difference is not that much.
For users that want to stay at the top of the game, upgrading makes sense. For users that just want to use the device, it does not (unless your work actually depends on such performance increases).
soliloquist - Monday, May 13, 2019 - link
Still rockin' a 2500K!
Over the years I have stuffed it full of RAM and SSDs and it still works well for my needs.
AdhesiveTeflon - Monday, May 13, 2019 - link
I still have some CAD users rocking it on a 2600 (non-K) and an SSD just fine too.
I left the PC world when the 2600K was king (and the glorious Q6600 before it) and came back when the i7-6xxx series was mid-life, and man was I disappointed in the lack of performance jumps that we were so accustomed to from the Athlon 64 -> Core 2 Duo/Quad -> i7-2600.
Alperian - Tuesday, May 14, 2019 - link
I'm still running one of these too, and they were great, like my Northwood before it.
I am soon getting a 9900K R0 stepping if I hear good things, and will relegate this PC to a home Ubuntu server.
I do wish I could afford to upgrade more regularly though. 8 years is too many.
Marlin1975 - Friday, May 10, 2019 - link
Still running my 3770 as I have not seen that large a difference to upgrade. But Zen+ had me itching and Zen 2 is what will finally replace my 3770/Z77 system.
That, and it's not just about the CPU but also the upgrades in chipset/USB/etc. parts.
nathanddrews - Friday, May 10, 2019 - link
I originally wanted a 3770K, but missed the window to get a good deal when they were newer. My 3570K+1080Ti still scratches most of my itches, but it's the MMO-style games that really tank my CPU performance and starve my GPU.
olde94 - Friday, May 10, 2019 - link
I had a 2500K and had to admit that VR kept all 4 threads full, so I found a brand new 3770K for $80, which gave me 4 extra threads for the system. This was enough for me to pull off most games with my GTX 970, as I rarely play MMOs... but rendering has me keeping a keen eye on a Threadripper...
philehidiot - Friday, May 10, 2019 - link
Olde94.... I am desperately looking for an excuse to buy a Threadripper. I just can't find one.
I suspect I'm just going to invest the money in a really sweet gun for target shooting instead, but the nerd part of me still wants to cheap out on the gun and get a Threadripper....
nandnandnand - Friday, May 10, 2019 - link
Just get a 12 or 16-core Ryzen in 2+ months.
mode_13h - Saturday, May 11, 2019 - link
If you get a gun, you'll just have to waste more money on ammo.
And the thing about targets is they don't shoot back. So, it gets boring pretty quickly. Paintball is more fun.
Ghodzilla5150 - Saturday, May 11, 2019 - link
I just built an AMD rig with a Ryzen 7 2700X, ASRock X470 Taichi Ultimate, Sapphire Nitro+ RX 590, 32GB of G.SKILL Ripjaws V and 2x 1TB M.2 drives (one for the OS and the other for gaming). Boots to Win 10 Pro in 8 seconds. Blazing fast in games.
I just bought a Smith & Wesson 686 Plus 357 Magnum, so I know what it's like to want a gun as well. I'm looking at getting an LMT Valkyrie 224.
mode_13h - Saturday, May 11, 2019 - link
Get a Ryzen 9 with 16 cores.
MrCommunistGen - Friday, May 10, 2019 - link
I'm in almost exactly the same boat. I have a 3770K on Z77 running at 4.2GHz. That's all that I could get out of my chip without thermal throttling under heavy load with a 212 EVO... already running the lowest possible voltage at which it is stable. I remounted the cooler several times, upgraded the fan, and switched the paste to Thermal Grizzly, but it didn't help enough to get me to 4.3GHz.
I considered throwing a bigger cooler at it but decided to save that money for my next build instead.
I'm running a 1440p 75Hz FreeSync display (only a 48-75Hz range) that I picked up before Vega launched, with the intention of buying Vega when it released -- but I missed buying it at launch, then it was unavailable, then it was expensive, then the crypto boom meant you couldn't get one... so I bought a 1080 Ti instead. Even with the newly added FreeSync compatibility I'm getting a reasonable bit of stutter that frustrates me.
Strongly considering Zen2 when it comes out. I never seriously considered upgrading to anything else so far, not Haswell through KBL due to lack of performance increase for the price, and not CFL or CFL-R due to high cost. 2700X just doesn't quite have enough single-thread performance increase, but based on the swirling rumors I think Zen2 will get there.
Polyclot - Saturday, May 11, 2019 - link
I have a 2600K/Z77-A. I was under the impression that the mobo wouldn't go above 4.2. At least that's where I'm at. Love the combo. No complaints.
CaedenV - Saturday, May 11, 2019 - link
Nope, the cap is for the non-K chips. There you have a 42x multiplier cap with a 100 MHz base clock, so you are limited to 4.2GHz... unless you also change the base clock, but that causes other issues that are not worth the effort to address.
If you have a K chip, the only limits are your RAM and cooling. Almost all Sandy chips can hit 4.5GHz, with a majority capable of going above 4.8!
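The arithmetic behind those caps is just multiplier times base clock; a trivial sketch (the bus-coupling note is the usual explanation for why BCLK changes cause the "other issues" mentioned above):

```python
# Core clock = multiplier x base clock (BCLK). Sandy Bridge derives its
# other buses (DMI, PCIe, SATA) from BCLK too, so raising BCLK quickly
# destabilizes the rest of the platform.
def core_clock_ghz(multiplier: int, bclk_mhz: float = 100.0) -> float:
    return multiplier * bclk_mhz / 1000.0

print(core_clock_ghz(42))        # non-K cap at stock BCLK -> 4.2 GHz
print(core_clock_ghz(42, 103))   # mild BCLK push -> ~4.33 GHz, shaky buses
print(core_clock_ghz(48))        # K-series multiplier headroom -> 4.8 GHz
```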
XXxPro_bowler420xXx - Saturday, May 11, 2019 - link
I have a non-K 3770 running at 4.2GHz all-core, 4.4 single. It's also undervolted to 1.08V and hits a MAX temp of 55-56C after months of use on a Corsair AIO with liquid metal. It usually runs in the high 40s under load. Before de-lidding it, it ran in the high 60s at 4.2GHz on a Corsair air cooler and Arctic MX-4 paste. Why are your temperatures so high?
ASRock Z77 Extreme4 and 16GB of 2133 RAM.
XXxPro_bowler420xXx - Saturday, May 11, 2019 - link
Also, I agree with you on Zen 2. Finally a worthy successor.
CaedenV - Saturday, May 11, 2019 - link
Yep, if I were to upgrade today, it would be an AMD chip. And that is hard to say/admit with all of my inner Intel fanboy.
fangdahai - Friday, May 10, 2019 - link
Same here, 3770. It's still fast enough... at least there's no big difference from the latest Intel CPUs.
Fallen Kell - Saturday, May 11, 2019 - link
Yeah. In many ways it is very sad when you look at this article. It has effectively taken a decade to finally get to the point where there is a worthwhile upgrade in CPU performance. Prior to this, we were seeing CPU performance double every couple of years. A case in point is to look at an article from 2015 that compared CPUs over the previous decade (i.e. ~2005-2015): over that timeframe you saw a 6x increase in memory bandwidth and an 8x-10x CPU computational increase. But looking from 2011 to 2019 we barely see a doubling in performance (and then only in select use cases), while at the same time the price of said CPU is 25% more. It is no wonder people have not been upgrading. Why spend $1000 on a new CPU, motherboard, and RAM to only gain 25-40% performance? We are just finally hitting the point now where people start to consider it worth that price.
That all being said, it would have been nice to have included at least one AMD CPU in these benchmarks for comparison. Sure, we can go to the review bench to get it, but having it here for some easy comparison would have been nice, especially given how Intel seems to have decided to stop innovating and purposely take a dive (almost as if they feared regulatory actions from the USA/EU for effectively being a "monopoly", and to avoid such actions decided to simply stop releasing anything really competitive until AMD was able to get their act together again and have a competitive CPU...).
Zoomer - Thursday, June 13, 2019 - link
Funny thing is, last time it happened, Intel needed AMD to give it a kick in the nuts. Maybe this time too?
mode_13h - Saturday, May 11, 2019 - link
I figured I'd wait for PCIe 4.0 to upgrade. With Zen 2, I guess my chance is here.
Wardrop - Saturday, May 11, 2019 - link
Yep, same. Hoping to replace my 3770k with Zen 2. Looking to down-size my chassis too with a Sliger case. Hopefully Zen 2 doesn't disappoint.
gambiting - Friday, May 10, 2019 - link
Still have a 2600 (not even the K model) running in a living room PC, paired with a GTX 1050 Ti and an SSD - it runs everything without any issues. I've been playing Sekiro and The Division 2 on it without any problems, locked at 1080p@60fps. Progress is all good and fine, but these "old" CPUs have loads of life in them still.
Potatooo - Wednesday, May 15, 2019 - link
Me too. I haven't had enough time for video games the last couple of years to justify the $$$, but putting a 1050 Ti in an old i7-2600 office PC has kept me happy the last 18 months or so (e.g. 55-ish fps in Far Cry 5 ND at medium/1080p, 70+ fps in Forza 7/FH4 at high/1080p). I'm about to try a secondhand RX 580, which will probably be a bridge too far, but at least I'll get FreeSync.
GNUminex_l_cowsay - Friday, May 10, 2019 - link
Dare I look at the Civ 6 benchmarks, knowing they are pointless? What sort of idiot tests CPU performance in Civ 6 using FPS rather than turn times? I don't know who specifically, but they write for AnandTech.
RealBeast - Friday, May 10, 2019 - link
Certainly not a Civ 6 player. ;)
Targon - Monday, May 13, 2019 - link
I made a similar comment. Civ 6 added a new benchmark with Gathering Storm as well, one that is even more resource-intensive. Turn length will show what your CPU can do without GPU issues getting in the way.
Zoomer - Friday, June 14, 2019 - link
The article says that benchmark is being developed.
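For what it's worth, the shape of a turn-time benchmark is simple: time the end-of-turn step directly instead of counting frames. A hypothetical harness (the busywork function merely stands in for a game's end-of-turn AI/simulation pass):

```python
import time
import statistics

def run_turn() -> None:
    """Stand-in for the engine's end-of-turn AI/simulation work."""
    sum(i * i for i in range(500_000))  # CPU-bound busywork

def benchmark_turns(turns: int = 20) -> float:
    samples = []
    for _ in range(turns):
        start = time.perf_counter()
        run_turn()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)  # median resists outlier turns

print(f"Median turn time: {benchmark_turns() * 1000:.1f} ms")
```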
Interesting article! I'm still sitting on an i7-4770 and am debating an upgrade; it would also be interesting to see a Haswell i7 in the mix.
HomerrK - Friday, May 10, 2019 - link
I'm one of those who bought the 2600K back in the day. A few months ago I made the move to the 9900K. Cores and price don't matter so much as feeling it will be a chip that will offer great bang for the buck for years. I think it is the spiritual successor to the 2600K and that it was a mistake to omit it.
RSAUser - Saturday, May 11, 2019 - link
Not even close; it's nearly double the price.
The Ryzen 2700 at $300 would be a way better "successor", as it's within a lot of people's budgets, offers good gaming performance, and with 8 cores is probably going to last quite a while as we move to higher thread counts.
The Ryzen 2 chips moving to 7nm will probably have the largest leap in a while, so whichever one comes in around the $300 mark will probably be the "true" successor of the 2600K.
Targon - Monday, May 13, 2019 - link
The issue that some will have with the 2700X is that the clock speeds are not up there at the 5GHz mark, which is what many Intel systems have been able to hit for over four years now. Third-generation Ryzen should get to the 5GHz mark or possibly beyond, so there wouldn't be any compromises. Remember, extra cores will only result in better performance in some areas, but single-threaded and many older programs benefit more from higher clock speeds (with similar IPC).
Don't get me wrong, I have a Ryzen 7 1800X in this machine and wouldn't step down to a quad-core chip again on the desktop, but I do appreciate that some things just want higher clock speeds. I expect a 40 percent boost in overall performance by switching from this 1800X to the 16-core Ryzen if it hits 5GHz, and that doesn't even count the increase in core count. I may end up paying $600 or more for the CPU though, but that will keep me happy for at least another five years.
crimson117 - Friday, May 10, 2019 - link
Finally retired my i5-2500K last spring for a Ryzen 2700X.
But boy what a good run that CPU had.
jayfang - Friday, May 10, 2019 - link
Likewise only recently "demoted" my i5-2500K - still has tons of grunt as a family PC / HTPC.
gijames1225 - Friday, May 10, 2019 - link
Same boat. I used a 2400K and 2500K for my two main PCs for years and years. Just replaced the 2500K with a Ryzen 5 1600 (they were $80 at Micro Center for some blessed reason). Tripling the thread count has done wonders for my compile times, but it's just amazing how strong and long-lasting the IPC was on the 2nd generation Core i processors.
qap - Friday, May 10, 2019 - link
You've convinced me. Staying with my Sandy Bridge for another year. At 1600p the difference in CPU is not that high (definitely not worth 1000+ USD for a completely new system) and for day-to-day work it is plenty fast. Up to four threads there's very little to gain, and only when more threads are at play is there a large enough difference (the same goes for Ryzen, only there I would gain almost nothing up to four threads).
Perhaps Zen 2 will change that, or maybe 10nm CPUs from Intel, when they finally arrive with a new CPU architecture and not a rehash of the 4-year-old Skylake.
RSAUser - Saturday, May 11, 2019 - link
10nm Intel desktop is earliest 2021 or so, probably; I wouldn't bother holding out for that.
PeachNCream - Friday, May 10, 2019 - link
The only reason why I upgraded from a Sandy Bridge laptop to a Haswell-U laptop was that it was $30 cheaper to get a refurb PC with Windows 10 preloaded than it was to just buy a Windows 10 Pro license for my Sandy Bridge system so I could finally get off Windows 7. Oddly enough, I spend more time on my Sandy Bridge laptop after moving it to Linux Mint than I do on the newer Windows 10 laptop. The Haswell-U is simply here for a handful of things that I can't do in Linux, which are mainly a few games lacking a Linux version that are iffy or uncooperative in WINE. It really had nothing at all to do with a lack of compute power and more to do with EOL on 7. I'd argue that these days, pretty much any Sandy or newer system is adequate from a compute power perspective for most mundane chores and a number of heavy-lift tasks.
29a - Friday, May 10, 2019 - link
You can buy a Win Pro license for about $7, I've done it multiple times.
MDD1963 - Saturday, May 11, 2019 - link
sounds real 'legit', does it not?
29a - Monday, May 13, 2019 - link
They're legit, they activate. They just come from the grey market.
BushLin - Saturday, May 11, 2019 - link
You can still take the free upgrade from Win 7 to Windows 10, Microsoft never stopped this from working. Do one upgrade the dirty way, get activated and future clean installs will activate too.
Targon - Monday, May 13, 2019 - link
You could have thrown a Windows 10 flash drive in there and upgraded your Windows 7 to 10 for free.
Irata - Friday, May 10, 2019 - link
Thanks for the article - it is really interesting.
I think it shows very well why the PC market was stagnant for a long time. Depending on one's use case, the only upgrade that seems worthwhile is going from the top early-2011 4C CPU to the top late-2018 8C consumer CPU.
I would love to see a similar article comparing the top of the line GPU with the 2600k in this time frame to see what performance difference a GPU upgrade made and contrast this with a CPU upgrade.
siberian3 - Friday, May 10, 2019 - link
I am running a 2600K at stock with 16GB of DDR3-1333 RAM and don't plan to upgrade until motherboards with DDR5 and PCI Express 4 arrive. I only play at 1080p anyway, so that's enough for me, I guess.
29a - Friday, May 10, 2019 - link
Thank you, thank you, thank you. I've been wanting to read something like this for a while.
djayjp - Friday, May 10, 2019 - link
Hey, I know! Let's benchmark a CPU at 4K+ using a mid-range GPU! Brilliant....
Ian Cutress - Friday, May 10, 2019 - link
Guess what, there are gaming benchmarks at a wide range of resolutions!
eva02langley - Friday, May 10, 2019 - link
I am not sure what the goal of this is. Is it to say that Sandy Bridge is still relevant, that Intel's IPC is bad, or that game developers are lazy?
One thing is for sure, it is time to move on from GTA V. You cannot get anything from those numbers.
Time to have games that are from 2018 and 2019 only. You cannot just bench old games so your database can be built upon. It doesn't represent the consumer reality.
BushLin - Saturday, May 11, 2019 - link
Yeah, why benchmark a game where the results can be compared against all GPUs and CPUs from the last decade? </s>
StevoLincolnite - Sunday, May 12, 2019 - link
GTA 5 is still demanding.
Millions of gamers still play GTA 5.
It is one of the most popular games of all time.
Ergo... It is entirely relevant having GTA 5 benchies.
djayjp - Friday, May 10, 2019 - link
Then the GPU is still totally relevant.
MDD1963 - Saturday, May 11, 2019 - link
Of course it is...; no one plays at 720P anymore....
PeachNCream - Sunday, May 12, 2019 - link
I'd argue that hardly anyone ever played PC games at that resolution. 720p is 1280x720. Computer screens went from 4:3 resolutions to 16:10, and when that was the case, the lower-resolution panels were most commonly 1280x800. When 16:9 ended up taking over, the most common lower resolution was 1366x768. Very few PC monitors ever actually hit 720p. Even most of the cheap low-res TVs out there were 1366 or 1360x768.
Zoomer - Friday, June 14, 2019 - link
Doesn't matter, the performance will be similar.
fep_coder - Friday, May 10, 2019 - link
My threshold for a CPU upgrade has always been a 2x performance increase. It's sad that it took this many generations of CPUs to get near that point. Almost all of the systems in my upgrade chain (friends and family) are Sandy Bridge based. I guess that it's finally time to start spending money again.
kgardas - Friday, May 10, 2019 - link
Indeed, it's sad that it took ~8 years to roughly double performance, while in the '90s we got that every 2-3 years. And look at the office tests: we're not there yet, and we probably never will be, as single-thread perf increases are basically dead. The Chromium compile suggests an upgrade makes sense at all only for developers; for office users it's nonsense if you consider just the CPU itself.
Thanks for the article, Ian. I like your summation: impressive and depressing.
I'll be waiting to see what Zen 2 offers before upgrading my 2500K.
AshlayW - Friday, May 10, 2019 - link
Such great innovation and progress and cost-effectiveness advances from Intel between 2011 and 2017. /s
Yes, AMD didn't do much here either, but it wasn't for lack of trying. Intel deliberately stagnated the market to bleed consumers of every single cent, and then Ryzen turns up and you get the 6- and now 8-core mainstream CPUs.
I would have liked to see the 2600K versus Ryzen, honestly. Ryzen 1st gen is around Ivy/Haswell performance per core in most games, and second gen is Haswell/Broadwell. But as more games get more threaded, Ryzen's advantage will keep increasing.
I owned a 2600K and it was the last product from Intel that I truly felt was worth its price. Even now I just can't justify spending £350-400 on a hexa-core, or an octa-core with HT disabled, when the competition has 16 unlocked threads for less money.
29a - Friday, May 10, 2019 - link
"Yes AMD didn't do much here either"I really don't understand that statement at all.
thesavvymage - Friday, May 10, 2019 - link
They're saying AMD didn't do much to push the price/performance envelope between 2011 and 2017. Which they didn't, since their architecture until Zen was terrible.
eva02langley - Friday, May 10, 2019 - link
Yeah, you are right... it is AMD's fault and not Intel's, who wanted to make a dime on your back by selling you quad cores for life.
wilsonkf - Friday, May 10, 2019 - link
It would be more interesting to add the 8150/8350 to the benchmark. I ran my 8350 at 4.7GHz for five years. It's a great room heater.
MDD1963 - Saturday, May 11, 2019 - link
I don't think AMD would have sold as many of the 8350s and 9590s as they did had people known that i3s and i5s outperformed them in pretty much all games, and at lower clock speeds, no less. Many people probably bought the FX-8350 because it 'sounded faster' at 4.7GHz than the 2600K did at 'only' 3.8GHz, or so I speculate, anyway... (sort of like the Florida Broward County votes in 2000!)
Not everyone looks at games as the primary use of a computer. The AMD FX chips were not great when it came to IPC, in the same way that the Pentium 4 was terrible on an IPC basis. Still, the 8350 was a lot faster than the Phenom II processors, that's for sure.
artk2219 - Wednesday, May 15, 2019 - link
I got my FX-8320 because I preferred threads over single-core performance. I was much more likely to notice a lack of computing resources and multitasking ability than how long something took to open or run. The funny part is that even though people shit all over them, they were, and honestly still are, valid chips for certain use cases. They'll still game, they can be small cheap vhosts, NAS servers, you name it. The biggest problem recently is finding a decent AM3+ board to put them in.
cwolf78 - Friday, May 10, 2019 - link
Is there any way you can do a similar comparison with the i5 CPUs? I have a 3570K OC'd to 4.2GHz and it's starting to struggle in some games. E.g., I can get over 60 fps in AC Odyssey for the most part, but there are all sorts of annoying spikes where the min FPS will tank for whatever reason. I'm running a GTX 970 that's OC'd pretty close to a 980, and I don't know if it would be worth upgrading that or if my CPU would strangle anything faster. Also, what's the performance difference between an OC'd 3570K and an OC'd 3770K in modern games?
RSAUser - Saturday, May 11, 2019 - link
This is mostly due to it being 4 threads; that's also why I wouldn't go with anything under 8 threads, as you'll see it happen more and more as we all move to higher core counts.
Plus, Ubisoft probably has the buggiest/worst-optimized games; the last one I can think of that was all right was Black Flag, mostly because they didn't change the engine and just changed the storyline/map.
uibo - Friday, May 10, 2019 - link
At what voltage did you run the 2600k?
abufrejoval - Friday, May 10, 2019 - link
I owned pretty much every iteration of Intel and AMD since the 80286. I pushed them all on relatives and friends to make space for the next iteration.
But everything since Sandy Bridge stuck around, both because there was no reason to move them out and I had kids to serve. Mine was a 2600 no-K, because I actually wanted to test VT-d and for that you needed to use a Q-chipset and -K was not supported.
Still drives the gaming rig of one of my sons, while another has the Ivy Bridge (K this time, but not delivering beyond 4 GHz). Got Haswell Xeons, 4- and 18-core, a Broadwell as an 8-core Xeon-D, Skylake in notebooks, a Kaby Lake i7-7700K in workstations and an i7-7700T in a pfSense box.
Those newer i7s were really just replacing AMDs and Core 2 systems being phased out over time, not because I was hoping for extra performance: AT made it very clear for years that that simply won't happen anymore with silicon physics.
What I really wanted from Intel (more cores instead of a useless iGPU, more PCIe lanes, more memory channels) I eventually got from the E5-2696 v3 I scored for less than $700 on eBay.
Zen simply came a little too late; a couple of Phenom II X4/X6es and three generations of APUs taught me not to expect great performance or efficiency from AMD, but at least they were budget parts and had become reliable (unlike the K2-K3+s).
With the family all settled and plenty of systems in all sizes and shapes, the only reason to buy a CPU any time soon would be to replace failed parts. And fail they just don't, at least not the CPUs.
And then I must have 100GB or so of DDR3, which I really don't want to buy again as DDR4 or 5. DDR3-2400 is really just fine with Kaby Lake.
I overclocked a bit here and there, mostly out of curiosity. But I got bitten far too often by reliability issues when I was actually working on the machines and not playing around, so I have kept them very close to stock for years now. And then it's simply not worth the trouble, because the GPU/SSD/RAM is far more important or nothing will help anyway (Windows updates…).
Nice write-up, Ian, much appreciated and not just because it confirms my own impressions.
WasHopingForAnHonestReview - Friday, May 10, 2019 - link
Nice reply. Thanks. My 2600k is just cranking along as my darknet browsing machine.
RSAUser - Saturday, May 11, 2019 - link
The Zen chips actually have pretty good efficiency; I was expecting way worse before they came out, since AMD hadn't been competitive in years. Zen 2 will be quite interesting, mostly due to the node shrink hopefully bringing way lower power envelopes and maybe cheaper CPUs, since we all need those savings for the mess that the GPU market has become.
Targon - Tuesday, May 14, 2019 - link
Don't discount the significant IPC improvements that are expected from the third-generation Ryzen processors (not the APUs, which are Zen+ based from what I have read).
evilspoons - Friday, May 10, 2019 - link
Still have a 2600k at 4.6 GHz with proper turbo support (slows down when idle). Went from GTX 680s in SLI to a single GTX 1080 and it plays most games just fine.
That being said I'd love to throw in a Ryzen 7 2700X but only if one of you pays for it... 😁
rocky12345 - Friday, May 10, 2019 - link
Nice flashback review, thank you. I am still on an i7 2600K@5.1GHz with 32GB of DDR3@2400MHz at very tight timings. It took a while to dial in the memory, since Sandy does not really support this speed as gracefully as its newer brothers & sisters do. I have 2 Samsung 512GB SSDs in RAID 0, so it's plenty fast for the Windows drive and some installed games, as well as 2 4TB 7200RPM hard drives.
I think some of the issues you were having with the 4.7GHz OC were probably due to either the memory not being 100% stable or the CPU being right at the edge of stability because it wanted just a tad more voltage. On my system I had random problems when it was new due to memory timings and finding just the right voltage for the CPU. After getting all of that dialed in, my system is pretty much 100% stable at 5.1GHz with DDR3@2400MHz and has been running this way since 2011.
So going by these charts for the gaming results, mine at 5.1GHz would place my system faster than a stock i7 7700K, and a slightly overclocked one as well. Though I am 100% sure a fully overclocked i7 7700K would get better FPS, since its IPC is what, 10%-12% better than Sandy's clock for clock, and if you then throw in AVX2, my Sandy would get hammered.
I am going to be upgrading my system this summer, not because I feel my system is slow but more because I know that, given its age, something could fail, such as the main board or CPU, and it would be costly to try to replace either of those, so it's time for the big upgrade soon. I will probably move this system to secondary duties and keep it as a backup gaming system, or have it there for my friends to use when we get together for a gaming session. I have not fully decided which way to go, but I am leaning towards AMD with Zen 2 and at least an 8/16 CPU, and maybe a 12/24 CPU if they release more than 8 cores on mainstream desktops.
isthisavailable - Friday, May 10, 2019 - link
Still running an i5 3450. Runs fine and maintains 60 FPS 95% of the time.
XXxPro_bowler420xXx - Saturday, May 11, 2019 - link
I am running a 3570 as my computer here at school, with a $50 1050 Ti and 16GB of RAM.
godrilla - Friday, May 10, 2019 - link
I would love to see a 6-core i7 980XE overclocked to 4.3GHz with 12GB of 2GHz triple-channel RAM vs all these quad cores. < my rig. Playing all games at max settings; for example, Shadow of the Tomb Raider at max settings at 3440x1440 getting 60fps (G-Sync helps with frame variance smoothness), and Metro Exodus at extreme settings plus tessellation, PhysX and HairWorks getting an average of 60fps at the same resolution with a 1080 Ti FTW3.
Ratman6161 - Friday, May 10, 2019 - link
"there is only one or two reasons to stick to that old system, even when overclocked. The obvious reason is cost"I have to disagree with that statement. My reason for my trusty 2600K still running is that its a wonderful "hand-me-down" system. I was running my 2600K as my primary system right up until I went Ryzen. At that point, my old system became my wife's new system. I toned down the overclock to 4.2 Ghz so I could slap a cheap but quiet cooler on it and for her uses (MS Office, email, web browsing, etc) it is a great system and plenty fast enough. My old Samsung 850 EVO SDD went along with it since in my newer system I've got a 960 EVO, but other than gaining that SSD along the way, its had no significant upgrades since 2011.
For someone who could easily get by on something like an i3-8100 or i5-7xxx, the 2600K hand-me-down is a great option.
WJMazepas - Friday, May 10, 2019 - link
My main PC still has an i5-760, so I believe it's time to upgrade.
xrror - Friday, May 10, 2019 - link
lol indeed!
HStewart - Friday, May 10, 2019 - link
Personally I have not owned or cared for a desktop since my dual Xeon 5150; it's 12 years old, and for a while, until the later i7s came out, it was the fastest machine around. Back then I was into 3D rendering and even built a render farm - I was also seriously into games, with the latest NVIDIA graphics cards.
But since then I went mobile, with less graphics, and I try to play fewer games, though I still like to get out Command & Conquer and Company of Heroes - never much a first-person shooter guy. So for me a higher-end laptop does fine. For the longest time the Lenovo Y50 was good, but Lenovo for me had build issues... When the Dell XPS 13 2-in-1 came out it was great for some things; portability was great, and I still use it because it's nice to travel with documents and such. But I wanted a faster machine, so when the Dell XPS 15 2-in-1 was announced, I jumped on the bandwagon, almost fully loaded. The 4K screen is probably a waste on it because I am getting older; graphics is slightly better than the 3-year-old Y50, but the CPU is vastly faster than the Lenovo's. Some older games have trouble with the GPU, and professional graphics apps like Vue 2016 have trouble with the GPU too.
But I will be 60 in a couple of years and need to grow out of games.
I think my next computer is going to be something different. I want a portable, always-online cellular device. I thought about an iPad with cellular, but I think I am going to wait for a Lakefield device: a small device with long battery life, always connected. My experience with iOS and Android over time is always the same thing - great when first started out, but later the battery and performance drop with OS upgrades - which, if you think about it, is no different than with Windows. And even though I am a technical person, I was never a Linux person; it just does not fit me, even when I try it.
eva02langley - Friday, May 10, 2019 - link
GTA V is 5 years old... your game suite is horrible. At this point, I would just do a 3DMark benchmark.
Qasar - Saturday, May 11, 2019 - link
eva02... the games they test... I don't even play them.....
eastcoast_pete - Friday, May 10, 2019 - link
Thanks Ian! The most disappointing aspect of the newer Intel i7s vs. Sandy Bridge is the underwhelming progress on performance/Wh. How much more efficiency did the multiple changes in manufacturing and design really gain? Judging by the numbers, not that much. The amazing thing about Sandy Bridge was that it did boost performance, and did so at significantly improved perf/Wh. At this moment, we seem to be back to Athlon vs. P4 days: the progress is most noticeable with the chips that say "AMD" on them.
Qwertilot - Friday, May 10, 2019 - link
In general, I think they did gain a lot of perf/Wh. Just not at the very top end. They've been pushing the clocks on the recent i7's incredibly hard.
HStewart - Friday, May 10, 2019 - link
I think one needs to look at more than just the basic benchmarks, and especially at multithreading. Single-thread performance has almost tripled in new machines, also with AVX. I think it would be nice to see what the quad-core CPUs, Sandy Bridge and newer, do without hyperthreading; it would be nice to see the effects of hyperthreading on and off in different benchmarks.
RSAUser - Saturday, May 11, 2019 - link
How many AVX2 workloads do you have? Adobe's suite has AVX, FF as well; past that I can't think of anything that needs AVX2 support where it would be noticeable in my day-to-day stuff. Pretty much nothing is interdependent in games, and even in those cases where it is, it's not worth the effort of implementation for a tiny gain.
AVX512 is pretty much in the ML space; I wouldn't be running most of that stuff on my home machine.
HStewart - Friday, May 10, 2019 - link
"we seem to be back to Athlon vs. P4 days"This time instead of just adding frequency - they are adding cores instead of architexture on both sides currently. Maybe 2nd half will be different.
Targon - Tuesday, May 14, 2019 - link
AMD isn't sitting still, and IPC improvements from Ryzen 3rd generation are expected to be in the 13-15 percent range compared to the second generation. Clock speeds are also expected to be significantly higher, though a lot of Intel fans seem to really be pushing that AMD won't have faster than a 4.7GHz clock speed from the new generation. That IPC improvement is all about architecture improvements; the clock speed is from the fab process jump.
IVIauricius - Friday, May 10, 2019 - link
I've got mine paired with an RX Vega 56 in a Hackintosh. Still gets it done when compiling games for iOS. I had to move to more cores on my main PC, though. Thank you Amazon for that $250 1920X when Threadripper 2 dropped last year! :)
BunnyRabbit - Friday, May 10, 2019 - link
Keeping my 3770 until probably next year, when there is DDR5 + PCI Express 4.0/5 + USB 4 support.
znd125 - Friday, May 10, 2019 - link
Great article. Conclusions are clear and fair.
Khenglish - Friday, May 10, 2019 - link
I'm using a 3920XM at 4.4GHz, which is the mobile equivalent of the 3770K. This review just reaffirms that for 4K there is no benefit to something new.
With how much the 9700K leads the 7700K at lower resolutions, though, this makes me think that old quads without HT are really suffering now. I am curious how the 2500K and 3570K are faring. Probably not well.
poohbear - Friday, May 10, 2019 - link
I upgraded from a 2500K (also a legend!) to a 4790K; it was an OK upgrade, and I said next time I'm going to upgrade only when it's 8 cores. So, I guess that time has come, but I'm waiting for 10nm. So... from what I'm reading about 10nm for desktops, I'll be waiting until 2021....
utmode - Friday, May 10, 2019 - link
Why not AMD 7nm? Unless you are into tribalism.
xrror - Friday, May 10, 2019 - link
I'd wait and see how Zen 2 clocks later this year if you're itching to upgrade. If they can manage some 12- and 16-thread parts with base speeds over 4.5GHz around the $300 mark, it's going to get rather interesting ;)
Mind you, I don't honestly know if that will happen. If it's more "4GHz base, 5.xGHz turbo" then that's... process node disappointment again. More notebook chips that can't crunch full thread loads (again).
Worst case though, the 4790K you have now isn't a bad chip at all to be "stuck" with. Unless you're paranoid about power, overclock that to 4.5GHz and you'll probably be good to wait another year or two if Zen 2 doesn't end up being compelling to you.
Targon - Tuesday, May 14, 2019 - link
What's the base speed on Intel chips? What do you see for the base speed on second generation Ryzen chips? If AMD has a BASE speed of 4.0GHz with boost to 4.8-5.0GHz, that is going to be a lot better than anything Intel has on the desktop.
gglaw - Friday, May 10, 2019 - link
Very fun article to read, since most of us here very likely went through a 2600K phase or still have one. But why did you not include super-popular newer games that are known to scale better with memory bandwidth, thread count, etc.? Overwatch and BF5, to name a couple. OW especially has gotten very big in the competitive scene, there is so much out there on every type of setting and scaling to compare, and it scales almost infinitely with memory bandwidth, so just the change from DDR3 to DDR4 is drastic.
TelstarTOS - Friday, May 10, 2019 - link
The 8700K and 9900K are missing from the comparison. Anyway, I did upgrade from my 2600K@4.8 to a 9900K@5.2. Doubling cores AND threads makes a huge difference. The SB platform is really old.
mode_13h - Friday, May 10, 2019 - link
Thanks for this. It's relevant to my interests.
Awful - Friday, May 10, 2019 - link
Great article. Still running a 2500K @ 4.8GHz - talk about good value! Holding out for Zen 2 / Ryzen 3000 to replace it with what will hopefully be another long-lasting winner...
Joshthornton - Friday, May 10, 2019 - link
"Intel also launched its first overclockable dual core with hyperthreading, the Core i3-7350K". If I remember correctly , the 655k was multiplier unlocked and the entire westmere line was bclk overclockable making this statement not quite true. It should say, "their first overclockable dual core with hyperthreading in almost 9 years", or "the first modern dual core with hyperthreading that is truly overclockable/unlocked."xrror - Friday, May 10, 2019 - link
I had to look that up; the 655K keeps showing up as an i5 though?
Rajinder Gill's AnandTech article:
https://www.anandtech.com/show/3742/intels-core-i5...
Joshthornton - Saturday, May 11, 2019 - link
Yep, but it's also a dual core w/HT. The first-gen Core i series was a bit different. The 750 is when you stepped up to 4 cores.
Peskarik - Saturday, May 11, 2019 - link
Reading this article on a 2600K. Ha!
MDD1963 - Saturday, May 11, 2019 - link
My 7700K (turbos to 4.7 GHz all-core in Balanced mode) will likely stand pat for quite a while...!
Targon - Tuesday, May 14, 2019 - link
Quad-core will seem weak in another two years or so.
zodiacfml - Saturday, May 11, 2019 - link
Higher core counts are long overdue. I thought the i3-8100 I bought at the end of 2017 was decent. Turns out, it is entry level for doing anything creative on a PC. A few months later, AMD arrived with the higher-core-count Ryzens.
I hope to get an 8-core this year or next.
yankeeDDL - Saturday, May 11, 2019 - link
Great article. Would love to see more of this kind. I commented along this line on some previous articles. Not everybody upgrades from one gen to the next (in fact, who does?), so incremental reviews are useful only from a technical perspective (which is already quite a bit), but somebody with a 4-5 year old PC would struggle to find a reference.
Personally, I would have loved to see Ryzen 2*** added to the picture (not sure if there's something that costs as much as the i7-9700K, but a 2800 seems relevant).
Thanks again.
trentbg - Saturday, May 11, 2019 - link
Still running an i7-2700K on my main PC and not looking to replace it any time soon; at 4.7GHz most games run over 100fps. My other machine is still on a Xeon W3690, OC'd to a comfortable 4GHz on all 6 cores, and again it is crushing every game, so why upgrade?
teamet - Saturday, May 11, 2019 - link
For anyone still using a Core i7-2600K for CPU testing, even when overclocked, it's time to feel the benefits of an upgrade
RSAUser - Saturday, May 11, 2019 - link
For most the benefits are minimal, so my reply is, "why"?
Icehawk - Saturday, May 11, 2019 - link
My OC'd 3770K is still going strong; I moved it to my wife and I run an 8700K now, but I only upgraded because her 3rd-gen i5 died, not because I felt a strong performance need. I'm primarily a gamer. Where I did see an uplift (and needed it desperately) is in my HEVC transcoding.
Sahrin - Saturday, May 11, 2019 - link
Why on Earth would you post an upgrade comparison of only Intel CPUs?
BushLin - Saturday, May 11, 2019 - link
Because during that period AMD's products were trash?
StrangerGuy - Saturday, May 11, 2019 - link
One thing I want to point out is that modern games are far less demanding relative to the CPU versus games in the 90s. If anyone thinks their 8-year-old Sandy Bridge quad is having it sort of rough today, they are probably not around to remember that running Half-Life comfortably above 60 FPS needed a CPU released at least 2 years after the game.
versesuvius - Saturday, May 11, 2019 - link
There is a point in every Windows user's computing life when they start playing fewer and fewer games, and at about the same time start forgoing CPU upgrades. They keep adding RAM and hard disk space, and maybe a new graphics card after a couple of years. The only reason such a person, who by now has completely stopped playing games, may upgrade to a new CPU and motherboard is the maximum amount of RAM that can be installed on their motherboard. And with that comes the final PC that such a person may have for a long, long time. Kids get the latest CPU and will soon recognize the law of diminishing returns, which by now is gradually approaching "no return", much faster than their parents did. So, in perhaps ten years there will be no more "Tick" or "Tock" or cadence or Moore's law. There will be computers, barring the possibility that dumb terminals have replaced PCs, that everybody knows what to expect from. No serendipity there, for certain.
Targon - Tuesday, May 14, 2019 - link
The fact that you don't see really interesting games showing up all that often is why many people stopped playing games in the first place. Many people enjoyed the old adventure games with puzzles, and while action appeals to younger players, being more strategic and needing to come up with different approaches in how you play has largely died. Interplay is gone, Bullfrog, Lionhead... On occasion something will come out, but few and far between. Games for adults (and not just adult-age children who want to play soldier on the computer) are not all that common. I blame EA for much of the decline in the industry.
skirmash - Saturday, May 11, 2019 - link
I still have an i7-2600 in an old Dell based on an H67 chipset. I was thinking about updating the board to get updated connectivity and using it as a server. A Z77 chipset would seem to be the way to go, although getting a new board with this chipset seems expensive unless I go used. Anyone any thoughts on this - whether it's worthwhile, or a cost-effective way to do it?
skirmash - Saturday, May 11, 2019 - link
Sorry for the typos but I hope you get the sentiment.
Tunnah - Saturday, May 11, 2019 - link
Oh wow, this is insane timing. I'm actually upgrading from one of these and have had a hard time figuring out what sort of performance upgrade I'd be getting. Much appreciated!
Tunnah - Saturday, May 11, 2019 - link
I feel like I can chip in a perspective re: gaming. While your benchmarks show solid average FPS and all that, they don't show the quality of life that you lose by having an underpowered CPU. I game at 4K, 2700K (4.6GHz for heat & noise reasons), 1080 Ti, and regularly can't get 60fps no matter the settings, or have constant frame blips and dips. This is in comparison to a friend who has the same card but a Ryzen 1700X.
Newer games like The Division 2, Assassin's Creed Odyssey, and, as shown here, Shadow of the Tomb Raider all severely limit your performance if you have an older CPU, to the point where getting a constant 60fps is a real struggle, and benchmarks aside, that's the only benchmark the average user is aiming for.
I also have 1333MHz RAM, which is just a whole other pain! As more and more games move toward giant open worlds, with texture streaming and loading happening in-game rather than on loading screens, having slow RAM really affects your enjoyment.
I'm incredibly grateful for this piece btw. I'm actually moving to Zen 2 when it comes out, and I gotta say, I've not been this excited since... well, Sandy Bridge.
Death666Angel - Saturday, May 11, 2019 - link
"I don’t think I purchased a monitor bigger than 1080p until 2012."Wow, really? So you were a CRT guy before that? How could you work on those low res screens all the time?! :D I got myself a 1200p 24" monitor once they became affordable in early 2008 (W2408hH). Had a 1280x1024 19" before that and it was night and day, sooo much better.
PeachNCream - Sunday, May 12, 2019 - link
Still running 1366x768 on my two non-Windows laptops (HP Stream 11 and Dell Latitude E6320) and it's okay. My latest, far less used Windows gaming system has a 14-inch panel running 1600x900. It's a slight improvement, but I could live without it. The old Latitude does all my video production work, so though I could use a few more pixels, it isn't the end of the world as is. The laptop my office issued is an HP ProBook 640 G3, so it has a 14-inch 1080p panel which I have to scale at 125% to actually use, so the resolution is pretty much pointless.
PeachNCream - Sunday, May 12, 2019 - link
Ugh, phone autocorrect... I really need to look over anything I type on a phone more closely. I feel like I'm reading a comment by a non-native English speaker, but it's me. How depressing.
Death666Angel - Sunday, May 12, 2019 - link
I've done some horrendous posts when I used my phone to make a comment somewhere. Mostly because my phone is trained to my German texting habits and not my English commenting habits. And trying to mix them leads to subpar results in both areas, so I mostly stick to using my phone for texting and my PC and laptop for commenting. But sometimes I have to write something via my phone and it makes a beautiful mess if I'm not careful.
Death666Angel - Sunday, May 12, 2019 - link
Well, laptops and desktops (with monitors) are in a different category anyway, at least that's how I see it. :-)
I work with a 13.3" laptop with a 1440p resolution and 150% scaling. It's not fun, but it does the job. The advantage of the larger screen real estate with a 15" or 17" laptop is outweighed by the size and weight increase. I've also done work on 1024x768 monitors and it does the job in a pinch. But I've tried to upgrade as soon as the new technology was established, cheap and good enough to make it worth it without having to pay the early adopter fee or fiddle around to get it to work. Even before Win7 made it a breeze to have multiple windows in an orderly grid, I took full advantage of a multi-window and multi-program workflow for research, paper/presentation writing, editing and media consumption. So it is a bit surprising to see someone like Ian, a tech enthusiast with a university doctorate, be so late to great tech that can really make life easier. :D
Showtime - Saturday, May 11, 2019 - link
Great article. Was hoping to see all the CPUs tested (my 4770K), but I think it shows enough. This isn't the 1st article showing that lesser CPUs can run close to the best CPUs when it comes to 4K gaming. Does that look to change any time soon? I was thinking I should upgrade this year, but would like to know if I should be shooting for an 8-core, or if a 6 will be a decent enough upgrade.
Consoles run slower 8-core procs that are utilized more efficiently. At some point won't PC games do the same?
Targon - Tuesday, May 14, 2019 - link
There is always the question about what you do on your computer, but I wouldn't go less than 8 cores (since 4-core has become the base on the desktop, and even laptops should never be sold with only 2 cores IMO). If you look at the history, when AMD wasn't competitive and Intel stopped trying to actually innovate, quad-core was all you saw on the desktop, so game developers didn't see a reason to support more threads (even though it would have made sense). Once Ryzen came out with 8 cores, and Intel finally responded, you have to expect that every game developer will design with the potential that players will have 8+ core processors, so why not design with that in mind?
Remember, a program that is properly multi-threaded in design will work on lower-core processors, but will scale up well when processors with more cores are being used. So going forward, quad-core would work, but 8 or more threads WILL feel a lot better, even for overall use.
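A quick sketch of that scaling point (illustrative only, not anyone's actual engine code): a worker pool sized from std::thread::hardware_concurrency() splits the same job across however many cores the machine offers, so one binary uses 4 threads on a 2600K-class quad and 8 on an 8-core part.

    // Minimal sketch: size the worker pool to whatever the CPU offers,
    // so the same binary scales from 4 threads to 8+ without changes.
    #include <algorithm>
    #include <cstdio>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        unsigned n = std::max(1u, std::thread::hardware_concurrency());
        std::vector<double> data(10000000, 1.0);
        std::vector<double> partial(n, 0.0);
        std::vector<std::thread> pool;
        size_t chunk = data.size() / n;
        for (unsigned i = 0; i < n; ++i) {
            size_t lo = i * chunk;
            size_t hi = (i + 1 == n) ? data.size() : lo + chunk;
            pool.emplace_back([&, i, lo, hi] {
                // each worker sums its own slice; no shared writes
                partial[i] = std::accumulate(data.begin() + lo, data.begin() + hi, 0.0);
            });
        }
        for (auto& t : pool) t.join();
        std::printf("%u threads, sum = %.0f\n", n,
                    std::accumulate(partial.begin(), partial.end(), 0.0));
    }

The scheduling here is naive (equal static chunks), but it shows why properly threaded software "scales up" for free as core counts rise.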
CaedenV - Saturday, May 11, 2019 - link
This was a fascinating article! And what I am seeing in the real world seems to reflect this.
For the most part, the IPC for general use has improved, but not by a whole lot. But if you're doing anything that hits the on-chip GPU, or requires any kind of decrypt/encrypt, then the dedicated hardware in newer chips really makes a big difference.
But at the end of the day, in real-world scenarios, the CPU is simply not the bottleneck for most people. I do a lot of video ripping (all legally purchased, and only for personal use), and the bottleneck is squarely on the Blu-ray drive. I recently upgraded from a 4x to a 10x drive, and the performance bump was exactly what was expected. Getting a faster CPU or GPU will not help there.
I do a bit of video editing, and the bottleneck there is still almost always in storage: the 1 Gb/s connection to the NAS, and the 1 GB/s connection to my RAID0 of SSDs (quick unit check on those numbers after this comment).
I do a bit of gaming at 4k, and again the bottleneck there is squarely on the GPU (GTX1080), and as your tests show, at lower resolution my chip will be slower than a new chip... but still faster than the 60-120fps refresh of the monitor.
The real reason for an upgrade simply isn't the CPU for most people. The upgrade is the chipset. Faster/more RAM, M.2 SSDs, more available throughput for expansion cards, faster USB/USB-C ports, and soon(ish) 10gig Ethernet. These are the things that make life better for the enthusiast and the normal user; and the newer CPUs are simply more capable of taking advantage of all the extra throughput, where Sandy Bridge would perhaps choke when dealing with these newer and faster interfaces that are not available to it.
All that said; I am still not convinced to upgrade. Every previous computer was simply broken, or could not do something after 2-3 years, so an upgrade was literally necessary. But now... my computer is some 8 years old now, and I am amazed at the fact that it still does it all, and does it relatively quickly. Without it being 'broken' it is hard to justify dropping $1000+ into a new build. I mean... I want to upgrade. But I also want to do some house projects, and replace a car, and do stuff with the kids... *sigh* priorities. Part of me wishes that it would break to give me proper motivation to replace it.
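The unit check promised above: the two storage figures look similar but differ by about 8x, because network links are rated in bits and storage in bytes:

    1 Gb/s ÷ 8 bits/byte = 125 MB/s (NAS link)   vs.   1 GB/s = 1000 MB/s (local SSD RAID)

which is why the network hop, not the CPU, dominates any editing workflow that touches the NAS.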
webdoctors - Saturday, May 11, 2019 - link
Great timing. I've been using the same chip for 7 or 8 years now and never felt the need to upgrade until this year, but I will upgrade at the end of this year. DDR4 finally dropped in price, and my GTX 1070 Ti I think is getting throttled when the CPU ain't overclocked.
atomicWAR - Saturday, May 11, 2019 - link
Gaming at 4K with an i7-3930K @ 4.2GHz (4.6GHz capable when needed) with 2 GTX 1080s... I was planning a new build this year, but after reading this I may hold off even longer.
wrkingclass_hero - Sunday, May 12, 2019 - link
I've got a 3930K as well. I was planning on upgrading to Threadripper 3 when that comes out, but if it gets delayed I may wait a bit longer for a 5nm Threadripper.
mofongo7481 - Saturday, May 11, 2019 - link
I'm still using a Sandy Bridge i5-2400 overclocked to 3.6GHz. Still playing modern stuff @ 1080p and it's pretty enjoyable.
Danvelopment - Sunday, May 12, 2019 - link
I think the conclusion is slightly off for gaming. From what I could see, it's not that the newer processors were only better at higher resolutions; it's that the newer systems were better able to keep the GPU fed with data, resulting in a higher maximum frame rate.
So at lower resolutions/quality settings, when the GPUs could let loose, they could achieve much higher FPS.
My conclusion from the results wouldn't be to keep it for higher res gaming, but to keep it for gaming if you're still using a 60Hz display (which I am). I bet if you tuned quality settings for all of the GPUs to run at 60 FPS your results would sit pretty close at any resolution.
I'm currently running an E5-2670 for my gaming machine with quad channel DDR3 (4x8GB) and a 1070. That's the budget upgrade path I'd probably recommend at 60Hz.
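A toy model of that "keeping the GPU fed" point (the per-frame costs below are invented numbers, purely for illustration): the frame rate is capped by the slower of the two stages, so a faster CPU only raises FPS while the CPU is the longer stage - which is exactly why results converge once a 60 Hz-class target is the goal.

    // Frame rate is capped by the slower pipeline stage per frame.
    #include <algorithm>
    #include <cstdio>

    int main() {
        double cpu_ms = 8.0;   // hypothetical CPU cost: simulation + draw calls
        double gpu_ms = 12.0;  // hypothetical GPU cost at 4K/ultra settings
        double fps = 1000.0 / std::max(cpu_ms, gpu_ms);  // GPU-bound here: ~83 fps
        std::printf("max ~%.0f fps; a 60 Hz display needs %.1f ms/frame\n",
                    fps, 1000.0 / 60.0);
    }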
mr_tawan - Sunday, May 12, 2019 - link
Just upgraded to a Core i7-4790 (from an i5-4460) late last year. At first I was thinking about upgrading to the shiny Ryzen 7, but the overall cost is pretty high considering I have my H97 mainboard with 16GB of memory. I don't want to shell out that much money and get stuck on an older platform, again.
It does work OK, with performance around the current-gen Core i5 I guess (with less power efficiency). Considering what I paid, I think it's not too bad.
just4U - Sunday, May 12, 2019 - link
An interesting read there, Ian. I started to notice a slowdown on 2600K-class systems a few years ago when I worked on them (I hadn't used one since 2014). For me, if I can notice those slowdowns in real time, then it's time to move away from that CPU. The 4790K appears to still be holding up OK, but the older 3000/2000 chips not so well.
crotach - Sunday, May 12, 2019 - link
Still running a 3930K Sandy Bridge.
Maybe Ryzen 3000 will give me a reason to upgrade.
AndrewJacksonZA - Sunday, May 12, 2019 - link
Best quote out of the entire article:
"In 2019, the landscape has changed: gamers gonna stream, designers gonna design, scientists gonna simulate, and emulators gonna emulate" :-)
But seriously though, for me, when I upgraded from a Core 2 Duo E6750 with 4GB of RAM to an i7-6700 (non-K) with 16GB of RAM, it was simply amazing. I was fully expecting that going from an i7-2600K to an i7-9700K would be similar - and it is for things like compiling, but not for things like gaming.
Thanks for the article, Ian! Dig the LAN setup. :-)
Targon - Sunday, May 12, 2019 - link
Why would you test a CPU and use a framerate test from Civilization 6, rather than the turn-length benchmark, which is a true test of the CPU rather than the GPU? Turn-based games SHOULD be there as CPU tests, and only caring about the framerates seems to be wrong.
Oxford Guy - Sunday, May 12, 2019 - link
When your overclock fails in one test, you're unstable.
When it fails in four, as in this article, you're both unstable and laughable.
"Had issues". "For whatever reason". I will assume this is all intended to be humor.
DeltaIO - Monday, May 13, 2019 - link
Interesting article to read. I've only recently upgraded from my 2600K to the 9700K, and even that was begrudgingly, as the 2600K itself still works fine; the motherboard, however, simply decided to give up on me.
I've got to say, though, the difference in the subsystems (NVMe vs SSD makes for some great load times for pretty much everything) as well as other tangible benefits (gaming at higher frame rates) is quite apparent now that I have upgraded.
I would have upgraded far sooner had Intel not chosen to keep changing the sockets; swapping out just a CPU is far simpler than rebuilding the entire system.
Tedaz - Monday, May 13, 2019 - link
Expecting the i9-9900K to join the article.
Badelhas - Monday, May 13, 2019 - link
I am still with a 2500K overclocked to 4.8GHz, 8GB of DDR3-1600 RAM, an 850 Evo SSD and an Nvidia 1070. I honestly see no reason to upgrade.
Ian: all your testing basically demonstrated that there is no real reason that justifies spending 400 bucks for a new CPU, 200 bucks for a new motherboard and 100 bucks for new DDR4 RAM - this totals 700 dollars. But your conclusion is that we should upgrade?! I don't get it.
tmanini - Monday, May 13, 2019 - link
Go ahead and re-read his "Bottom Line" conclusion: it gives a few specific recommendations on where it may and may not be to your advantage. And if you aren't desiring/needing all of the other new bells/whistles that go along with newer boards and architecture, then you are set (he says).
Seems pretty clear.
Midwayman - Monday, May 13, 2019 - link
I think the biggest thing I noticed moving to an 8700K from a 2600K was the same thing I noticed moving from a Core 2 Duo to a 2600K: fewer weird pauses. The 2600K would get weird hitches in games. System processes would pop up and tank the frame rate for an instant, or just an explosion would trigger a physics event that would make it stutter. I see that a lot less with a couple of extra cores and some performance overhead.
tmanini - Monday, May 13, 2019 - link
I agree, the user experience is definitely improved in those ways. Granted, many of us think our time is a bit more important than it likely really is. (Does waiting 3 seconds really ruin my day?)
ochadd - Monday, May 13, 2019 - link
Enjoyed the article very much.
Magnus101 - Monday, May 13, 2019 - link
You get about 3X performance when going from an overclocked 2600K@4.5GHz to an 8700K@4.5GHz when working in DAWs (Digital Audio Workstations), i.e. running dozens and dozens of virtual instruments and plugins when making music.
The thing is that it is a combination of applications that:
1. Use SSE/AVX and the rest of the streaming extensions that make parallel floating-point calculations go much faster. DAW work is all about floating-point calculations (see the sketch after this comment).
2. Are extremely real-time dependent, to get ultra-low latency (single-digit milliseconds).
This makes even the 7700K about double the performance in some scenarios when compared to an equally clocked 2600K.
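For anyone wondering what point 1 looks like in code, here is a minimal sketch (an illustration, not taken from any real plugin): applying a gain to an audio buffer eight floats at a time with AVX intrinsics - the kind of parallel floating-point work those streaming extensions accelerate.

    // Process 8 samples per iteration with AVX; a scalar loop handles the tail.
    // Build with -mavx (GCC/Clang).
    #include <immintrin.h>
    #include <cstdio>

    void apply_gain(float* buf, int n, float gain) {
        __m256 g = _mm256_set1_ps(gain);            // broadcast gain to 8 lanes
        int i = 0;
        for (; i + 8 <= n; i += 8) {
            __m256 v = _mm256_loadu_ps(buf + i);    // load 8 samples
            _mm256_storeu_ps(buf + i, _mm256_mul_ps(v, g));
        }
        for (; i < n; ++i) buf[i] *= gain;          // remaining samples
    }

    int main() {
        float buf[19];
        for (int i = 0; i < 19; ++i) buf[i] = 1.0f;
        apply_gain(buf, 19, 0.5f);
        std::printf("last sample: %.2f\n", buf[18]);  // prints 0.50
    }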
mikato - Monday, May 13, 2019 - link
"and Intel’s final quad-core with HyperThreading chip for desktop, the 7700K""the Core i7-7700K, Intel’s final quad-core with HyperThreading processor"
Did I miss some big news?
mapesdhs - Monday, May 13, 2019 - link
"... the best chips managed 5.0 GHz or 5.1 GHz in a daily system."Worth noting that with the refined 2700K, *all* of them run fine at 5GHz in a daily system, sensible temps, a TRUE and one fan is plenty for cooling. Threaded performance is identical to a stock 6700K, IPC is identical to a stock 2700X (880 and 177 for CB R15 Nt/1t resp.)
Also, various P67/Z68 boards support NVMe boot via modded BIOS files. The ROG forum has a selection for ASUS; search for "ASUS bolts4breakfast". He's added support for the M4E and M4EZ, and I think others asked the same for the Pro Gen3, etc. I'm sure there are equivalent BIOS mod threads for Gigabyte, MSI, etc. My 5GHz 2700K on an M4E has a 1TB SM961 and a 1TB 970 EVO Plus (photo/video archive), though the C-drive is still a venerable Vector 256GB which holds up well even today.
Also, RAM support runs fine with 2133 CL9 on the M4E, which is pretty good (16GB GSkill TridentX, two modules).
However, after using this for a great many years, I do find myself wanting better performance for processing images & video, so I'll likely be stepping up to a Ryzen 3000 system, at least 8 cores.
mapesdhs - Monday, May 13, 2019 - link
Forgot to mention, something else interesting about SB is the low cost of the sibling SB-E. Would be a laugh to see how all those tests pan out with a 3930K stock/OC'd thrown into the mix. It's a pity good X79 boards are hard to find now, given how cheaply one can get 3930Ks these days. If stock performance is OK though, there are some cheap Chinese boards which work pretty well, and some of them do support NVMe boot.
tezcan - Monday, May 13, 2019 - link
I am still running a 3930K; prices for it are still very high, ~$500. Not much cheaper than what I paid for it in 2011. I have yet to really test my GTX 680s in SLI. Kind of a waste, but they are driving many displays throughout my house. There was an article where some Australian bloke ran an 8-core Sandy Bridge-E (server chip) vs all the modern Intel 8-core chips. It actually had the lowest latency, so it was best for pro gamers; it lagged a little behind on everything else - but definitely good enough.
dad_at - Tuesday, May 14, 2019 - link
I run a 3960X at ~4 GHz on an ASUS P9X79 (X79) and have an NVMe boot drive with a modified BIOS. So it is really interesting to compare 2011/2012 6c/12t to the 8700K or 9900K. I guess it's about 7700K stock, so a modern 4c/8t is like the old 6c/12t. Per-core perf is about 20-30% up on average, and this includes the higher frequency... so IPC is only about 15% up: not impressive. Of course, in some loads like AVX2-heavy apps, IPC could be 50% up, but such cases are not common.
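Spelling out that IPC step (the clocks are assumptions - roughly 4.0 GHz on the 3960X against ~4.4 GHz all-core turbo on a stock 7700K):

    IPC ratio ≈ per-core perf ratio ÷ frequency ratio ≈ 1.25 ÷ (4.4 / 4.0) ≈ 1.14

i.e. mid-20s per-core gains shrink to roughly 15% once the higher clocks are factored out.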
martixy - Monday, May 13, 2019 - link
Oh man... I just upgraded my 2600K to a 9900K, and a couple of days later this article drops... The timing is impeccable!
If I ever had a shred of buyer's remorse, the article conclusion eradicated it thoroughly. Give me more FPS.
I saw a screenshot of StarCraft 2. On a mission which I, again, coincidentally (this is uncanny) played today. I can now report that the 9900K can FINALLY feed my graphics card in SC2 properly. With the 2600K I'd be around 20-60 FPS depending on load and intensity of the action. With the new processor, it barely ever drops below 60 and usually hovers around 90FPS. In-game cinematics also finally run above the "cinematic" 30 FPS I saw on my trusty old 2600K.
Ranger90125 - Tuesday, May 14, 2019 - link
Using a 4790K for years and increasingly disillusioned with Intel's shady practices and lack of progress. My last AMD processor was an Athlon 64 3400 from the glory days, when Intel was decimated by the competition. My next processor will be 7nm Zen, and I look forward to Intel being under the cosh for as long as AMD can manage it. Thanks for a great nostalgic read... I liked the lean and mean Cutress LAN machine :)
akyp - Tuesday, May 14, 2019 - link
In less than 5 months my i7-860 will celebrate its 10th birthday. I've been keeping an eye on Ryzen 3 and Navi but never feel the need to upgrade (unless something goes wrong). It doesn't feel any slower than my work-issued i7-6700.
curley60 - Tuesday, May 14, 2019 - link
About 5 years ago I went backwards and downgraded(?) my Core i7-2600K to a Gulftown Core i7-990X when they became affordable. The Core i7-990X on my Asus Rampage Formula is running @ 4.66GHz and is quite a bit faster in all benchmarks than the Core i7-2600K. Those Gulftown processors were ahead of their time. Sure, a Core i7-7700K is 18% faster in single-core work, but the 990X destroys it in multi-threaded work. As long as it keeps running, I'm going to keep using it with my current GTX 1080 Ti.
Potatooo - Wednesday, May 15, 2019 - link
Would like to see comparisons with a more budget GPU (e.g. 1060/580) and 1080p gaming; probably a more realistic pairing.
Bash99 - Wednesday, May 15, 2019 - link
It's weird that Handbrake 1.1 HEVC 1080p encoding can hit 60 fps with x265; even on the very fast setting, I can only get 1x fps.
rexhab - Thursday, May 16, 2019 - link
I just upgraded from an i5-2500 to an i7-2600K ;) ^^
ballsystemlord - Thursday, May 16, 2019 - link
Spelling and grammar corrections:
"Sandy Bridge as a whole was a much more dynamic of a beast than anything that's come before it."
Excess "of a":
"Sandy Bridge as a whole was a much more dynamic beast than anything that's come before it."
"They also have AVX2, which draw a lot of power in our power test."
Missing "s":
"They also have AVX2, which draws a lot of power in our power test."
oktat - Sunday, May 19, 2019 - link
Would you update the Civilization VI AI turn time when the technical issues are fixed?
bullshooter4040 - Wednesday, May 22, 2019 - link
This was a fun article to read through. A great look into the CPU that defined the decade, and a wonderful send-off (or not!?!) to the greatest processor since the Core 2 Duo.
Up until last year, I had the younger cousin, the i5-2500K, which, lacking Hyper-Threading, had a much harder time keeping up in CPU-intensive tasks (even for a gamer) in 2018, and I made the switch to team orange.
Ryzen is here now, promising longevity not just of its CPUs but, more importantly, of the AM4 platform - something that Intel did not accomplish with any of its processors.
With the Ryzen 3000 series, it's time to jump on board.
PyroHoltz - Thursday, May 30, 2019 - link
NVMe is fully possible on 2600K-generation motherboards; it just takes a bit of BIOS modification to add the appropriate drivers.
monglerbongler - Friday, June 14, 2019 - link
You don't need to buy a new computer every year, and with an intelligently made upfront investment you can potentially keep your desktop, with minimal or zero hardware upgrades, for a *very* long time. News at 11.
If there is any argument that supports this, it's Intel's consumer/prosumer HEDT platforms.
The X99 was compelling over X58. The X299 is not even remotely compelling. I still have my old X99/i7-5930K (6 cores, 40 PCIe 3.0 lanes). It's still fantastic, but that's at least partially because I bit the bullet and invested in a good motherboard and GPU at the time. All modern games still play fantastically, and it can handle absolutely anything I throw at it.
More a statement of "future proofing" than inherent performance.
Sancus - Saturday, June 15, 2019 - link
It's always disappointing to see heavily GPU-bottlenecked benchmarks in articles like these without a clear warning that they are totally irrelevant to the question at hand.
It also feeds into the false narrative that what resolution you play at matters for CPU benchmarks. What matters a lot more is what GAME you're playing, and these tests never benchmark the actually CPU-bound multiplayer games that people are playing, because benchmarking those is Hard.
BlueB - Friday, June 21, 2019 - link
So if you're a gamer, there is STILL no reason for you to upgrade.
Hogan773 - Friday, July 12, 2019 - link
I have a 2600K system with an ASRock mobo.
Now that there is so much hype about Ryzen 3, is that my best option if I wanted to upgrade? I guess I would need a new mobo and memory in addition to the CPU. Otherwise I can use the same SSD etc.
tshoobs - Wednesday, July 17, 2019 - link
Still running my 3770 at stock clocks - "not a worry in the world, cold beer in my hand".
Added an SSD and upgraded to a 1070 from the original GPU. Best machine I've ever had.
gamefoo21 - Saturday, August 10, 2019 - link
I was running my X1950XT AIW at wonder-level overclocks with an overclocked Pentium M, crushing Athlon 64 users.
It would have been really interesting to see that 7700K with DDR3. I run my 7700K @ 5GHz with DDR3-2100 CL10 on a GA-Z170-HD3. Sadly the power delivery system on my board is at its limits. :-(
But still a massive upgrade from the FX-8320E and MSI 970 mobo that I had before.
gamefoo21 - Saturday, August 10, 2019 - link
I forgot to add that it's 32GB (8GB x 4) of G.Skill CL9 1866 1.5V that runs at 2100 CL10 at 1.5V, but I have to give up the 1T command rate.
The GPU that I carried over is the Fury X. BIOS modded, of course, so it's undervolted, underclocked and the HBM timings tightened. Whips the stock config.
The GPU is next up for upgrading, but I'm holding out for Navi with hardware RT and hopefully HBM. Once you get a taste of the low latency it's hard to go back.
OpenCL memory bandwidth for my Fury X punches over 320GB/s with single-digit latency. The iGPU in my 7700K is around 12-14GB/s, and the latency is... -_-
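Bandwidth figures like those come from the usual recipe: time a large transfer and divide bytes moved by seconds. A minimal host-side sketch of the idea (plain C++ with memcpy rather than OpenCL; the buffer size and repeat count are arbitrary choices):

    // Crude memory bandwidth estimate: each memcpy reads n bytes and writes n bytes.
    #include <chrono>
    #include <cstdio>
    #include <cstring>
    #include <vector>

    int main() {
        const size_t n = size_t(1) << 28;          // 256 MiB per buffer
        std::vector<char> src(n, 1), dst(n, 0);
        std::memcpy(dst.data(), src.data(), n);    // warm-up pass
        const int reps = 10;
        auto t0 = std::chrono::steady_clock::now();
        for (int i = 0; i < reps; ++i)
            std::memcpy(dst.data(), src.data(), n);
        auto t1 = std::chrono::steady_clock::now();
        double sec = std::chrono::duration<double>(t1 - t0).count();
        std::printf("~%.1f GB/s\n", 2.0 * n * reps / sec / 1e9);
    }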
BuffyCombs - Thursday, February 13, 2020 - link
There are several things about this article I don't like:
1. In the game tests, I actually don't care if one CPU is 50 percent better when one shows 10 FPS and the other 15. I also don't care if it is 200 or 300 fps. So I would change the scale into a simple metric, and that is: is it fun to play or not?
2. Development is not mentioned: the core wars have just started and Intel's monopoly is over. Why should we invest in new processors when competition has just begun? I predict price-per-performance will fall faster in the next few years than it did in the previous 10. So buying now is buying into an overpriced and fast-developing market.
3. There is no discussion of whether one should buy a used 2600K system today. I bought one a few weeks ago. It was 170 USD, has 16 GB of RAM and a GTX 760. It plays all the games I throw at it and does the encoding of some videos I take in classes every week. I also modified its cooler so that it runs very, very quietly. Using this system is a dream! Of course one could spend several times as much for a new system that is twice as fast in benchmarks, but for now I'd rather save a few hundred bucks and invest when the competition becomes stagnant again, or when some software I use really demands it because of new instructions.
scrubman - Tuesday, March 23, 2021 - link
Great write-up! Love my 2600K still to this day, solid at 4.6GHz on air the whole time! I do see an upgrade this year though. She's been a beast!! Never thought the 300A Celeron OC'd to 450 would get beat! haha
SirBlot - Monday, July 25, 2022 - link
I get 60fps in the SotTR CPU game and render benchmarks with an RTX 3060 Ti, with ray tracing on medium and everything else ultra. 2600K @ 4.2.
SirBlot - Monday, July 25, 2022 - link
60fps minimum in the CPU game and CPU render benchmarks. Could probably turn ray tracing up a bit in the actual game for most of it.