It’s hard not to notice that NVIDIA has a bit of a problem right now. In the months since the launch of their first Kepler product, the GeForce GTX 680, the company has introduced several other Kepler products into the desktop 600 series. With the exception of the GeForce GT 640 – their only budget part – all of those 600 series parts have been targeted at the high end, where they have become popular, well-received products that significantly tilted the market in NVIDIA’s favor.

The problem with this is almost paradoxical: these products are too popular. Between the GK104-heavy desktop GeForce lineup, the GK104-based Tesla K10, and the GK107-heavy mobile GeForce lineup, NVIDIA is selling every 28nm chip they can make. For a business prone to boom and bust cycles this is not a bad problem to have, but it means NVIDIA has been unable to expand their market presence as quickly as customers would like. For the desktop in particular this means NVIDIA has a very large, very noticeable hole in their product lineup between $100 and $400, which encompasses the mainstream and performance market segments. These market segments aren’t quite the high-margin markets NVIDIA is currently servicing, but they are important to fill because they’re where product volumes increase and where most of their regular customers reside.

Long-term NVIDIA needs more production capacity and a wider selection of GPUs to fill this hole, but in the meantime they can at least begin to fill it with what they have to work with. This brings us to today’s product launch: the GeForce GTX 660 Ti. With nothing between GK104 and GK107 at the moment, NVIDIA is pushing out one more desktop product based on GK104 in order to bring Kepler to the performance market. Serving as an outlet for further binned GK104 GPUs, the GTX 660 Ti will be launching today as NVIDIA’s $300 performance part.

                        GTX 680         GTX 670         GTX 660 Ti      GTX 570
Stream Processors       1536            1344            1344            480
Texture Units           128             112             112             60
ROPs                    32              32              24              40
Core Clock              1006MHz         915MHz          915MHz          732MHz
Shader Clock            N/A             N/A             N/A             1464MHz
Boost Clock             1058MHz         980MHz          980MHz          N/A
Memory Clock            6.008GHz GDDR5  6.008GHz GDDR5  6.008GHz GDDR5  3.8GHz GDDR5
Memory Bus Width        256-bit         256-bit         192-bit         320-bit
VRAM                    2GB             2GB             2GB             1.25GB
FP64                    1/24 FP32       1/24 FP32       1/24 FP32       1/8 FP32
TDP                     195W            170W            150W            219W
Transistor Count        3.5B            3.5B            3.5B            3B
Manufacturing Process   TSMC 28nm       TSMC 28nm       TSMC 28nm       TSMC 40nm
Launch Price            $499            $399            $299            $349

In the Fermi generation, NVIDIA filled the performance market with GF104 and GF114, the backbone of the very successful GTX 460 and GTX 560 series of video cards. Given Fermi’s 4-chip product stack – specifically the existence of the GF100/GF110 powerhouse – this is a move that made perfect sense. However, it’s not a move that works quite as well for NVIDIA’s (so far) 2-chip Kepler product stack. In a move very reminiscent of the GeForce GTX 200 series, GK104 – already serving the GTX 690, GTX 680, and GTX 670 – is also being called upon to fill out the GTX 660 Ti.

All things considered, the GTX 660 Ti is extremely similar to the GTX 670. The base clock is the same, the boost clock is the same, the memory clock is the same, and even the number of shaders is the same. In fact there’s only a single significant difference between the GTX 670 and GTX 660 Ti: the GTX 660 Ti surrenders one of GK104’s four ROP/L2/memory clusters, reducing it from a 32 ROP, 512KB L2, 4 memory channel part to a 24 ROP, 384KB L2, 3 memory channel part. With NVIDIA already binning chips for assignment to the GTX 680 and GTX 670, this gives them an outlet for chips that fall just short of GTX 670 standards without much additional effort. Though given the relatively small size of a ROP/L2/memory cluster, it’s a bit surprising they have all that many chips that don’t meet GTX 670 standards.

In any case, as a result of these design choices the GTX 660 Ti is a fairly straightforward part. The 915MHz base clock and 980MHz boost clock, along with the 7 SMXes, mean that the GTX 660 Ti has the same theoretical compute, geometry, and texturing performance as the GTX 670. The real difference between the two is on the render operation and memory bandwidth side of things, where the loss of the ROP/L2/memory cluster means that the GTX 660 Ti surrenders a full 25% of its render performance and its memory bandwidth. Interestingly, NVIDIA has kept the memory clock at 6GHz – in previous generations they would lower it to enable the use of cheaper memory – which is significant for performance since it keeps the memory bandwidth loss at just 25%.
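
To put quick numbers to that 25% figure, the back-of-the-envelope math works out as follows (a minimal Python sketch based on the spec table above; these are theoretical peaks, not measured results, and the fillrate figure assumes the ROPs run at the boost clock purely for illustration):

```python
# Peak figures from the spec table above (theoretical, not measured).
# Memory bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (GT/s)
# Pixel fillrate (Gpix/s) = ROPs * clock (GHz); boost clock assumed here.

cards = {
    #             bus (bits), mem (GT/s), ROPs, boost (GHz)
    "GTX 670":    (256, 6.008, 32, 0.980),
    "GTX 660 Ti": (192, 6.008, 24, 0.980),
}

for name, (bus, mem, rops, boost) in cards.items():
    bandwidth = bus / 8 * mem   # GB/s
    fillrate = rops * boost     # Gpixels/s
    print(f"{name}: {bandwidth:.1f} GB/s, {fillrate:.1f} Gpix/s")

# GTX 670:    192.3 GB/s, 31.4 Gpix/s
# GTX 660 Ti: 144.2 GB/s, 23.5 Gpix/s  (both figures down exactly 25%)
```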

How this loss of render operation performance and memory bandwidth will play out is going to depend heavily on the task at hand. We’ve already seen GK104 struggle with a lack of memory bandwidth in games like Crysis, so coming from the GTX 670 this is only going to exacerbate that problem; a full 25% drop in performance is not out of the question here. However in games that are shader heavy (but not necessarily memory bandwidth heavy) like Portal 2, the GTX 660 Ti should be able to hang very close to its more powerful sibling. There’s also the question of how NVIDIA’s nebulous asymmetrical memory bank design will impact performance, since 2GB of RAM doesn’t divide cleanly across 3 memory channels. All of these are issues we’ll have to turn to benchmarking to better understand.
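
For the curious, the mismatch is simple arithmetic. NVIDIA hasn’t spelled out exactly how the 2GB is arranged on a 192-bit bus, but the usual approach is to hang twice as much memory off one controller as the others; the sketch below illustrates one plausible (and, to be clear, hypothetical on our part) arrangement:

```python
# Hypothetical illustration of a 2GB card on a 192-bit (3 x 64-bit) bus.
# An even split doesn't land on a standard chip density, so one controller
# presumably carries double the capacity of the other two.

total_mb = 2048
controllers = 3

print(f"Even split: {total_mb / controllers:.1f}MB per controller (not buildable)")

assumed_split_mb = [512, 512, 1024]   # one plausible arrangement, not confirmed
assert sum(assumed_split_mb) == total_mb

# Consequence: the first 1.5GB can be interleaved across all three 64-bit
# controllers at full speed, while the final 512MB sits behind a single
# controller. Whether and when that matters is a job for the benchmarks.
```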

The impact on power consumption, on the other hand, is relatively straightforward. With clocks identical to the GTX 670, power consumption has only been reduced marginally due to the disabling of the ROP/L2/memory cluster. NVIDIA’s official TDP is 150W, with a power target of 134W. This compares to a TDP of 170W and a power target of 141W for the GTX 670. Given the mechanisms at work in NVIDIA’s GPU Boost technology, it’s the power target that is a far better reflection of what to expect relative to the GTX 670. On paper this means that GK104 could probably be stuffed into a sub-150W card with some further functional units disabled, but in practice desktop GK104 GPUs are probably a bit too power hungry for that.
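
To illustrate why we lean on the power target rather than the TDP, here is a quick comparison using the numbers above (the percentages are our own arithmetic, not official NVIDIA figures):

```python
# GPU Boost keeps sustained clocks at or below the power target, so the
# power target tracks typical gaming power draw better than worst-case TDP.

tdp_w          = {"GTX 670": 170, "GTX 660 Ti": 150}
power_target_w = {"GTX 670": 141, "GTX 660 Ti": 134}

tdp_delta    = 1 - tdp_w["GTX 660 Ti"] / tdp_w["GTX 670"]                    # ~12%
target_delta = 1 - power_target_w["GTX 660 Ti"] / power_target_w["GTX 670"]  # ~5%

print(f"TDP is {tdp_delta:.0%} lower, but the power target is only {target_delta:.0%} lower")
```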

Moving on, this launch will be what NVIDIA calls a “virtual” launch, which is to say that there aren’t any reference cards being shipped to partners to sell or to press to sample. Instead all of NVIDIA’s partners will be launching with semi-custom and fully-custom cards right away. This means we’re going to see a wide variety of cards right off the bat; however, it also means that there will be less consistency between partners since no two cards are going to be quite alike. For that reason we’ll be looking at a slightly wider selection of partner designs today, with cards from EVGA, Zotac, and Gigabyte occupying our charts.

As for the launch supply, with NVIDIA having licked their GK104 supply problems a couple of months ago the supply of GTX 660 Ti cards looks like it should be plentiful. Some cards are going to be more popular than others and for that reason we expect we’ll see some cards sell out, but at the end of the day there shouldn’t be any problem grabbing a GTX 660 Ti on today’s launch day.

Pricing for GTX 660 Ti cards will start at $299, continuing NVIDIA’s tidy hierarchy of a GeForce 600 card at every $100 price point. With the launch of the GTX 660 Ti NVIDIA will finally be able to start clearing out the GTX 570, a not-unwelcome thing as the GTX 660 Ti brings with it the Kepler family features (NVENC, TXAA, GPU Boost, and D3D 11.1) along with nearly twice as much RAM and much lower power consumption. However this also means that despite the name, the GTX 660 Ti is a de facto replacement for the GTX 570 rather than the GTX 560 Ti. The sub-$250 market the GTX 560 Ti launched at will continue to be served by Fermi parts for the time being. NVIDIA will no doubt see quite a bit of success even at $300, but it probably won’t be quite the hot item that the GTX 560 Ti was.

Meanwhile for a limited period of time NVIDIA will be sweetening the deal by throwing in a copy of Borderlands 2 with all GTX 600 series cards as a GTX 660 Ti launch promotion. Borderlands 2 is the sequel to Gearbox’s 2009 FPS/RPG hybrid, and is a TWIMTBP game that will have PhysX support along with planned support for TXAA. Like their prior promotions this is being done through retailers in North America, so you will need to check and ensure your retailer is throwing in Borderlands 2 vouchers with any GTX 600 card you purchase.

On the marketing front, as a performance part NVIDIA is looking to not only sell the GTX 660 Ti as an upgrade to 400/500 series owners, but to also entice existing GTX 200 series owners to upgrade. The GTX 660 Ti will be quite a bit faster than any GTX 200 series part (and cooler/quieter than all of them), with the question being whether that’s going to be enough to spur those owners to upgrade. NVIDIA did see a lot of success last year with the GTX 560 driving the retirement of the 8800 GT/9800 GT, so we’ll see how that goes.

Anyhow, as with the launch of the GTX 670, virtually every partner is also launching one or more factory overclocked models, so the entire lineup of launch cards will fall between $299 and $339 or so. This price range will put NVIDIA and its partners smack-dab between AMD’s existing 7000 series cards, which have already been shifting in price somewhat due to the GTX 670 and the impending launch of the GTX 660 Ti. Reference-clocked cards will sit right between the $279 Radeon HD 7870 and $329 Radeon HD 7950, which means that factory overclocked cards will be going head-to-head with the 7950.

On that note, with the launch of the GTX 660 Ti we can finally shed some further light on this week’s unexpected announcement of a new Radeon HD 7950 revision from AMD. As you’ll see in our benchmarks the existing 7950 maintains an uncomfortably slight lead over the GTX 660 Ti, which has spurred on AMD to bump up the 7950’s clockspeeds at the cost of power consumption in order to avoid having it end up as a sub-$300 product. The new 7950B is still scheduled to show up at the end of this week, with AMD’s already-battered product launch credibility hanging in the balance.

For this review we’re going to include both the 7950 and 7950B in our results. We’re not at all happy with how AMD is handling this – it’s the kind of slimy thing that has already gotten NVIDIA in trouble in the past – and while we don’t want to reward such actions it would be remiss of us not to include it since it is a new reference part. And if AMD’s credibility is worth anything it will be on the shelves tomorrow anyhow.

Summer 2012 GPU Pricing Comparison
AMD                           Price        NVIDIA
Radeon HD 7970 GHz Edition    $469/$499    GeForce GTX 680
Radeon HD 7970                $419/$399    GeForce GTX 670
Radeon HD 7950                $329
                              $299         GeForce GTX 660 Ti
Radeon HD 7870                $279
                              $279         GeForce GTX 570
Radeon HD 7850                $239

Comments

  • TheJian - Monday, August 20, 2012

    LOL.. I can't read your language, and am unsure of any of the cards speeds etc in that link.

    You're comparing something you can do on your own, NOT out of the box. Which I already proved you can easily hit ridiculous speeds with the 660TI.

    So how do I know I got a special binned chip before buying like your forum (again we can't read your language - most of us anyway)? But again I'd say it's not out of the box at those speeds. There is a ref card in GREEN for the 660TI or did you miss that?
    1150mhz?
    http://www.guru3d.com/article/radeon-hd-7950-overc...
    I know, exactly what this article runs at...LOL. Only using 79 watts more to get it done and he NEEDED 1.25v to do it. There is a reason AMD has this as default on the BOOST (not all chips can easily do it...they're not purposefully running hot and overvolted ya know...They have to in order to get more to do it!). No amount of cooling will save you money on your electric bill. Your magical 1150mhz is examined in great detail in that article with caveats regarding how long your life may be...LOL. OC at your own risk. Firmware in the 600 series makes this impossible on the 600 series cards. Roll your own dice thanks. Feb2012 article, there isn't some magical binned versions of these chips YOU can magically guarantee I'll get. Not all chips are created equal and no manufacturer is guaranteeing your speeds or anywhere near them. Point me to your magically binned advertised chips? I can't see them on newegg. So you must have some magical website I didn't know I could buy from. Enlighten me please.

    Crysis 1 and warhead? Already debunked it's relevance. But here if you missed it:
    Games based on Cryengine v2? 7 total. CryEngine3? Check out the list, including crysis 2, and the coming crysis 3:
    https://en.wikipedia.org/wiki/CryEngine
    Crysis and warhead (from march 2008 for warhead, earlier for crysis) are NOT relevant. There are only 5 other games made on it. Point me to some benchmarks showing something I can read (#1) and where I can actually see the test setup (#2). Until then all your benchmarks are meaningless. Also don't bother showing me anything over 1920x1200 and claiming victory as I already debunked that as being less than 2% market share according to steampowered.com hardware survey AND more importantly no 24in or below is sold with a native res above 1920x1200 on newegg! 68 monitors without even ONE being above 1920x1200 recommended.

    I already showed crysis 2 7950boost review vs. ref 660ti being a wash even at 2560x1920 (though a useless res). If you have to force the 660TI into something we'd never run at to show a victory your results are pointless. I'm sorry, does MSI sell your twinfrozr at 1150core out of the box? I must have missed that version on newegg. Value at $320? Not out of the box, and I can do the same thing for $299 on 660TI if I'm going to be doing overclocking myself and they are guaranteed out of box 100mhz over on both core and boost as shown before. Also I can't damage it (built into 600 series, the firmware won't let me do a dangerous overclock to shorten its life). The only two cards for $319 on newegg in 7950 are AFTER rebate #1 and only clocked at 800mhz #2. They're not going to spend on quality components to HELP your overclocking/life of the chip at that price. Quite a few of the overclocked 660's are SILENT in use.
    "For the card in a fully stressed status (in-game) 39 dBA, now that simply is silent. So if you do not tweak the card or something, during gaming you can expect a silent card."
    http://www.guru3d.com/article/evga-geforce-gtx-660...
    And that one kind of sucks russian.
    http://www.guru3d.com/article/evga-geforce-gtx-660...
    All of the 660TI's out of the box on the heels of the 7970 in anno2070. But I know, if you can get a magically binned chip, you might be able to hit a speed that makes the 7970 look like crap for bang/buck and at a speed not warrantied out of the box. But I can get almost 7970 doing nothing, no worries for less $$. So what's your point? :) Note 3 of the 660's beat the 670gtx out of the box...LOL. You got some version where the 7950 beats a 7970 out of the box for $299? I know I "CAN" get lucky, but no guarantee @1.175 like you say. Or do you think AMD is just stupid and clocks all the boost versions (that aren't out yet) at 1.25v for nothing? A hell of a lot of them WON'T get to boost guaranteed without 1.25 or they wouldn't be doing it and purposely making their cards look like shite in reviews. How dumb do you think AMD is? It's a reference for a reason.

    Ryan's review of the 7970ghz edition notes NV cards shipped clocks may not be the highest you'll get even default out of the box, only guaranteed (they will perform based on the tdp, better!):
    "Every 7970GE can hit 1050MHz and every 7970GE tops out at 1050MHz. This is as opposed to NVIDIA’s GPU Boost, where every card can hit at least the boost clock but there will be some variation in the top clock".
    http://www.anandtech.com/show/6025/radeon-hd-7970-...
    Where out of the box all radeon cards perform the same (except watts used varies), but NV cards can go higher than their advertised boost speeds even out of the box...LOL.
    Same article:
    "With that said there’s no such thing as a free lunch, and in practice the 7970GE’s power consumption has still increased relative to the 7970, as we’ll see in our benchmarks."
    These chips aren't special or wattage wouldn't climb at all. Your magical 7950 isn't special either.
    Skyrim 1920x1200 same article - Gtx580/670/680 cpu limited and BEATING the ghz 7970 edition 98fps to 86fps! Note no improvement from 7970 vs. 7970 ghz edition. with 4xmsaa/16af. (neither shows a difference at 2560x1600 useless res either...so overclocking to ghz edition didn't improve the scores over the regular 7970? in either res...LOL)
    I can hear you say, that's not the 660TI. Got me..:
    http://www.anandtech.com/show/6159/the-geforce-gtx...
    What's that? At 98% of the user res of 1920x1200 (and below) at 4xmsaa/16af all 660ti's beat the 7970? But the 660ti CRUMBLES you said at msaa...LOL. Whatever dude. I can keep going on...
    The 7950/7950B/7970/7970ghz all score the same at 1920x1200. You'll have to check both articles to get the 7970ghz edition as Ryan conveniently left it out of the benchmarks in the 660ti review...LOL. Gee why? Because it got nothing here too? Including warhead vs. crysis2 with HD and enhancement pack instead? A 2008 game vs. 2012 that has a crapload more games based on CryEngine 3.0? Only 7 on CryEngine 2.0 (and 2 of them are crysis 1 and warhead...LOL).
    Check all the 1920x1200 scores (sorry I already proved 2% or less run above this and most of those OVER 2560x1600, usually with more than one card), anand's games (as everywhere else) are maxed out at every res. You can't turn anything else on to help your cards. :)
    Shogun, 660's beating 7970
    "Overall this has become a fairly NVIDIA-friendly benchmark, with the GTX 660 Ti challenging even the 7970 at 1920."
    Challenging Ryan? Every 660ti but the reference beats the 7970 (which arguably NOBODY on newegg SELLS, most are clocked much faster, MSI N660 1019/1097boost far higher than 915 ref $299 since launch) ...But again he draws his conclusion based on 2560x1600, which by his own words just below the first benchmark (worthless Warhead from 2008 instead of crysis 2 maxed out) these cards are designed for 1920x1200/1080!
    Dirt3, tied with 7950 at 1920x1200, but again Biased Ryan (?): "while the GTX 660 Ti falls behind at 2560 as it runs out of memory bandwidth." WHO FREAKING CARES what happens where 2% or less of us run, and at a res by your own words "For a $300 performance card the most important resolution is typically going to be 1920x1080/1200". TYPICALLY?...ROFL. Should say 98% of users would use this or BELOW (actually only 29% use these two). He goes further in dirt too...LOL "Looking at the minimum framerates that 660/7950 standoff finally breaks in NVIDIA’s favor. Though a lack of memory bandwidth continues to pose a problem at 2560"
    Yeah, I know, because you've beat it like a dead horse as much as you can, it runs out again where NONE OF US RUN. Damn, as I read the review there is nothing to say but Ryan is getting some cash from AMD :) Jeez, twice on the same freaking page about the 2560 crap.

    Sorry ryan, AMD lost this round at 1920x1200 where 98% of us run (or below) and nothing you can say about 2560 changes the world for 98% of us where it's either a WASH or a dominating victory for 660 (heck all 600 series cards, you can argue none are for above these resolutions in single cards, 98% no matter what they have, gtx690/680/670/660 etc all run 1920x1200 or below, no 24in monitor on newegg is ABOVE THIS). Who you advertising for?
    Portal 2 - LANDSLIDE (even at useless 2560 beats 7970 >25% nevermind the far slower 7950 here...LOL), 45% faster than 7950 at 1920x1200...ROFL. Guess you better have a 35-45% magical card just to catch the out of the box 660 here...ROFL.
    "If NVIDIA could do this on more games then they’d be in an excellent position.". Umm...They did ryan, just quit looking at the res only 2% of us use. He points out the 660 can handle SSAA here (more taxing than MSAA Russian!!) so they concentrated on it.
    Google ssaa vs. msaa and you'll find stuff like this "SSAA theorhetically AA's the whole screen and would give a much more consistent AA. MSAA simply is limited to edges." and "Of course, there's a reason why people don't use SSAA: it costs a fortune". Tougher....Yet smoking on 660...Things like this are about the GPU/Shaders etc...NOT memory bandwidth as ryan beats like a dead horse at 2560x1600.
    Battlefield 3 same anandtech ryan article? 1920x1200 4xmsaa
    LANDSLIDE, ALL 660'S WINNING vs 7970!
    http://www.anandtech.com/show/6159/the-geforce-gtx...
    Pay attention to what Ryan is doing here people. 17% faster than the 7950B! FXAA is worse same page "At 1920 with FXAA that means the GTX 660 Ti has a huge 30% performance lead over the 7950, and even the 7970 falls behind the GTX 660 Ti." That's the GREEN REF bar that NOBODY sells as already noted, AMP speeds are almost had for $299. So really it's more like >34% faster than 7950B, not vs. 7950 reg. OH and it IS playable then at MSAA as ryan says is disappointing for the REF version...LOL. I know, and nobody will buy it (on accident maybe?), so for the rest of us, MSAA is ok with battlefield dropping to minimums of ~30fps. Just keep getting those digs in where you can... :)
    Sorry russian, I'd like to destroy more of your data, but the rest aren't benchmarked here, and I can't be bothered to do more than I already have just now to prove you wrong on the "crushed" comments... :) I think I proved my point. Sniper Elite2? What's that? Sell much?
    http://www.metacritic.com/game/pc/sniper-elite-v2
    Nope, metacritic score 65...I wouldn't even pirate a yellow game (under 74 score pretty unanimous not so good), let alone pay for it or care about its benchmark. Too many shooters at 80+ scores.
    Dirt Showdown is based on the same engine as dirt 3 here. If it performs worse, I'm thinking it's a driver issue...But never mind:
    http://www.metacritic.com/game/pc/dirt-showdown Score 72, user score 4.8 out of 10...ouch Gamespy quote:
    GameSpy wrote: “DiRT: Showdown delivers bargain-basement entertainment value for the high, high price of $50. With its neutered physics, limited driving venues, clunky multiplayer, and diminished off-road racing options, discerning arcade racing fans should just write this one off as an unanticipated pothole in Codemaster's trailblazing DiRT series.”
    Keep quoting useless games if you'd like though. :) Sorry, their review is frontpage at metacritic :)

    I'd like to see some bulletstorm, alan wake, serious sam3 benches so please LINKS? Hardware sites only please. I'd prefer review sites, rather than a forum.. People like ryan will slowly become useless with too many of these reviews. Forum users have no such worries and can more easily post anything they want. It's a good addition to something I'm looking into if I have questions after regular reviews, but I wouldn't want to base my buying purchase on forum posters benchmarks. Note I'm not posting my OWN benchmarks here after I do who knows what to my 660 TI (that I don't own yet...LOL), I'm pointing to results of review sites using stuff we BUY.

    Personally I'm buying this card (and my 22nm quad soon) to run it below default to soundly beat my Radeon 5850 but do it without driving me out of my AZ computer room. The quad should give me a great boost also at 3ghz (3770K downclocked) vs my current heater in the E3110 3ghz dual core. Sounds crazy, I know...But this week it hit 114 outside! I have a great cpu that can easily clock to 3.6ghz also - prime95 stable below default of stable 1.25v-1.35v reg E8400's! The default for my chip is 1.08 and boots well below this at 3ghz stable!...So it gives you an idea of the heat in AZ and its effect on even what I'd call one of the best E8400 3ghz wolfdales in the world. I can't beat the heat with my bada$$ cool e3110 (a purposely bought xeon s775 for better thermals). Electric is already $250 here, so reducing temps is kind of up to my PC itself :) You can bet I went through a bunch of places to pick my specific week/lot/country of origin on that baby :) I'm no stranger to ocing, but I don't think people will all rush home with their shiny new $330 7950 (not boost) and OC the heck out of it to beat a TI that runs cooler by default and OC's out of the box at .987 volts. ONE more dig at Ryan...LOL :

    " As you’ll see in our benchmarks the existing 7950 maintains an uncomfortably slight lead over the GTX 660 Ti, which has spurred on AMD to bump up the 7950’s clockspeeds at the cost of power consumption in order to avoid having it end up as a sub-$300 product. The new 7950B is still scheduled to show up at the end of this week, with AMD’s already-battered product launch credibility hanging in the balance."

    Umm...Must be looking at those 2560x1200 2% user benchmarks again eh? At the res we all play at, and all 24in monitors on Newegg (68!) are at (1920x1200/1080), WHAT LEAD? IN that 2008 warhead game? Debunked already. The rest, at best the race is a wash (and not often, pretty much metro2033 ~4% faster 7950Boost) and the rest are landslides and at times landslides vs. the 7970 ghz edition. You got the AMD part right though...Only two makers actually announced it...None seem to even care as they already sell tons of 900mhz, which is faster than the 850 boost and you should have added here as you wouldn't buy a boost when you can get a 900 for likely less...LOL Just like NV cards basically come OC'd no matter what you buy at newegg...Review what we buy, who cares what AMD/NV want?

    You still confused about the conclusion you SHOULD have stated Ryan? 660TI rocks, and is a no brainer for all people using 24in and below (98%). For the other 2%...LOL. Whatever. Go ahead and remain confused about that...WE DON'T CARE. You can't be this dumb (I hope not), so I'll give you the benefit of the doubt and run with it can only be bias.

    Jeez, I just had to check real quick at 27in...ROFL.
    http://www.newegg.com/Product/ProductList.aspx?Sub...
    Check the recommended resolutions on the left side people. 41 at 1920x1080! Only 11 others, and they are not even 2560x1600! They are 2560x1440! My god man, you aren't even right at 27in! OK, now I think you're just a freaking moron. Still confused Ryan? Anand, you there? Still care about your site? Let me know if you need a new reviewer :) This is borderline ridiculous to not have a conclusion even at 27 inches! Only 20% are LESS than the res Ryan draws all conclusions from; the other 80% of the 27inchers on newegg recommend LOWER than the tested 1920x1200...ROFLMAO. NOT a single 27in has a recommended resolution of 2560x1600 (it's only 1440...less stressful). I digress...For now...ROFL.
  • RussianSensation - Monday, August 20, 2012

    1) MSI TwinFrozr has been binned to include 80% ASIC 7950 chips. It will hit 1100-1200mhz on 1.175V. Every card.

    2) 3D Center compiled 12 professional reviews and GTX660Ti lost to an 800mhz 7950 overall:

    http://www.3dcenter.org/artikel/launch-analyse-nvi...

    3) BitTech and Tom's Hardware already showed that a 1300mhz GTX660Ti cannot even match a stock GTX670 in graphics intensive situations:

    Here
    http://www.tomshardware.com/reviews/geforce-gtx-66...

    and here:
    http://www.bit-tech.net/hardware/2012/08/16/nvidia...

    4) Considering HD7950 1167mhz keeps up easily with a 1300mhz GTX670, that leaves GTX660Ti overclocked to 1300mhz in 1 spot only - BEHIND:
    http://forums.anandtech.com/showpost.php?p=3385635...

    The MSI TF3 7950 is a PROVEN overclocker and it competes with a $400+ GTX670 for almost $100 less.

    Further, you failed to mention that more and more games are starting to use DirectCompute for global lighting model and shadows:

    - Dirt Showdown
    - Sniper Elite 2
    - Sleeping Dogs

    GTX660Ti/670/680 are a no show in those games, choking.

    OTOH, HD7950 can be easily overclocked to reach an overclocked 660Ti in BF3:

    7950 OC in BF3 = 69.7 fps
    http://tpucdn.com/reviews/Sapphire/HD_7950_Flex/im...

    GTX660Ti OC in BF3 = 69 fps
    http://tpucdn.com/reviews/Palit/GeForce_GTX_660_Ti...

    Overall, it's SIMPLE mathematics. An 1167 mhz 7950 keeps up with a 1300mhz GTX670. A 1300mhz GTX660Ti cannot overcome 24 ROP / 192-bit bus memory bottlenecks against a stock GTX670.

    Thus, by definition HD7950 OC > GTX660Ti OC.
  • Galidou - Tuesday, August 21, 2012

    Good calculation there, logical, but still we shouldn't forget about nvidia's advantage. Not everyone is overclocking like enthusiasts do. For anyone not fiddling with clocks and voltages, Nvidia is the clear choice. An overclocked 7950 might be better, but even with an aftermarket cooler you'll need a well ventilated case. Crossfire will mean lowering your overclock unless you watercool them...

    And most people don't own a superclocked CPU to get rid of the bottleneck it might cause. So Nvidia, which fares better with lower frequency CPUs, performs better at 1080p where lots of games are simply cpu limited unless you've got a beast at 5ghz.
  • CeriseCogburn - Saturday, August 25, 2012

    It's not a good calculation by the amd fanboy - I went to his forum link, then to the review he linked, and saw the 660Ti SMASHING his 7950 black edition $350+ card to bits.
    He lists one game then a bunch of power and heat charts and goes on a PR selling spree... boy it's amazing... talk about obsessed fanboyism...
  • Galidou - Tuesday, August 28, 2012

    His example shows the 7950, once overclocked, getting to the level of a gtx 660 ti, and that game is battlefield 3, which has always been better on Nvidia.
  • CeriseCogburn - Wednesday, August 29, 2012

    Okay, so an expensive 7950, or an aftermarket HS, or water cooling, and expensive air positive case with lots of fans, then a healthy PS for the extra voltage, then endless twiddling and driver hacks for stability.
    So +$50 on the cooler or card, $50 or $100 extra on the case and fans, then add in $100 for the CPU to be able to take advantage...
    After all that dickering around and dollars, just amd fan boy out and buy a rear exhaust 7970 and be cheaper and somewhat stable at stock.
    Right ? I mean WTH.
    Then we have the less smooth problem on the 78xx 79xx series vs nVidia - the gaming is not as smooth - plus you don't have adaptive v-sync - another SMOOTH OPERATOR addition.
    These are just a few reasons why the amd prices have plummeted.
    I suppose now if you go amd you shouldn't worry much about losing a lot of value quickly, but for 8 months we took a giant hit in the wallet for buying AMD, now our cards are worth CRAP compared to what we paid for them a short time ago - with the 6 months+ of crap driver support.
    It's great - yeah just great - amd did such a great job.
  • anubis44 - Thursday, August 23, 2012

    Jesus Christ, TheJian, you wrote a goddam Russian novel when you could have just come out and merely said that you want to have Jen Hsun's baby, you silly nvidiot.

    Nvidia has simply pushed the default clocks on their cards much harder than AMD. So what? So AMD leaves more o/c'ing performance on the table. Big deal. That's hardly a decisive, knock-out blow for nvidia. As a matter of fact, I'm selling my Gigabyte Windforce GTX670 2Gb tomorrow (gorgeous cooler setup by the way, and utterly silent) because, for $400, it only beats a 7870 by about 3-5FPS on my 3 monitor setup (5040x1050 resolution) in most games, and the thought that I could buy a 7870 for $240, or a Gigabyte 7950 card with 3Gb of memory for $300 made me ill. Long and short of it: if you're playing at 1920x1080, the GTX660 Ti looks pretty good (except for those AMD-optimized games) but if you're running 2560x1080 or higher, AMD's 3Gb-equipped 7950 is going to have the extra memory and muscle to keep your minimum frame rates playable, while the 2Gb GTX660 Ti is going to choke.

    Besides, I'm sick of nvidia's shitty 3 monitor driver support. Every time I update the video driver, I have to perform brain surgery to get 2 of the monitors to come back up again. On the other hand, the Asus Direct CUII (another outstanding cooler) 7850 I had temporarily about 2 months ago for a few days drove my 3 monitor setup instantly, and setup took about 2 minutes. The AMD driver even 'guessed' the bezel compensation accurately the first time, and played Diablo III at a solid 60FPS at 5040x1050 on one card with all the quality settings at maximum. That card now costs $189.99 after factory rebate here in Canada:

    http://www.ncix.com/products/?sku=69494&vpn=GV...

    I think nVidia is going to have a HUGE fight on its hands.
  • TheJian - Friday, August 24, 2012

    Most people, if you'd read all that, don't run over 1920x1200. The amount who do is <2% and you have to spend a lot to do it reliably over 30fps as hardocp showed etc. You made my point. It's great where 98% of us use it. Which is pretty much what the walls have been saying :) I won't apologize for being complete ;) But feel free to call me wordy, I'll accept it. Out of the 5 games tested at hardocp they found 2 (batman/witcher2) that hit 10-15fps (for a while) and 16fps on the 7950. You'd have to double it to have a good time in those games, which was my point. These are for lower res, specifically 1920x1200 or less. At which both do a great job. No disputing that.

    People usually resort to calling you nvidiot, and having the ceo's baby (really?...I'm a dude) when they lack an effective opposing opinion. Thanks for both. Look in the mirror. ;)
  • CeriseCogburn - Thursday, August 23, 2012

    No need to pretend.
    TXAA = A GREAT NEW REVOLUTIONARY ANTI ALIASING FEATURE

    It's way better than morphological aa, the CRAP amd spewed out while you cheered.
  • dishayu - Thursday, August 16, 2012

    Why does the URL say "the-geforce-gtx-670-review"? Anyone care to explain?
