Meet The GeForce GTX 660

For virtual launches it’s often difficult for us to acquire reference-clocked cards since NVIDIA doesn’t directly sample the press with reference cards, and today’s GeForce GTX 660 launch is one of those times. The problem stems from the fact that NVIDIA’s partners are hesitant to offer reference-clocked cards to the press: they don’t want to lose to factory overclocked cards in benchmarks, which is an odd (but understandable) concern.

For today’s launch we were able to get a reference clocked card, but in order to do so we had to agree not to show the card or name the partner who supplied the card. As it turns out this isn’t a big deal since the card we received is for all practical purposes identical to NVIDIA’s reference GTX 660, which NVIDIA has supplied pictures of. So let’s take a look at the “reference” GTX 660.

The reference GTX 660 is in many ways identical to the GTX 670, which comes as no great surprise given the similar size of their PCBs, which in turn allows NVIDIA to reuse the same cooler with little modification. Like the GTX 670, the reference GTX 660 is 9.5” long, with the PCB itself accounting for just 6.75” of that length while the blower and its housing make up the rest. The length of retail cards will vary between these two figures, as partners like EVGA will be implementing their own blowers similar to NVIDIA’s, while other partners like Zotac will be using open air coolers not much larger than the reference PCB itself.

Breaking open one of our factory overclocked GTX 660 cards (specifically, our EVGA GTX 660 SC, which uses the NVIDIA reference PCB), we can see that while the GTX 670 and GTX 660 are superficially similar on the outside, the PCBs themselves are quite different. The biggest change is that while the GTX 670 PCB made the unusual move of putting the VRM circuitry toward the front of the card, the GTX 660 PCB once more puts it at the far end. With the GTX 670 this was a design choice to get the PCB down to 6.75”, whereas the GTX 660 requires so little VRM circuitry in the first place that it’s no longer necessary to move that circuitry to the front of the card to find the necessary space.

Looking at the GK106 GPU itself, we can see that not only is the GPU smaller than GK104, but the entire GPU package has been reduced in size as well. Meanwhile, though it makes no functional difference, GK106 is a bit more rectangular than GK104.

Moving on to the GTX 660’s RAM, we find something quite interesting. Up until now NVIDIA and their partners have regularly used Hynix 6GHz GDDR5 memory modules, with that specific RAM showing up on every GTX 680, GTX 670, and GTX 660 Ti we’ve tested. The GTX 660, meanwhile, is the very first card we’ve seen equipped with Samsung’s 6GHz GDDR5 memory modules, marking the first time we’ve seen non-Hynix memory on a GeForce GTX 600 card. Truth be told, though it has no technical implications, we’ve seen so many Hynix-equipped cards from both AMD and NVIDIA that it’s refreshing to see there is in fact more than one GDDR5 supplier in the marketplace.

For the 2GB GTX 660, NVIDIA has outfitted the card with eight 2Gb memory modules, four on the front and four on the rear. Oddly enough there aren’t any vacant RAM pads on the 2GB reference PCB, so it’s not entirely clear what partners are doing for their 3GB cards; presumably there’s a second reference PCB built specifically to house the 12 memory modules needed for 3GB cards.
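The capacity math is straightforward (a quick sketch; note that GDDR5 module density is quoted in gigabits, and the 12-module figure for the 3GB variant is inferred from that density rather than from a PCB NVIDIA has shown):

```python
def total_capacity_gb(modules: int, density_gbit: int = 2) -> float:
    """Total card memory in gigabytes, given the module count and
    per-module density in gigabits (8 bits per byte)."""
    return modules * density_gbit / 8

print(total_capacity_gb(8))   # 2GB reference card: 8 modules x 2Gb
print(total_capacity_gb(12))  # presumed 3GB variant: 12 modules x 2Gb
```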

Elsewhere we can find the GTX 660’s sole PCIe power socket on the rear of the card, responsible for supplying the other 75W the card needs. As for the front of the card, here we can find the card’s one SLI connector, which like previous generation mainstream video cards supports up to 2-way SLI.
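The “other 75W” follows from the PCIe power delivery rules: the x16 slot itself supplies up to 75W, and a single 6-pin PCIe connector supplies up to another 75W. A trivial sketch of the resulting budget (the 150W ceiling comfortably covers the GTX 660’s official 140W TDP):

```python
# Per the PCIe specification, an x16 slot delivers up to 75W and a
# 6-pin auxiliary connector up to another 75W.
SLOT_W = 75
SIX_PIN_W = 75

available = SLOT_W + SIX_PIN_W  # total board power budget in watts
print(available)  # 150
```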

Finally, looking at display connectivity we once more see the return of NVIDIA’s standard GTX 600 series display configuration. The reference GTX 660 is equipped with 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2. Like GK104 and GK107, GK106 can drive up to 4 displays, meaning all 4 ports can be put into use simultaneously.

Comments

  • TemjinGold - Thursday, September 13, 2012 - link

    "For today’s launch we were able to get a reference clocked card, but in order to do so we had to agree not to show the card or name the partner who supplied the card."

    "Breaking open a GTX 660 (specifically, our EVGA 660 SC using the NV reference PCB),"

    So... didn't you just break your promise as soon as you made it AND show a pic of the card right underneath?
  • Sufo - Thursday, September 13, 2012 - link

    Haha, shhhh!
  • Homeles - Thursday, September 13, 2012 - link

    Reading comprehension is such an endangered resource...

    If it's the super clocked edition, it's obviously not a reference clocked card.
  • jonup - Thursday, September 13, 2012 - link

    Exactly my thoughts.
  • Ryan Smith - Thursday, September 13, 2012 - link

    Homeles is correct. That's one of the cards from the launch roundup we're publishing later today. The reference-clocked GTX 660 we tested is not in any way pictured (I'm not quite that daft).
  • knutjb - Saturday, September 15, 2012 - link

    No matter what you try to say, it still reads poorly. It should be blatantly obvious up front which card was which, which the article wasn't. I shouldn't have to dig when scanning through.

    Also, you're picking it as the better choice over a card that has been out how long, over slight differences... If NVIDIA really wanted me to say "wow, I'll buy it now," the card would have been no more than 199 at launch. 10 bucks under is the best they can do for being late to the party? And you bought the strategy. I have been equally disappointed with AMD when they have done the same thing.
  • MrSpadge - Sunday, September 16, 2012 - link

    When reading AnandTech articles it's almost always safe to assume "he actually means what he's saying". Helps a lot with understanding.
  • thomp237 - Sunday, September 23, 2012 - link

    So where is this roundup? We are now 10 days on from your comment and still no signs of a roundup.
  • CeriseCogburn - Friday, October 12, 2012 - link

    I have been wondering where all the eyefinity amd fragglers have gone to, and now I know what has occurred.

    Eyefinity is Dead.

    These Kepler GPUs from nVidia all can do 4 monitors out of the box. Sure you might find a cheap version with 3 ports, whatever - that's the minority.

    So all the amd fanboys have shut their fat traps about eyefinity, since nVidia surpassed them with A+ 4 easy monitors out of the box on all the Keplers.

    Thank you nVidia dearly for shutting the idiot pieholes of the amd fanboys.

    It took me this long to comment on this matter because nVidia fanboys don't all go yelling in unison sheep fashion about stuff like the little angry losing amd fans do.

    I have also noticed all the reviewers who are so used to being amd fan rave boys themselves almost never bring up multimonitor and abhor pointing out nVidia does 4 while amd only does 3 except in very expensive special cases.

    Yeah that's notable too. As soon as amd got utterly and totally crushed, it was no longer a central topic and central theme for all the review sites like this place.

    That 2 week Island vacation every year amd puts hundreds of these reporters on must be absolutely wonderful.
    I do hope they are treated very well and have a great time.
  • EchoOne - Wednesday, November 21, 2012 - link

    LOL dude, the 660 Ti vs the 7950 in eyefinity would get destroyed. I know this because my friend has a comp build with a Phenom 965BE @ 4.2GHz and a 660 Ti with 16GB of RAM (i built this for him) and i have an FX-6100 @ 4.7GHz, 16GB RAM and a 7950, and i run a triple monitor setup

    https://www.youtube.com/watch?v=ZRXGveviruw&fe...

    And his 660 Ti DIED trying to play the games at that res and at the same settings as i do. He had to take his graphics settings in, say, GTA4 down from max settings to about medium and high (i run very high)

    So yeah, sure it can run a couple monitors out of the box, but same with eyefinity. And trust me, their nvidia Surround is not as polished as eyefinity. But they get props for trying.
