110 Comments
clumsyalex - Friday, June 22, 2012 - link
In the first chart, the regular 7970 is priced higher than the GHz Edition; the second chart shows it as lower, however.

Ryan Smith - Friday, June 22, 2012 - link
Actually those are a list of launch prices up top. The 7970 launched at $550, which is indeed higher than the $500 launch price of the 7970GE.

EnerJi - Friday, June 22, 2012 - link
It's confusing and misleading. The first thing I thought when I saw it was that you had accidentally reversed the prices between the two models.

Iketh - Friday, June 22, 2012 - link
That certainly isn't what I thought... I understood what was being presented to me.

CeriseCogburn - Saturday, June 23, 2012 - link
I love how amd has a birthday for tahiti at 6 months....Why wait a year for a birthday when you're a lying sack of crap corporate monster rip off crummy drivers fan boy mass brainwash co ?
Heck, two birthdays a year !!! amd is so great, they get two birthdays a year !
silverblue - Monday, June 25, 2012 - link
People in the first few months of a relationship like to mention anniversaries a lot despite the (rather obvious) point that the word denotes a yearly period. "Milestone" would be more appropriate, though it does sound less glamorous and perhaps a bit pessimistic (well, in the case of relationships, anyway). Might even seem cynical.

Captmorgan09 - Friday, June 22, 2012 - link
Just read the chart and it's not confusing... I did a double take the first time I glanced at it, but when I actually read it it made perfect sense. :)

Ryan Smith - Friday, June 22, 2012 - link
In case it's not clear, since we have a price comparison chart at the bottom, the purpose of the prices up top is to help describe the cards. The fact that the 7970GE is listed for $500 next to the $550 7970, for example, is to make it clear that it's launching at a lower price than the 7970. It helps offer some perspective on capabilities and the market segment it's designed for.

That said, we can always get rid of it if it's a problem.
QChronoD - Friday, June 22, 2012 - link
Could I suggest adding when they launched on the line right above the prices? I can easily see how that is confusing, but also knowing how old each generation is would be useful to see.

Ryan Smith - Friday, June 22, 2012 - link
Now that's an excellent idea!

Belard - Friday, June 22, 2012 - link
Agreed.

Articuno - Friday, June 22, 2012 - link
A whole new card launch and yet another pair of similarly named but differently performing products, because they changed a few numbers that anyone can change in several free, easily available programs.

I suppose they can do this because you can actually buy their products though, unlike the 6XX series.
ExarKun333 - Friday, June 22, 2012 - link
Yeah, tough to find 6xx products indeed. There is something called the 'internet' you could check out. Your buddy who posted for you might be able to help you out. ;)

Pantsu - Friday, June 22, 2012 - link
I doubt any AIB will actually release GE cards with reference cooling. Most likely they will be custom cooled, so the loudness of the reference card is a bit of a moot point.

It's good to see some decent driver improvements from AMD. I'm still quite happy about 7970 performance at 5760x1080, and it's enough for most games when OC'd. It would be interesting to see, though, whether the GE has improved the max OC. Most likely it's no better, and you'll be better off buying an old custom 7970 for a good price and OC'ing it to the same levels as the GE.
dagamer34 - Friday, June 22, 2012 - link
The GE chips are better binned parts; one would assume that they have a bit more room for higher clocks than the normal 7970 parts. Certainly the average overclock will be higher.

CeriseCogburn - Saturday, June 23, 2012 - link
So we can deduce that the prior 7970 overclocks were sucking down an even larger amount of enormous electrical power, as those chips are of a lower bin.

I guess we need an overclocked power suction chart with an extended table for the amd housefire 7970.

Any savings on card price or a few frames at a resolution almost no one owns will be gobbled up by your electric bill every month for years - save 1 watt or 9 watts at extended idle, but when you game it's 100+ watts and beyond with the overclocked 7970 - maybe they should be $300 with 3 games.
silverblue - Monday, June 25, 2012 - link
Well, it works both ways. You won't always be gaming; in addition, there's all that compute hardware that, if properly harnessed, would save you money over competing solutions because you'd get the job done quicker. It used to be pointless to consider using anything for compute that wasn't a Quadro, Tesla or even FirePro; however, those days are coming to an end.

Having a 7970 will make sense for compute if that's your bag (there's a reason for the die size plus the extra memory and bus width), but this time, NVIDIA enjoys a performance/watt advantage which might go unchallenged for a while. Unless, of course, that extra hardware on the 7970 is properly leveraged; future games, perhaps?
ltcommanderdata - Friday, June 22, 2012 - link
So do we think this will encourage nVidia to release a GeForce GK110-based product in the next few months rather than restrict it to Tesla?

PsiAmp - Friday, June 22, 2012 - link
Nvidia isn't holding GK110 up its sleeve waiting for something. It is unfinished in the first place, and there's no manufacturing capacity to produce such a large chip. Nvidia still struggles to fix the GK104 design to get good yields. GK110 would be impossible to produce since it is twice as big and as such will have at least 4 times lower yield.

The server market is not only much more profitable, it operates on a contract basis. Nvidia will start to produce Tesla K20 in Q4 2012.

IF(?) a desktop card based on GK110 hits the market, it won't be sooner than Q1 2013. And that is not something you can really change.
silverblue - Friday, June 22, 2012 - link
"Of course this isn’t the first time we’ve had a hot & loud card on our hands – historically it happens to NVIDIA a lot – but when NVIDIA gets hot & loud they bring the performance necessary to match it. Such was the case with the GTX 480, a notably loud card that also had a 15% performance advantage on AMD’s flagship. AMD has no such performance advantage here, and that makes the 7970GE’s power consumption and noise much harder to justify even with a “performance at any cost” philosophy."Very true, however the power consumption and heat difference between the 5870 and the 480 was definitely more pronounced.
The 680 is an incredible card, no doubt about it. It may not win in some titles, but it's hardly anywhere near unplayable either. AMD being right there at the very high end is fantastic but unless titles truly make use of GCN's compute ability, the extra power and noise are going to be hard to swallow. Still, I'd own either. :P
piroroadkill - Friday, June 22, 2012 - link
While the noise is bad - the manufacturers are going to spew out non-reference, quiet designs in moments, so I don't think it's an issue.

silverblue - Friday, June 22, 2012 - link
Toms added a custom cooler (Gelid Icy Vision-A) to theirs which reduced noise and heat noticeably (about 6 degrees C and 7-8 dB). Still, it would be cheaper to get the vanilla 7970, add the same cooling solution, and clock it to the same levels; that way, you'd end up with a GHz Edition-clocked card which is cooler and quieter for about the same price as the real thing, albeit lacking the new boost feature.

ZoZo - Friday, June 22, 2012 - link
Would it be possible to drop the 1920x1200 resolution for testing? 16:10 is dead; 1080p has been the standard for high definition on PC monitors for at least 4 years now, so it's more than time to catch up with reality... Sorry for the rant, I'm probably nitpicking anyway...

Reikon - Friday, June 22, 2012 - link
Uh, no. 16:10 at 1920x1200 is still the standard for high-quality IPS 24" monitors, which is a fairly typical choice for enthusiasts.

paraffin - Saturday, June 23, 2012 - link
I haven't been seeing many 16:10 monitors around these days. Besides, since AT even tests iGPU performance at ANYTHING BUT 1080p, your "enthusiast choice" argument is invalid. 16:10 is simply a l33t factor in a market dominated by 16:9. I'll take my cheap 27" 1080p TN's spaciousness and HD content nativeness over your pricey 24" 1200p IPS' "quality" any day.

CeriseCogburn - Saturday, June 23, 2012 - link
I went over this already with the amd fanboys.

For literally YEARS they have had harpy fits on five and ten dollar card pricing differences, declaring amd the price/perf queen.
Then I pointed out nVidia wins in 1920x1080 by 17+% and only by 10+% in 1920x1200 - so all of a sudden they ALL had 1920x1200 monitors, they were not rare, and they have hundreds of extra dollars of cash to blow on it, and have done so, at no extra cost to themselves and everyone else (who also has those), who of course also chooses such monitors because they all love them the mostest...
Then I gave them egg counts, might as well call it 100 to 1 on availability if we are to keep to their own hyperactive price perf harpying, and the lowest available higher rez was $50 more, which COST NOTHING because it helps amd, of course....
I pointed out Anand pointed out in the then prior article it's an ~11% pixel difference, so they were told to calculate the frame rate difference... (that keeps amd up there in scores and winning a few they wouldn't otherwise).
Dude, MKultra, Svengali, Jim Wand, and mass media, could not, combined, do a better job brainwashing the amd fan boy.
Here's the link, since I know a thousand red-winged harpies are ready to descend en masse and caw loudly in protest...
http://translate.google.pl/translate?hl=pl&sl=...
1920x1080: " GeForce GTX680 is on average 17.61% more efficient than the Radeon 7970.
Here, the performance difference in favor of the GTX680 are even greater"
So they ALL have a 1920x1200, and they are easily available, the most common, cheap, and they look great, and most of them have like 2 or 3 of those, and it was no expense, or if it was, they are happy to pay it for the red harpy from hades card.
silverblue - Monday, June 25, 2012 - link
Your comparison article is more than a bit flawed. The PCLab results, in particular, have been massively updated since that article. Looks like they've edited the original article, which is a bit odd. Still, AMD goes from losing badly in a few cases to not losing so badly after all, as the results on this article go to show. They don't displace the 680 as the best gaming card of the moment, but it certainly narrows the gap (even if the GHz Edition didn't exist).

Also, without a clear idea of specs and settings, how can you just grab results for a given resolution from four or five different sites for each card, add them up and proclaim a winner? I could run a comparison between a 680 and 7970 in a given title with the former using FXAA and the latter using 8xMSAA, doesn't mean it's a good comparison. I could run Crysis 2 without any AA and AF at all at a given resolution on one card and then put every bell and whistle on for the other - without the playing field being even, it's simply invalid. Take each review on its own merits, because at least then you can be sure of the test environment.
As for 1200p monitors... sure, they're more expensive, but it doesn't mean people don't have them. You're just bitter because you got the wrong end of the stick by saying nobody owned 1200p monitors then got slapped down by a bunch of 1200p monitor owners. Regardless, if you're upset that NVIDIA suddenly loses performance as you ramp up the vertical resolution, how is that AMD's fault? Did it also occur to you that people with money to blow on $500 graphics cards might actually own good monitors as well? I bet there are some people here with 680s who are rocking on 1200p monitors - are you going to rag (or shall I say "rage"?) on them, too?
If you play on a 1080p panel then that's your prerogative, but considering the power of the 670/680/7970, I'd consider that a waste.
FMinus - Friday, June 22, 2012 - link
Simply put: no! 1080p is the second worst thing that happened to the computer market in recent years. The first worst thing was the phasing out of 4:3 monitors.
Tegeril - Friday, June 22, 2012 - link
Yeah, seriously, keep your 16:9 bad color reproduction away from these benchmarks.

kyuu - Friday, June 22, 2012 - link
16:10 snobs are seriously getting out-of-touch when they start claiming that their aspect ratio gives better color reproduction. There are plenty of high-quality 1080p IPS monitors on the market -- I'm using one.

That being said, it's not really important whether it's benchmarked at x1080 or x1200. There is a negligible difference in the number of pixels being drawn (one of the reasons I roll my eyes at 16:10 snobs). If you're using a 1080p monitor, just add anywhere from 0.5 to 2 FPS to the average FPS results from x1200.

Disclaimer: I have nothing *against* 16:10. All other things being equal, I'd choose 16:10 over 16:9. However, with 16:9 monitors being so much cheaper, I can't justify paying a huge premium for a measly 120 lines of vertical resolution. If you're willing to pay for it, great, but kindly don't pretend that doing so somehow makes you superior.
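(For reference, the pixel arithmetic behind the "add 0.5 to 2 FPS" rule of thumb above - a minimal sketch in Python; the 60 FPS starting point is purely an illustrative assumption, and the scaling only holds when a game is fully GPU-bound:)

```python
# Pixel counts of the two resolutions under discussion.
px_1200 = 1920 * 1200   # 2,304,000 pixels
px_1080 = 1920 * 1080   # 2,073,600 pixels

ratio = px_1200 / px_1080
print(f"1920x1200 draws {(ratio - 1) * 100:.1f}% more pixels than 1920x1080")  # ~11.1%

# Rough upper bound: a card averaging 60 FPS at 1920x1200 would land near
# 60 * 1.111 ≈ 66.7 FPS at 1920x1080. Real gains are smaller, since games
# are rarely 100% fill-rate/shader bound - hence the modest rule of thumb.
fps_1200 = 60.0
print(f"~{fps_1200 * ratio:.1f} FPS at 1080p in the ideal GPU-bound case")
```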
CeriseCogburn - Saturday, June 23, 2012 - link
They can justify it; they are the amd fanboy. EVERY DOLLAR counts when it comes to card pricing; five or ten bucks makes amd the WINNER !!!!!!!! and greatest card value ever for enthusiasts !!!!!!!!!!!

But then, moments later, the nearly unavailable and much more expensive monitor is all theirs, at their bosom (moments before, they harped amd wins in high rez triple screen no matter the data) - now suddenly they have a 1920x1200 IPS or whatever...
Here's why...
1920x1080: " GeForce GTX680 is on average 17.61% more efficient than the Radeon 7970.
Here, the performance difference in favor of the GTX680 are even greater "
1920x1200: " GeForce GTX680 is on average 10.14% more efficient than the Radeon 7970.
At slightly higher resolution appears to have slightly worse performance of the new card (compared to 1920x1080). "
That's an over 7% performance difference overall... nVidia still kicks amd's lousy second placer, but it's not SO embarrassing at 17%+....
See, now they all love 1920x1200 and will DEMAND as hyper-harpies that anand keep the monitor rez as is...
In the end it will just be anand "listening" to its fan base.... R O F L
Dude, they COULD just run their 1920x1200 in 1920x1080 for the benches - it's not hard at all - but you know... amd doesn't look better than crappy as heck then..
CeriseCogburn - Saturday, June 23, 2012 - link
link (since the descending swarm won't see it above): http://translate.google.pl/translate?hl=pl&sl=...
silverblue - Monday, June 25, 2012 - link
I couldn't care less which it is as long as the image is good. I do think you're downplaying the framerate advantage of 1080p over 1200p though, as we're talking an extra 11% screen area going from one to the other.

1200p used to be far more common, and Apple are one of the manufacturers keeping it alive (along with 4:3 ratios).
Ananke - Friday, June 22, 2012 - link
Real good dudes use 1920*1200.

silverblue - Monday, June 25, 2012 - link
Nah. With a card like these, I'd rather use 2560xwhatever. :P

Zok - Friday, June 22, 2012 - link
Maybe I missed it in the article, but does the lack of hardware changes mean that existing 7970s can be "upgraded" by being flashed with a 7970GE BIOS (so long as they can hit the clock speeds)?

haukionkannel - Friday, June 22, 2012 - link
No, it cannot be upgraded... So what else has been changed?
http://www.tomshardware.com/reviews/radeon-hd-7970...
Zok - Friday, June 22, 2012 - link
Well, that's disappointing. Wonder if there are hardware changes or a workaround is possible.

haukionkannel - Friday, June 22, 2012 - link
You said that you may use some extra settings for Skyrim... How about using a popular extra-large texture map upgrade? It would be more punishing to use those larger texture maps, and in Skyrim, like Oblivion before it, those texture maps are quite popular among users of more powerful graphics cards!

milkod2001 - Friday, June 22, 2012 - link
7950/70: games + computing / trade-off: noise + power, less efficient
670/80: games / trade-off: weak in computing

The only card I find to be a good choice would be the 670, but it needs to get to the $300-350 price level.
piroroadkill - Friday, June 22, 2012 - link
There will be custom designs with nice quiet coolers flooding the market in no time.

raghu78 - Friday, June 22, 2012 - link
http://www.tomshardware.com/reviews/radeon-hd-7970...
"There’s a silver lining on this one, though. Ahead of this review, I let AMD know about our acoustic concerns and the company claims that most partner boards will employ third-party cooling, not its reference configuration."

So noise is not an issue at all. Cards like Sapphire with Dual-X, Gigabyte with Windforce, and PowerColor with PCS+ have good cooler designs. Power draw will be higher, but the Radeon HD 7970 Ghz edition frankly more than makes up for that with its performance at 1600p and in multi-monitor setups.

If you are on a 1080p monitor and want perf/watt, price/perf and a cooler setup, go for a custom GTX 670. For the rest who have 1600p or multi-monitor, frankly there is only one option - a custom Radeon HD 7970 card or a custom Radeon HD 7970 Ghz edition.
CeriseCogburn - Saturday, June 23, 2012 - link
Hogwash, the GTX 680's and GTX 670's are still SMOOTHER, and more enjoyable at high rez and multi monitor. Check the hundreds of reviews.
Tuvok86 - Friday, June 22, 2012 - link
AMD should have released a balanced 7970 in the first place, somewhere halfway in performance between the 7970 and the GE. Now they have an over-conservative card and a power-hungry monster.

Reikon - Friday, June 22, 2012 - link
On page 3: "For AMD the 7970GE will be launching with the Catalyst 10.7 beta, while NVIDIA has released the 304.48 betas for the entire lineup."
I think you mean 12.7
Ryan Smith - Friday, June 22, 2012 - link
Whoops. Thanks.

fausto412 - Friday, June 22, 2012 - link
Will these new PowerTune and overclocking advancements trickle down to 6900 series cards to unleash more performance safely?

What would prevent AMD from applying these new advancements to the 6990?
kyuu - Friday, June 22, 2012 - link
These features would require a new BIOS. As far as I'm aware, AMD does not support flashing their cards with a new BIOS. Anyway, there's nothing there that you can't achieve via normal overclocking (aside from the slightly better chip binning).
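(To put numbers on "achieve via normal overclocking" - the clock gaps between the two cards, using the launch clocks of 925 MHz core / 5.5 Gbps memory for the stock 7970 versus 1050 MHz boost / 6 Gbps for the GE. A quick sketch:)

```python
# Launch clocks: stock 7970 vs. 7970 GHz Edition (GE core figure is its boost clock).
stock = {"core_mhz": 925, "mem_gbps": 5.5}
ge    = {"core_mhz": 1050, "mem_gbps": 6.0}

for key in stock:
    uplift = (ge[key] / stock[key] - 1) * 100
    print(f"{key}: {stock[key]} -> {ge[key]} (+{uplift:.1f}%)")

# core: +13.5%, memory: +9.1% - deltas most custom-cooled 7970s can cover
# with a manual overclock, which is the point being made above.
```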
silverblue - Friday, June 22, 2012 - link
...Another Mention (of) Deterministic, it seems.

gonchuki - Friday, June 22, 2012 - link
Did you try reaching AMD to comment on the rather low performance ceiling in Skyrim? It looks as if their drivers are way more CPU-hungry than Nvidia's, and that's why they are getting capped at a lower rate.

Maybe that's what usually hinders performance in other CPU-limited titles like WoW?
AnnihilatorX - Friday, June 22, 2012 - link
"All the same AMD has also boosted their memory clocks from 5.5GHz to 6GHz, which will give the card 9% more memory bandwidth with it needs it."=>
"when it needs it"
raghu78 - Friday, June 22, 2012 - link
Ryan, as you mentioned Dirt Showdown will take the place of Dirt 3 in your test suite, I would like to suggest that a few more games be changed. Max Payne 3 and Alan Wake are good changes. Maybe Crysis Warhead could be replaced by Alan Wake, and Portal 2 by Max Payne 3. Another very demanding game which could find a place is Witcher 2 Enhanced Edition. Focusing on games released in the last 12 months in your test suite helps prospective buyers/gamers decide based on performance in recent titles, which they will most probably be playing.
HighTech4US - Friday, June 22, 2012 - link
You are showing the OCed version of the 7970e in comparison to the stock GTX680.

For fairness, the stock GTX680 should be overclocked also.
kyuu - Friday, June 22, 2012 - link
There's nothing fair or unfair about overclocking. Why do people bring this up every time there's one of these articles...

CeriseCogburn - Saturday, June 23, 2012 - link
Good to hear it this time, as all the prior moaning has been amd fans wailing that the 680 core is already overclocked out of the box! L M H O
I just want to see the 470, 480, 570, and 580 at the equivalent radeon clocks for those series, and see amd get SPANKED even more in those series.... to be fair, of course....
Yes, thanks so much for saying "you guys always say it" - no - it's not "you guys" - it's the amd fan boys !
This time they didn't moan and complain about fairness, because amd got beat anyway, and they wouldn't if it won, which it did not, I must point out, feeling the overwhelming need to state, again, and again.
This sums it up well: AMD loses, as usual
seapeople - Friday, June 22, 2012 - link
They're also comparing the OCed version of the 7970e to the stock 7970e. That seems unfair to me. To be fair, you should only compare the OCed 7970e to itself, the OCed 7970e.

Arbie - Friday, June 22, 2012 - link
It's probably too late now, but there were at least three more places where you could have used "performance crown" in the opening paragraphs.
CeriseCogburn - Saturday, June 23, 2012 - link
Amd LIED, with their false advertising about this card - their hot loud slow housefire...

Amd is an evil corporate monster who lies to the little children they sell their products to (and to soccer Mommies who actually pay the card price to keep the lies going - granted they pay less than Abu Dhabi oil sheiks).
Wreckage - Friday, June 22, 2012 - link
Another successful paper launch. I'm not sure why they could not wait until they had actual product available.
behrouz - Friday, June 22, 2012 - link
AMD Radeon HD 7970 GHz Edition = $499
AMD Radeon HD 7970 = $549? Or $449?
behrouz - Friday, June 22, 2012 - link
Never mind, I got it.

Lepton87 - Friday, June 22, 2012 - link
I don't agree that 7970GHz isn't any faster than GTX680.
http://www.techpowerup.com/reviews/AMD/HD_7970_GHz...
Just look at the performance summaries. At 2560x1600 it's clearly the faster card.
Homeles - Friday, June 22, 2012 - link
Keep in mind, every 680 boosts differently. Every site is going to have different opinions because of this.

CeriseCogburn - Saturday, June 23, 2012 - link
That's why a collated average is so helpful.

"Summary of results at a resolution of 2560x1600:
GeForce GTX680 is on average 32.36% more efficient than GeForce GTX580,
GeForce GTX680 is on average 6.39% more efficient than the Radeon 7970."
http://translate.google.pl/translate?hl=pl&sl=...
The GTX 680 wins. It's clear beyond any amd fanboys illusions, wishes, and fantasies, most often stated every time, till the day they croak it.
It's "their opinion" though, so "it's not wrong"... (if that tracks as true for you check your forehead for 3 stamped letters.)
thebluephoenix - Sunday, June 24, 2012 - link
Efficiency? You know that 5870 was far more efficient than GTX480. 6970 also compared well to GTX 580.

Before calling people fanboys, be sure that you aren't one.
For me it's simple, 7970 has good compute performance, and GTX 680 has PhysX.
7970 GE = 7970 OC Edition, still a very good card.
CeriseCogburn - Tuesday, June 26, 2012 - link
It's the wording used by the collator, a foreigner no doubt, "efficiency" you fool, since you didn't check the link. It MEANS FRAME RATE.
Leave it to the retarded, once again, to jump, screech, and FAIL.
thebluephoenix - Thursday, June 28, 2012 - link
Efficiency usually means energy efficiency: Perf/Watt (or rMAX/rPeak, on the Top500 site). Except for Google-translated Polish pages, obviously.
Frame rate is speed, so the card is faster, not more efficient.
Go now, be (nV)idiot somewhere else.
CeriseCogburn - Saturday, June 30, 2012 - link
It is ALSO more efficient. How clueless are you still? Why do clueless Clouseaus respond?

Look, if you ever decide to click the link and take a gander for an hour or two (my estimate of how long it would take for you to get a rounded opinion of the massive database of the most popular reviewers concerning these two cards), don't get back to me.
A gigantic thank you would be nice but I'm not expecting it.
Maybe silverblue needs a friend too; then you can spew name-calling together and giggle - that is likely the extent of the mental capacities - so have at it.
silverblue - Friday, June 29, 2012 - link
Yet amusingly, you failed to point out the error of the author's ways before somebody here pulled you up on it... I doubt that efficiency is a word that can be mis-translated; the author just used the wrong term. The very fact that you quoted two lines with the same incorrect term proves that you were happy enough to treat it - as is so often in your case - as factual. If anything, the 680 is probably something in the region of 10-15% more efficient per frame than the 7970 based off the collated results on that article, notwithstanding the fact that drivers have been significantly revised for both architectures since then.
You also stated that the article was '"their opinion" though, so "it's not wrong"' but you slate everybody else's conflicting opinions as wrong. Am I the only person seeing an issue with this approach?
I'm really confused as to why you even bother to visit here except to be a class-A troll, and I'm going to take some of my own advice and flat out ignore you from now on unless you actually say something of any use. Ordinarily, I wouldn't tell others what to do but on this occasion, I implore them to follow suit. We should put you in a room with Beenthere just for the hell of it.
CeriseCogburn - Saturday, June 30, 2012 - link
You're gone in the head, dude. The article, which you still obviously never looked at (as it will crush your amd fan heart), collates reviews from around the web, including this site's.
It's not an opinion, it's FACTS, as best we can get them, in one BIG mathematically deduced pile, and the word is meant to be FRAME RATES, which of course is all you amd fanboys claim you care about - unless of course you were spewing about Eyefinity without 3 monitors and no $100 adapter, which took a year and a half to come down to $35 and still wasn't available...
Just face the facts for once, like a man.
raghu78 - Friday, June 22, 2012 - link
Correct. For gaming at the highest settings on ultra-high-resolution single monitor and multi-monitor configurations there is only one leader in the market: the Radeon HD 7970 GHz Edition.
http://www.hardwarecanucks.com/forum/hardware-canu...
Look at the 2560x1600 extreme and 5760x1080 performance averages. The HD 7970 GHz Edition is faster.
From the review
"In our opinion and with all other things being equal, the HD 7970 GHz Edition is the card to have for ultra high resolution gaming"
CeriseCogburn - Saturday, June 23, 2012 - link
From the review: "What we have here is a statistical tie so the consumer’s performance-oriented choice will ultimately come down to brand preference."
Yes, of course... cherry picking again... so take a hot, loud, housefire, electricity-sucking, earth-destroying sack of driver crap... it's the best card, of course...
That's great... all the penny pinchers here are right on it, I'm sure...
silverblue - Monday, June 25, 2012 - link
Fermi? Low blow, I admit.
CeriseCogburn - Tuesday, June 26, 2012 - link
The FERMI's performance blew the amd card to shreds. In this case, the housefire amd card sucks at the low end of everything, an epic fail on every single metric, and amd has crap for compute software support so they lose there as well, just like any card loses when its driver software sucks in games.
Worse yet, amd often takes years to fix anything, if ever...then drops support.
Fermi WON when it "insulted you".
Housefire amd loses everything - total freakin LOSER.
raghu78 - Tuesday, June 26, 2012 - link
FYI there are reviews showing much bigger wins:
http://www.hardware.fr/articles/869-18/recapitulat...
A 10-12% performance lead. So it's not as simple as you think. The fact of the matter is the Nvidia GTX 680 is losing the majority of games - Deus Ex, Alan Wake, Anno 2070, Crysis 2, Witcher 2, Witcher 2 EE, Dirt 3, Skyrim, Dirt Showdown, Crysis Warhead, Metro 2033. Also, the margins in some of the games are very big - Dirt Showdown (50%), Crysis Warhead (15-25% depending on resolution), Metro 2033 (20%), Witcher 2 (20%), Anno (15%). Nvidia wins Shogun 2 clearly. BF3 is not a consistent win when you compare many reviews. Even Batman AC, which runs better on the GTX 680 with FXAA, runs faster on the HD 7970 GHz Edition with 8X MSAA. So it's clearly a case of you being in denial mode. So go on, keep ridiculing others if that makes you happy.
CeriseCogburn - Tuesday, June 26, 2012 - link
Hey fanboy, your summary page shows the 680, at their weighted average with special importance given to each game, at 127 frames, and the 7970 at 127 frames. LOL - amazing how you got 20% and 50% from EQUAL.
I tell you, the lies and spinning exceed political debate norms.
CeriseCogburn - Tuesday, June 26, 2012 - link
From your link, Dirt Showdown, where you have just claimed a 50% lead for the 7970: "While the GeForce GTX 680 equals the Radeon HD 7970 at 1080p without advanced lighting, its performance dives when that is activated; Nvidia did not get access to this patch soon enough to propose specific optimizations. It will probably take one to two weeks for this to be corrected."
Let's see, 0% or a tie = amd ahead by 50%!!!! according to raghu.
LOL - I guess it's all in your heads - not even the reviewers' own words can rattle the fantasies out of amd's out-of-control fanboys.
http://translate.google.pl/translate?hl=pl&sl=...
I'd say you're trolling, but I think the fanboyism and lack of intellect have you "doing what you believed was correct".
I could be wrong here, for the 1st time ever, though.
meloz - Friday, June 22, 2012 - link
Excellent review, Ryan!
Performance is great, but the noise is a big issue. I hope in future manufacturers pay more attention to this aspect.
kyuu - Friday, June 22, 2012 - link
The 7970GE isn't a card to buy with a reference cooling solution, obviously. With custom cooling solutions, the noise/temp won't be an issue. It's doubtful you'll see many, if any, manufacturers even release this card with the reference cooler.
CeriseCogburn - Saturday, June 23, 2012 - link
So amd sucks, but it's all good. Maybe that should be their new PR slogan.
HighTech4US - Tuesday, June 26, 2012 - link
LOL
or
AMD sucks, but in a good way
Margalus - Friday, June 22, 2012 - link
In the normal clock tests you test 5760x1200, which is a very good thing. Could you not do the same resolution with your overclock tests as well? I would really like to see how triple monitor performance is when overclocked.
Another thing I was wondering: does running triple monitor at 5760x1200 increase power usage of the card or make it run hotter?
Ryan Smith - Friday, June 22, 2012 - link
1) Obviously it's a bit too late for that in this article, but we can certainly look into doing that for future articles.
2) Generally speaking, no. Unless a card is already operating well under its PT limit (in which case the game is likely CPU-bound), increasing the resolution doesn't significantly shift the power consumption. The actual display controllers are effectively "free" at these power levels.
Margalus - Friday, June 22, 2012 - link
Thanks
medi01 - Friday, June 22, 2012 - link
Not retaking the performance crown?
cmdrdredd - Friday, June 22, 2012 - link
Nope, because it doesn't win every single benchmark. Just because it wins one resolution doesn't equate to being the fastest one.
CeriseCogburn - Tuesday, June 26, 2012 - link
If any of these people had been paying any attention at all in between articles (meaning checking on the net) they would already know it takes about 1250 on the 7970 core to equal the 680 OC. 1000 doesn't do it. 1050, nope. 1150, nay.
Hexus already proved the same core speed leaves the 7970 behind. That's already been linked in replies... so here it is, because the amd fans will descend calling names and declaring liar (though they likely saw it before and just can't remember, fanboys that they are, as most brains use the delete key a lot):
http://hexus.net/tech/reviews/graphics/37209-gefor...
RaistlinZ - Friday, June 22, 2012 - link
My regular stock 7970 overclocks higher than this GHz Edition, and does it on lower voltages. At least this makes the prices drop on regular 7970s.
owendingding - Friday, June 22, 2012 - link
I think we can find a 7970 GHz Edition BIOS and put it on a regular 7970 and achieve the same performance. I also assume that if you have a non-reference 7970, like my Gigabyte Windforce, you can get a lower temperature and 680-like performance. I just hope that BIOS is universal.
CeriseCogburn - Tuesday, June 26, 2012 - link
Someone already posted that Tom's shows it is NOT universal and cannot just be flashed to any 7970. NOPE. Amd locks you out, because they are evil.
Ammaross - Friday, June 22, 2012 - link
So, since the 7970 GE is essentially a tweaked, OCed 7970, why not include a factory-overclocked nVidia 680 for fairness? There's a whole lot of headroom on those 680s as well that these benches leave untouched and unrepresented.
elitistlinuxuser - Friday, June 22, 2012 - link
Can it run Pong, and at what frame rates?
Rumpelstiltstein - Friday, June 22, 2012 - link
Why is Nvidia red and AMD green?
Galcobar - Friday, June 22, 2012 - link
Standard graph colouring on Anandtech is that the current product is highlighted in green, specific comparison products in red. The graphs on page 3 for driver updates aren't a standard graph for video card reviews.Also, typo noted on page 18 (OC Gaming Performance), the paragraph under the Portal 2 1920 chart: "With Portal 2 being one of the 7970GE’s biggest defEcits" -- deficits
mikezachlowe2004 - Sunday, June 24, 2012 - link
Compute performance is a big factor in deciding on a purchase as well, and I am disappointed to not see any mention of this in the conclusion. AMD blows nVidia out of the water when it comes to compute performance, and this should not be taken lightly, seeing as games are implementing more and more compute capabilities, among many other things. Compute performance has been growing at a rate higher than ever, and it is very disappointing to see no mention of this in Anand's conclusion.
I use AutoCAD for work all the time, but I also enjoy playing games, and with a workload like this AMD's GPUs provide a huge advantage over nVidia, simply because nVidia's GK104 compute performance is nowhere near that of AMD's. AMD is the obvious choice for someone like me.
As far as the noise and temps go, I personally feel if you're spending $500 on a GPU, and obviously thousands on your system, there is no reason not to spend a couple hundred on water cooling. Water cooling completely eliminates any concern for temps and noise, which should make AMD's card the clear choice. Same goes for power consumption. If you're spending thousands on a system, there is no reason you should be worried about a couple extra dollars a month on your bill. This is just how I see it. Now don't get me wrong, nVidia has a great card for gaming, but gaming only. AMD offers the best of both worlds, both gaming and compute, and to me this makes the 7000 series the clear winner.
CeriseCogburn - Sunday, June 24, 2012 - link
It might help if you had a clue concerning what you're talking about.
"CAD Autodesk with plug-ins are exclusive on Cuda cores Nvidia cards. Going crossfire 7970 will not change that from 5850. Better off go for GTX580."
" The RADEON HD 7000 series will work with Autodesk Autocad and Revitt applications. However, we recommend using the Firepro card as it has full support for the applications you are using as it has the certified drivers. For the list of compatible certified video cards, please visit http://support.amd.com/us/gpudownload/fire/certifi... "
nVidia works out of the box, amd does NOT - you must spend thousands on Firepro.
Welcome to reality, the real one that amd fanboys never travel in.
spdrcrtob - Tuesday, July 17, 2012 - link
It might help if you knew what you are talking about... CAD, as you infer, is AutoCAD by Autodesk, and it doesn't have any CUDA-dedicated plugins. You are thinking of 3ds Max's method of rendering called iray, and even that is fairly new, from the 2011 release.
There isn't anything else that uses CUDA processors on a dedicated scale unless it's a 3rd-party program or plugin. But not in AutoCAD; AutoCAD barely needs anything. So get it straight.
R-E-V-I-T (with one T) requires more, as there's a rendering engine built in, not to mention it's mostly worked in as a 3D application, unlike AutoCAD, which is mostly used in 2D.
Going Crossfire won't help, because most mid-range and high-end single GPUs (AMD & NVIDIA) will be fine for ANY surface modeling and/or 3D rendering. If you use the application right you can increase performance numbers instead of increasing card count.
All Autodesk products work with any GPU really; there are supported or "certified" drivers and cards, usually only "CAD" cards like FirePros or Quadros.
Nvidia's and AMD's work right out of the box; it just depends on the add-in board partner and build quality, NVIDIA fanboy. If you're going to state facts, then get your facts straight where it matters, not with your self-conceived cute remarks.
Do more research, or don't state something you know nothing about. I have supported CAD and engineering environments and the applications they use for 8 years now, with 5 more years of IT support experience before that.
aranilah - Monday, June 25, 2012 - link
Please put up a graph of the 680 overclocked to its maximum potential versus this card at its maximum OC; that would be a different story, I believe, not sure though. Please do it, because in your 680 review there is no OC testing :/
MrSpadge - Monday, June 25, 2012 - link
- AMD's boost assumes the stock heatsink - how is this affected by custom / 3rd-party heatsinks? Will the chip think it's melting, whereas in reality it's cruising along just fine?
- A simple fix would be to read out the actual temperature diode(s) already present within the chip. Sure, not deterministic... but AMD could let users switch to this mode for better accuracy.
- AMD could implement a calibration routine in the control panel to adjust the digital temperature estimation to the actual heatsink present -> this might avoid the problem altogether (rough sketch of the idea below).
- Overvolting just to reach 1.05 GHz? I don't think this is necessary. Actually, I think AMD has been generously overvolting most CPUs and some GPUs in recent years. Some calibration against the actual chip's capability would be nice as well - i.e. test whether MY GPU really needs more voltage to reach the boost clock.
- 4-digit product numbers, yet only 2 of them fully used, plus the 3rd to a limited extent (only 2 states to distinguish - 5 and 7). This is ridiculous! The numbers are there to indicate performance!!!
- Bring out cheaper 1.5 GB versions for us number crunchers.
- Bring out an HD7960 with approx. the same number of shaders as the HD7950, but ~1 GHz clock speeds. Most chips should easily do this... and AMD could sell the same chip for more, since it would be faster.
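A rough sketch of that calibration idea, assuming a trivial linear thermal model (estimated temperature = ambient + power x thermal resistance). The model, the constants, and the update rule are all invented for illustration and are not AMD's actual PowerTune estimation:

def estimate_temp(power_w, ambient_c, theta_ca):
    # Digital estimate: ambient plus power times a thermal resistance
    # (deg C per watt) modelling the assumed heatsink.
    return ambient_c + power_w * theta_ca

def calibrate(theta_ca, power_w, ambient_c, diode_c, gain=0.1):
    # Nudge the modelled thermal resistance toward what the on-die diode
    # implies, so a better 3rd-party cooler stops being treated as stock.
    implied = (diode_c - ambient_c) / power_w
    return theta_ca + gain * (implied - theta_ca)

theta = 0.25  # stock-cooler assumption, deg C per watt (hypothetical)
for diode_reading in (65.0, 64.0, 63.5):  # hypothetical diode samples
    theta = calibrate(theta, power_w=200.0, ambient_c=30.0, diode_c=diode_reading)
print(f"estimated temp at 200 W: {estimate_temp(200.0, 30.0, theta):.1f} C")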
Hrel - Monday, June 25, 2012 - link
How can you write a review like this, specifically to test one card against another, then only overclock one of them in the "OC gaming performance" section? Push the GTX 680 as far as you can too; otherwise those results are completely meaningless for comparison.
silverblue - Tuesday, June 26, 2012 - link
I think that's the way people do every review. However, ordinarily I'd recommend looking back at the 680 review, but as we've seen with the new Catalyst drivers, performance can vary over a relatively short period of time. So, a future article such as "AMD's Radeon 7970 and NVIDIA's GTX 680: How Much Difference Can A Few Months Make?" might be very nice *hint hint*. ;)
Temelj - Thursday, July 12, 2012 - link
For simplicity, the OC data should be put up on this graph for reference purposes and ease of use. Who on earth wants to trawl through a few reviews and collect this data manually? At the very least, include a reference link to the previous article that compares the NVidia 680 and provides the OC scores.
Also, instead of a conclusion write-up, why not have a results summary showing all performed tests and the cards that were used as reference, and provide a tabular view clearly showing the top runner of each test (or top 3)?
b3nzint - Wednesday, June 27, 2012 - link
So what about the DX11 DirectCompute, SmallLuxGPU, fluid simulation, and WinZip 16.5 tests? amd is on a winning streak. Don't buy nvidia, it's an empty thing!
CeriseCogburn - Saturday, June 30, 2012 - link
If you're going to use WinZip to game, and support evil proprietary corruption in software by amd while using open source, great - hypocrisy and lying to stone cold stupid amd fans for years works well!
Fluid sim - not a game
DX11 DC - not a game
SmallLux - not a game
Oops ! "Empty" suddenly applies to amd when it wins any "benchmarks that are not real world for end users, ever."
I guess empty crap no one uses, declared fraudulently, as a "win", sways the dark hollow spaces in the hearts and minds of the little amd fans. It's sad.
yay123 - Saturday, June 30, 2012 - link
Hi there, I'm buying this card but my PSU is a CM GX 550W. Will it hold up if I OC the card?
Temelj - Thursday, July 12, 2012 - link
If you can afford a card like this, why not just upgrade your power supply?
Review the system requirements here: http://www.amd.com/us/products/desktop/graphics/70...
Jamahl - Thursday, July 5, 2012 - link
Comments totally ruined by CeriseCogburn's bullshit on every page.
Is this maddoctor in disguise, or one of the other Nvidia zealots? Whatever, just IP ban this weirdo and be done with it.
Mauhi123 - Monday, October 15, 2012 - link
Dear all, hello.
I have a 3960X, a DX79SI, and an ASUS HD7970-DC2T-3GD5 graphics card.
I am not able to boot the computer. When I boot, the two-digit LED on the motherboard shows "00" (double zero), the screen shows "0_" and it stops, but I can reboot the computer using Ctrl+Alt+Del and I am able to operate the BIOS, so the computer is not hanging.
Please Help me ASAP......
seansplayin - Tuesday, November 20, 2012 - link
I have the XFX 7970 GHz Edition and I really am not sure what the big deal is with the noise. My card is not that loud. Honestly: power control settings at +20%, GPU core at 1175 and memory at 1600, completely stable. The games I play are at 1080p with everything maxed, and my GPU rarely gets above 70C, which is only around 40% fan speed. At 40% fan speed I literally cannot hear the GPU fan unless I have the speakers completely turned off, and even then I have to listen carefully to discern that the noise I hear is coming from the video card. In my gaming experience, the GPU fan noise is absolutely NOT an issue. When I'm running synthetic GPU benchmarking apps like FurMark, the card will ramp up to around 70% fan speed and you can hear it, but even then it is really not an issue. I am using the latest Catalyst beta driver, 12.11, which has added a 15% increase in BF3 FPS and a 10% increase in Dirt 3, basically taking Nvidia's crown in virtually every game.
I do lots of video transcoding, and the OpenCL domination this card produces is amazing. Yesterday I transcoded a 1080p 5.3GB .mkv file to .mp4 with Nero 11; when using AMD's APP acceleration codec the transcode took 20 minutes, as compared to 60 minutes when I used Nero's .mp4 codec at the same output settings (quick math on the speedup below). During the transcoding the GPU stays at, I believe, 300 MHz with the GPU at 20% load average, and it hovers around 111F with the fan at like 5%.
I love this card.
My computer has three states: idle 60% of the time, gaming and transcoding 40% of the time. At idle, with AMD's ZeroCore, this video card is using 10 watts less than Nvidia's 680. In gaming it's beating the 680 in almost every game now, and when it comes to encoding with OpenCL and OpenGL it's basically a blowout, averaging 75% more than the 680. If you're an Nvidia fan (I formerly was) and OpenCL is important to you, go with the Fermi cards, because on most GPGPU processing they outperform the Kepler cards.
If you question anything I've said, do some Google homework. Catalyst 12.11 actually does what they say; I can attest to it at least when it comes to encoding and playing BF3 and Dirt 3.
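Putting quick numbers on that transcode comparison (figures taken from the run described above):

size_mb = 5.3 * 1024           # 5.3 GB source .mkv
gpu_min, cpu_min = 20.0, 60.0  # APP-accelerated vs. CPU-only codec

speedup = cpu_min / gpu_min              # 3.0x faster with APP acceleration
gpu_rate = size_mb / (gpu_min * 60.0)    # ~4.5 MB/s of source processed
cpu_rate = size_mb / (cpu_min * 60.0)    # ~1.5 MB/s
print(f"{speedup:.1f}x speedup: {gpu_rate:.1f} vs {cpu_rate:.1f} MB/s")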