AMD held a press briefing today on their upcoming 8000M graphics chips, which they are calling the "second generation GCN architecture" parts. We’ll have more on that in a moment, but while we were expecting (dreading) a rebranding prior to the call, it appears we were at least partially mistaken: there will be at least one completely new GPU in the 8000M lineup. (If you want additional background material, you can see the previous generation high-end 7000M announcement from April 2012 for reference.)

I’m not going to get too far into the marketing aspects, as we’ve heard all of this information before: AMD has improved Enduro Technology, they’re continuing to improve their drivers, and APP Acceleration has a few more applications. There have been a few major titles released in the past couple of months with AMD Gaming Evolved branding (Far Cry 3 is arguably the most notable of the offerings, with Hitman: Absolution and Sleeping Dogs also scoring well amongst critics and users), and Bioshock Infinite is at least one future release that I'm looking forward to playing.

Cutting straight to the chase, at this point AMD has released limited information on the core specifications for some of their 8000M GPUs, but they coyly note that at least one more GPU announcement will be forthcoming in Q2 2013 (8900M by all appearances). Today is a soft launch of high level details, with more architectural information and product details scheduled for January 7, 2013 at CES. AMD did not share any codenames for the newly announced mobile GPUs, if you’re wondering, other than the overall family name of “Solar” for the mobile chips (replacing the outgoing “London” series), but we do know from other sources that the 384 core part is codenamed "Mars" while the larger 640 core part is codenamed "Neptune". Here are the details we have right now:

AMD Radeon HD 8500M, 8600M, 8700M, and 8800M

                    Radeon HD 8500M   Radeon HD 8600M   Radeon HD 8700M   Radeon HD 8800M
Stream Processors   384               384               384               640
Engine Clock        650MHz            775MHz            650-850MHz        650-700MHz
Memory Clock        2.0GHz/4.5GHz     2.0GHz/4.5GHz     2.0GHz/4.5GHz     4.5GHz
Memory Type         DDR3/GDDR5        DDR3/GDDR5        DDR3/GDDR5        GDDR5
FP32 GFLOPS         537               633               537-691           992
FP64 GFLOPS         33                39                33-42             62

Obviously there are a lot of missing pieces right now, but what we immediately notice is that the core count on the 8500M/8600M/8700M means that we’re definitely looking at a new GPU. The only other time we’ve seen AMD do 384 cores is with Trinity, but that’s a VLIW4 architecture so we’re not seeing that again. Given the currently shipping Southern Islands chips (“London” on the mobile side) have 640 cores max for Cape Verde, 1280 max for Pitcairn, and up to 2048 for Tahiti, AMD has likely created a fourth SI derivative that drops down to two CU arrays, each with three CUs. (You can read more about the GCN/SI architecture in our earlier GPU coverage.) Performance is something of a wildcard with the new 384 core parts, and the choice of DDR3/GDDR5 memory will also influence the final result. We'll find out in the coming months how the 8500/8600/8700M stack up to NVIDIA's midrange "GT" offerings, which interestingly are also using 384 cores.
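
For a quick sanity check on that reasoning, GCN groups 64 stream processors into each Compute Unit, so the announced SP counts map directly onto CU counts. Here's a minimal sketch of that arithmetic (the CU counts for the existing Southern Islands chips are known; the 6 CU figure for the new part is simply what 384 SPs implies):

```python
# GCN packs 64 stream processors (SPs) into each Compute Unit (CU),
# so dividing the SP count by 64 gives the CU count for each die.
SPS_PER_CU = 64

chips = {
    "Tahiti": 2048,               # up to 32 CUs
    "Pitcairn": 1280,             # up to 20 CUs
    "Cape Verde": 640,            # up to 10 CUs
    "Mars (8500M-8700M)": 384,    # implies 6 CUs, e.g. two arrays of three
}

for name, sps in chips.items():
    print(f"{name}: {sps} SPs = {sps // SPS_PER_CU} CUs")
```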

Also worth a quick note: AMD is not discussing TDPs at this point, which is common practice for both AMD and NVIDIA. We expect the new "Mars" parts to be more power efficient than the outgoing Thames/Turks cores, thanks to the shrink from 40nm to a 28nm process. However, AMD and NVIDIA typically stick to common power targets for laptops dictated by their OEM partners, which often means they'll play with clock speeds in order to hit a specific TDP. That's why all of the clock speeds in the above table technically carry an "up to" qualifier, which I've omitted for brevity.
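
Those "up to" clocks also explain the throughput figures in the table: theoretical FP32 GFLOPS for GCN works out to two operations per SP per clock, with FP64 at roughly 1/16 of that rate on these parts. Here's a minimal sketch of the arithmetic; note that the implied clocks are my back-calculation from AMD's quoted GFLOPS, not official boost specifications:

```python
def gcn_gflops(stream_processors, clock_ghz, fp64_ratio=1 / 16):
    """Theoretical GCN throughput: 2 FP32 ops (one FMA) per SP per clock;
    FP64 runs at roughly 1/16 the FP32 rate on these parts."""
    fp32 = 2 * stream_processors * clock_ghz
    return fp32, fp32 * fp64_ratio

# 8800M's quoted 992 GFLOPS implies a clock of roughly 775MHz:
print(gcn_gflops(640, 0.775))   # ~(992.0, 62.0)
# 8500M's quoted 537 GFLOPS implies roughly 700MHz rather than the listed 650MHz:
print(gcn_gflops(384, 0.700))   # ~(537.6, 33.6)
```

For what it's worth, the 7700M/7800M numbers in the table further down match this formula exactly at their listed engine clocks.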

The final announced card is the one where we appear to have more of a rebrand/optimization of a previous generation chip. The 8800M has the same 640 core count as Cape Verde/7800M, only with modified clocks this time. The earlier 7800M chips could clock as high as 800MHz, so the maximum core clock is actually down a bit, but they only ran their memory at up to 1GHz (4GHz effective) GDDR5, compared to 1.125GHz (4.5GHz effective) here. If AMD determined memory bandwidth was more important for that particular GPU than shader performance, the new 8800M would make sense. Also note that AMD isn’t including boost clock speeds in the above chart; under the right circumstances, all of the new chips can run at higher clocks than the reference clock.
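
To put that memory bump in perspective, peak bandwidth scales linearly with the effective data rate. Assuming the 8800M keeps the same 128-bit memory interface as the Cape Verde based 7800M parts (my assumption; AMD hasn't published bus widths for these SKUs), the move from 4.0GHz to 4.5GHz effective GDDR5 works out to about 12.5% more bandwidth:

```python
def peak_bandwidth_gbps(effective_clock_ghz, bus_width_bits=128):
    """Peak memory bandwidth in GB/s: effective data rate (GT/s) times bus width in bytes.
    The 128-bit default is an assumption based on the 7800M series, not a confirmed 8800M spec."""
    return effective_clock_ghz * bus_width_bits / 8

print(peak_bandwidth_gbps(4.0))  # 7800M-class GDDR5: 64.0 GB/s
print(peak_bandwidth_gbps(4.5))  # 8800M GDDR5:       72.0 GB/s
```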


Radeon 7800M Left, Radeon 8800M Right

AMD isn’t calling the 8800M a rebrand, but we’re looking at the same core counts as Cape Verde and the same 28nm process technology, so we wouldn’t expect a substantial change in performance. There’s also the above chip shot as a point of reference. If the 8800M is substantially different from Cape Verde then the above images provided in AMD’s slides must be incorrect, as the new and old chips look the same. Minor tweaks to power use, caching, or other elements notwithstanding, we’re probably dealing with a die respin at most. But, there’s nothing inherently wrong with rebranding—AMD and NVIDIA have both been doing it for some time now. Don’t expect every “upgraded” GPU to be better; a 7400M isn’t faster than a 6700M, and likewise we expect 7700M and 7800M to be faster options than the 384 core 8500M/8600M/8700M and competitive with 8800M. Here’s a quick recap of the same core specs as above for the current 7700M/7800M parts:

AMD Radeon HD 7700M/7800M Specifications

                    Radeon HD 7730M   Radeon HD 7750M   Radeon HD 7770M   Radeon HD 7850M   Radeon HD 7870M
Stream Processors   512               512               512               640               640
Engine Clock        575-675MHz        575MHz            675MHz            675MHz            800MHz
Memory Clock        1.8GHz            4.0GHz            4.0GHz            4.0GHz            4.0GHz
Memory Type         DDR3              GDDR5             GDDR5             GDDR5             GDDR5
FP32 GFLOPS         589-691           589               691               864               1024
FP64 GFLOPS         36.8-43.2         36.8              43.2              54                64

I’ll refrain from commenting too much more about performance of an unreleased part, but AMD indicated their 8870M should be substantially faster than NVIDIA’s current GT 650M GDDR5 (which isn’t too surprising considering clocks and core counts), and the 8770M should likewise be a healthy 20%+ bump in performance relative to the 7670M. I’d rather see comparisons with GTX 670MX and HD 7770M, respectively, but I suspect those wouldn’t be quite as impressive. Anyway, you can see AMD’s comparison charts in the complete slide deck gallery below. Availability of the new GPUs is slated for Q1 2013.

Comments

  • CeriseCogburn - Tuesday, December 25, 2012 - link

    Hey, since you're such a raged out amd fanboy, why didn't you provide us all with the list for amd ?
    LOL
    Like I said, it's the amd stupid paper launch, and you rager amd people cannot stop pointing the finger away - you act like political parties.
    I'm not surprised you wasted your life collecting your nVidia hate post info there.
    ROFL - fanboy exposure extraordinaire.

    Why do you people do it ?

  • silverblue - Thursday, December 27, 2012 - link

    "Hey, since you're such a raged out amd fanboy, why didn't you provide us all with the list for amd ?"

    Somewhere in the first pages of these comments, I believe I did actually say that the 7700 to 7900 desktop series were GCN with the other parts carried over from the 6000 series (albeit not the 6900 series - VLIW4 only reappeared in Trinity; I don't class APUs as belonging to either desktop or mobile ranges as they seem to transcend both, and we won't see it again with any luck). A little later, I mentioned that the 4000 and 5000 series were self-contained, BUT I also said that the 6770 was a 5770 with an updated UVD block. If it serves to help you, the 6350 was the 5450, albeit with numbering that actually makes sense, unlike what Jarred is correctly pointing out here. It is wrong for AMD to compare a lower model number to a higher one and highlight how much faster the latter is, so everybody has to wait to see what the real deal is.

    I'll repeat myself for clarity as regards architecture carry overs - AMD's x600 and below lines on the desktop are generally the previous generation, and as for mobile, with the 7xxx series it's the same deal. With the 6xxx mobile series, it goes as follows:

    63xxM - Evergreen
    64xxM - NI
    65xxM - Evergreen
    66xxM - NI
    67xxM - NI
    68xxM - Evergreen
    69xxM - NI

    Now THAT particular series is messy and thankfully not likely to be repeated, however there's a very important point to make again - there's no instance of two separate architectures being employed for the same model number. The 7xxxM series is far cleaner, and happily the 8xxxM series doesn't back away from this as far as we can see. I'd be far happier with a fully new series but as we've seen from both companies, it just doesn't seem to happen. If yields have been as bad as we've been led to believe, I can't blame either firm for wanting to reuse silicon or chop off non-working parts. They both do it, and it's pointless arguing otherwise. The 7xx mobile series will most likely feature Kepler parts - I'm not going to rage about it because it makes sense, but what I WILL latch onto is if we end up with the GT 640 all over again. If AMD do it, you can bet your bottom dollar that I'll be all over them like a rash.

    The G80 was a great card, you're spot on, but so was the RV770. It was the first architecture that ATi/AMD had done right in a good few years. Equally, Fermi mark 1 was unrefined and greedy, and the HD 2xxx series was a lame duck. There's an essence of balance here that I'm really hoping you latch onto, because you haven't done so in the past.

    There's nothing "rager" about any of my posts. Why rage about something I cannot hope to change? If I'm going to make a purchase, I'll research it first so I don't get stung - how can this possibly be an issue? If I get something bad, that's my fault. If AMD brought out something really good, who here has the right to tell me I made a mistake? If I opt for a Kepler-based setup, do you really think I'm going to want to listen to "serves you right for buying rubbish in the past"?

    Are AMD this big, evil corporation that you want everybody to believe? Yes, sometimes their product lines make no sense, and yes, sometimes their drivers are all over the place, but have you noticed that people don't usually come out and say "these drivers are top-notch" UNLESS it's in defence to comments such as "xxxx's drivers are buggy and slow"? It's a thankless job, I'll wager, and you have to deal with the complaints on a daily basis as opposed to basking in praise. A couple of years back, I needed a spare card in a hurry after my ATi 9800 Pro died (it'd been overclocked for 4 years with an Arctic Cooling solution), and I unpacked my old Ti4200-8x, only to find that its shader quality was through the floor with the latest drivers. I didn't rage about it because it was a very old card and I'm surprised that people make drivers that work for cards released more than 5 years ago (big kudos to NVIDIA for their LTS). I looked for a replacement and found a 1650 Pro which was a dog but more or less matched the 9800 Pro. Was it a good card? No way - 128-bit DDR2 memory, low pipe counts - but it did the job for 6 months. When I went to build my next PC, I looked at an Intel-NV combo, but neither had anything within my price range, so I ended up with an AMD setup. That XFX 4830 of mine has done remarkably well even though I couldn't overclock it despite its huge cooler - bad batch I guess. It was cheaper (over here in the UK) and faster than the competing 9800 GT in most things, and I simply wasn't interested in CUDA nor PhysX. I will upgrade to a new build eventually and as things stand, there'll be an Intel CPU in there, but I want to see how the 8xxx vs 7xx battle pans out before I make a purchase. Either company could pull out a real cracker.

    What do you think would happen to the GPU market should AMD disappear? The enthusiast and higher end NV models would be a fair deal more expensive for a start. There might not even be a need for a yearly product cadence. Believe it or not, consumer choice is a very good thing, and I hope it's here to stay.

    Now, there's something I'd like to ask you. Why did you spend your Christmas Day throwing out wild and untrue accusations?
  • CeriseCogburn - Thursday, January 3, 2013 - link

    You spent your life doing nothing but obsessing, and lying about amd.
    Now they're almost dead, and you still whine.
    You know why I'll be deliriously happy when amd is dead and gone ?
    You should - you CRYBABY amd losers.
    You cannot even afford your crybaby cheap as dirt amd cards, and you're such cheap losers amd can't pay for itself, hasn't been able to for years.
    So, let's say your insane whine comes true at one tenth your claimed disaster- GUESS WHAT ?
    YOU STILL WON'T BE ABLE TO AFFORD THE NVIDIA CARD !
    LOL
    Plus, you're such a sour pussed hate filled amd fan, you won't even want one.
    ROFL
    Man I can hardly wait till they QUADRUPLE in price.
  • CeriseCogburn - Wednesday, December 19, 2012 - link

    See above for the actual rule applied.
    This is an amd piece, so it must be stated that it is not inherently wrong, and that of course nVidia does it, has been doing, does it, did it, and so forth.

    If it was an nVidia piece as in the years past, the raging complaint would be over the top, and no mention of amd doing it would be in the article.

    Usually "the infamous G80" would be cited several times and the endless spewing complaint would get a whole couple of paragraphs or a page and several rementions throughout.

    But this is amd the dirty evil vile company now, so it's not inherently wrong.
    Now you know the rules.
  • JarredWalton - Thursday, December 20, 2012 - link

    For all your vitriol against the AMD fans, you're even worse on the NVIDIA side. We've called out both companies for rebranding for years, and we'll continue to do so. I guess I can feel I did my job well when you're here implying I somehow sugar-coated this story to make AMD look good while MrDude is complaining that I missed an amazing announcement.
  • CeriseCogburn - Tuesday, December 25, 2012 - link

    You do a fine job, I have zero complaints with you. You work under certain pressures and realities. You have lots of work in a hectic schedule and it's not easy doing loads of benchmarks or coming up with something that has to keep certain powerful parties above in decent shape.
    No complaints with you, you're doing your job.

    I'm just doing my job helping the little posting amd fanboys face reality, something they don't want to ever do, and looks like soon won't ever have to if amd falls over and croaks.
  • just4U - Friday, December 28, 2012 - link

    What makes you think they're amd fanboys? Most of us gave up complaining about rebranding after the venerable 8800 went through so many name changes. The thing for reviewers to do is make note of it in their reviews, and right now... this isn't a review and little is known, just a lot of guesswork.
  • CeriseCogburn - Thursday, January 3, 2013 - link

    What proof do you have you think at all ?

    The G80 and its derivatives were the greatest set of gaming cards for the entire world community of ALL TIME.
    They brought cards in at every price point and a hundred different combinations.
    They breathed LIFE into computer gaming.
    The amd fanboys, the sour little turds that they are, did nothing but complain.
    They deserve every retort and rebuttal they get, and the YEARS of archives here prove EXACTLY whom they are.
    Thanks for not being sentient.
  • Gigaplex - Saturday, December 29, 2012 - link

    You need to step it down a notch. Not everyone claiming NVIDIA abuses the rebranding worse than AMD is an AMD fanboy. I wouldn't be caught dead with a current generation discrete AMD mobile graphics chip as NVIDIA currently provides a better user experience if you need discrete graphics. However, I stand by the claims of others that NVIDIA has an especially bad rebrand track record. What chip is the desktop 640 based on? There's multiple versions that come in both Fermi and Kepler.

    Personally I'd just stick with Intel graphics or AMD Fusion as they're just about as fast if not faster than a fair amount of the discrete chips the OEMs throw in there as a marketing tick box even though there's no benefit to them.
  • CeriseCogburn - Thursday, January 3, 2013 - link

    I already know all of them.
    There are years of archives here bucko.
