AMD Teases Upcoming Video Card

by Ryan Smith on 5/22/2015 5:15 PM EST

  • Jtaylor1986 - Friday, May 22, 2015 - link

    So I guess there will be a watercooled model
  • meacupla - Friday, May 22, 2015 - link

    I see no fan, but I also don't see any ports for connecting tubes to, and there are vents on the back of the card, so I'm guessing the fan is just not pictured.
  • jimmy$mitty - Friday, May 22, 2015 - link

    Actually the tubes are at the end of the GPU. You can see the end corner of the GPU, it is just blurry.

    So the rumored picture was legit. I do however hope the rumored price is not. $850 feels like too much for me. If it is for the WCE then fine, I will take a fan cooled version for $699. But $850 just hits the wallet too hard.
  • meacupla - Friday, May 22, 2015 - link

    ah, okay. I guess it's better than their usual reference hair dryer.
  • Oxford Guy - Friday, May 22, 2015 - link

    a series of tubes, eh?
  • slickr - Saturday, May 23, 2015 - link

    Why? The Nvidia Titan X is like $1050, and all over Europe it's going for around 1100 euros. So they are releasing an equally fast card for $200 less (or 250 euros less) that is also shorter, consumes less power, and runs cooler.

    I bet you if Nvidia's Titan X turd cost $850, they would have released this at $750.
  • MrSpadge - Saturday, May 23, 2015 - link

    Titan X is not a realistic option for most buyers either. And don't be so sure about the new AMD card consuming less power than a Titan X. Hawaii consumes about the same as Titan X, and the new card will have to feed 40% more shaders. HBM will save some power, but there's no way it can make up for 40% more hardware. And they're putting the water cooler on it for a good reason. I'd expect between 250 and 300 W, depending on settings, model etc.
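MrSpadge's power estimate can be sanity-checked with some back-of-envelope arithmetic. A rough sketch; all inputs here are the thread's own assumptions (Hawaii at roughly 250 W, 40% more shaders, ~15 W saved by HBM), not official specs:

```python
# Back-of-envelope TDP estimate for the rumored Fiji card.
# All constants are assumptions from this comment thread, not official figures.
HAWAII_TDP_W = 250          # approximate R9 290X board power
SHADER_SCALE = 1.40         # "40% more shaders" than Hawaii
HBM_SAVINGS_W = 15          # rough GDDR5 -> HBM memory power saving

def naive_fiji_tdp(efficiency_gain=0.0):
    """Scale Hawaii's power by the shader increase, subtract the HBM
    saving, and apply an optional architectural efficiency factor (0.0-1.0)."""
    scaled = HAWAII_TDP_W * SHADER_SCALE * (1.0 - efficiency_gain)
    return scaled - HBM_SAVINGS_W

# With no efficiency gains the naive estimate lands well above 300 W,
# which is why a water cooler plus some perf/W improvement is plausible.
print(naive_fiji_tdp())        # 335.0
print(naive_fiji_tdp(0.15))    # ~282.5
```

With no architectural improvement the naive scaling overshoots 300 W; a ~15% efficiency gain brings it back into the 250-300 W window MrSpadge suggests.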
  • siberus - Saturday, May 23, 2015 - link

    Water cooling is most likely just the logical step for high-end HBM products. GPU PCB surface area seems to be shrinking substantially this generation, leaving less room for the big open-air coolers prevalent on current graphics cards. Also, I wouldn't be too surprised if the 390X stayed at or lowered power consumption from current 290X levels. Just the addition of water cooling should help lower consumption a bit, plus HBM benefits and any secret sauce added from being a newer, refined architecture. But like all product releases, we'll just have to wait for its release to see how it stacks up.
  • Refuge - Monday, May 25, 2015 - link

    Agreed. I've learned not to get excited until I see it in action.
  • TheJian - Sunday, May 24, 2015 - link

    HBM1 might shave ~15 watts. Where do you get the idea that it will consume less power than the Titan X, or the 980 Ti coming shortly?

    Nvidia sure has a fast, low-power "turd"...LOL. If AMD doesn't price the card high, they'll end up losing money for yet ANOTHER quarter. Someone needs to start telling people they're living in fantasy land; AMD's quarterly reports can't handle these stupidly low prices any longer. For the sake of the company (which has lost $6B+ over the last 12 years), you ALL should be hoping they price the card to MAKE MONEY. Get a better job if you can't afford the toys you want.
  • Xenonite - Sunday, May 24, 2015 - link

    With everyone seemingly wanting less and less powerful hardware (welcome to "good enough" computing), I highly doubt that AMD would spend the required amount on R&D and manufacturing for a truly high-performance 300W GPU.
    No, I believe the increased price and possible water cooling are because of the smaller/thinner/weaker form factor of the graphics card.

    With the new leadership, I think AMD has realised the only way to increase sales is for their highest-performance GPU to be both smaller and thinner and to consume less power than NVIDIA's GPUs. Today's consumers will flock to such a design even if it has a significantly worse price/performance ratio than the competition; hence, that is where the money is.
  • Refuge - Monday, May 25, 2015 - link

    The water cooler (if this is one; I don't see it) is because of the HBM, I'm sure. You've gotta make a heatsink that can fit perfectly on 5 surfaces, 4 of which are higher than 1.

    I wanna see one torn down; it would be interesting to see. :)
  • creed3020 - Monday, May 25, 2015 - link

    Which is why AMD has added an IHS to the GPU chip. In the past the GPU was naked but this will no longer be the case to provide that uniform surface for a heatsink to hit all the chips on the interposer.
  • Friendly0Fire - Saturday, May 23, 2015 - link

    AMD needs at least one relatively affordable card that's competitive and not merely a rebrand.

    A 390X at $850 would either stick out if the 390 is much cheaper, or it'll indicate an increase in price for all of their GPUs using the new architecture, which wouldn't be good for them. I heard rumors that their top card might actually be branded as something else to replicate the Titan thing, which would be I think the most logical option. The 390X/390/380X/380 would be cut-down versions of it, priced accordingly.
  • stefstef - Sunday, May 24, 2015 - link

    You said it. They should have their new technology available at every price point after the go-to-market, but AMD isn't that big a company, and I doubt they can do that. Although I am a Linux user and will only profit from this newer architecture in an estimated five to six years, I am looking forward to this new GPU. Hopefully in half a year they'll have a decent HD solution out with the new capabilities at around 85 bucks.
  • Nilth - Sunday, May 31, 2015 - link

    This. I need a relatively affordable (<$400) card with 4GB of the new VRAM, or I will keep not spending a single dime on a new PC. I didn't buy a GTX 970 because of the 3.5GB issue (together with the other false information), and I would be glad not to buy another of their overpriced cards.
  • testbug00 - Saturday, May 23, 2015 - link

    Nothing points to Fiji using less power than Titan X...
  • TheJian - Sunday, May 24, 2015 - link

    Great, probably a triple post... Why the first one didn't show up is beyond me; it just kept showing the post as stuck. I really wish you could delete a post on here. Refreshing the page just reposts it while still showing it as stuck?
  • eanazag - Friday, May 22, 2015 - link

    I'm confident AMD will come out with their GPU on time (relatively). It is the same manufacturing node, so there is no concern there.

    It is the CPU we're all doubting being anywhere near to on time or near set expectations. Show me stuff on the CPU and I would be impressed and excited. I'm not talking about Carrizo either. I would love to buy an AMD CPU to build around; please give me a reason. I hate to feel like I have to say this also: update your chipset. AMD has even less excuse to be so far behind in chipset features.
  • Oxford Guy - Friday, May 22, 2015 - link

    For $59 you can get a 4 module 8 thread sort-of 8 core chip from Microcenter (counting the $40 off bundled motherboard) that can run very quietly at 4.3 GHz with a decent cooler. For certain workloads that's quite a value, even for a chip and chipset that old.
  • tabascosauz - Saturday, May 23, 2015 - link

    Time is ticking, and the value of AM3+ is falling off as fast as the GTX 970 flew off the shelves when it first came out. The three-part board design is unacceptable in 2015, the only valid reason for it being AMD's current financial situation. Between Vishera's IMC, power (and board) requirements, and the sheer age of the NB/SBs, there are a lot of sacrifices to be made for that budget value.

    The modernity of the Bolton FCH family is additional proof that even AMD stopped taking AM3+ seriously a long time ago, and is saving as much money as it can for Zen and the next generation of GCN. It's getting much harder to justify an AM3+ purchase; there is no doubt that some workloads favor Vishera's dollop of integer hardware, but that feature is about as relevant to most enthusiasts as FirePro's proprietary performance boosts are to gamers.

    Also, few people enjoy the privilege of living near a Microcenter.
  • slickr - Saturday, May 23, 2015 - link

    AMD's 390X is 20nm. Why do you think it's over 9 months later than Nvidia's 900 series?

    It is 20nm, 250W, just as fast as the Titan X, but $200 cheaper, shorter, and cooler.
  • jwcalla - Saturday, May 23, 2015 - link

    Who has a working 20nm process? GF?
  • testbug00 - Saturday, May 23, 2015 - link

    IBM, Samsung, Intel, and TSMC. Of those, only TSMC's is commercial. It also offers about zero benefit over their 28nm HPM process, especially for large GPUs.
  • testbug00 - Saturday, May 23, 2015 - link

    Nope. It's AMD's largest GPU ever designed. And, it's on 28nm.
  • medi03 - Saturday, May 23, 2015 - link

    Had there been a single AMD notebook with IPS screen, there would be a reason to buy one.
  • testbug00 - Sunday, May 24, 2015 - link

    Yes, I'm very sure that the HP business laptops offer AMD APUs and have IPS displays. They also cost WAY too much.
  • Innokentij - Friday, May 22, 2015 - link

    I hope it has DVI; my screen only likes that.
  • Hobbsmeerkat - Friday, May 22, 2015 - link

    I think I read that they are only going to have 3 DisplayPorts and a single HDMI, though you can get adapters pretty cheap.
  • SleepyFE - Friday, May 22, 2015 - link

    Yup. HDMI can send a DVI signal through, so passive adapters shouldn't cost much.
  • WorldWithoutMadness - Friday, May 22, 2015 - link

    Well, I don't really know about this, but I tried a passive HDMI-to-DVI adapter on my Gigabyte Brix and it didn't work.
    Maybe you guys can try it first, but if it doesn't work, then it's probably time for a monitor upgrade?
  • looncraz - Saturday, May 23, 2015 - link

    I've run my DVI monitors on a cheap ($7, IIRC) HDMI to DVI adapter for a LONG time.

    The HDMI video signal is effectively just the DVI signal using a different physical connector.
  • rtho782 - Saturday, May 23, 2015 - link

    Plenty of people around with 2560x1600 30" screens that are DL-DVI only.

    That requires expensive active adaptors that run to $100+ and introduce latency.
  • meacupla - Friday, May 22, 2015 - link

    I haven't found any cheap HDMI or DP to dual-link DVI adapters.
    Which is the only input on the relatively popular, cheap Korean 27" 2560x1440 90Hz+ IPS/PLS monitors.
  • SirKnobsworth - Friday, May 22, 2015 - link

    Because DL-DVI requires active signal conversion since the number of lanes is different.
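The bandwidth reason is easy to show with arithmetic: single-link DVI tops out at a 165 MHz pixel clock, and 2560x1600@60 needs well over that, so a passive (single-link) adapter can't carry it. A quick sketch; the blanking figures are rough reduced-blanking assumptions, not exact CVT timings:

```python
# Why 2560x1600 monitors need dual-link DVI: single-link TMDS is capped
# at a 165 MHz pixel clock. Blanking values below are rough
# reduced-blanking assumptions for illustration.
SINGLE_LINK_MAX_MHZ = 165

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=46):
    """Approximate pixel clock (MHz) including blanking overhead."""
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

print(round(pixel_clock_mhz(2560, 1600, 60), 1))   # ~268.6, over the cap
print(round(pixel_clock_mhz(1920, 1080, 60), 1))   # ~140.5, fits single-link
```

1080p fits comfortably under 165 MHz, which is why cheap passive HDMI-DVI adapters work there; 2560x1600 does not, so it needs both links and hence an active converter from HDMI or DP.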
  • tipoo - Friday, May 22, 2015 - link

    1GB of GDDR5 takes 672mm²
    1GB of HBM takes 35mm²

    And that's not even considering that the 35mm² of HBM is split into four vertical stacks. HBM is awesome! I love that high-end cards are going to be smaller now, and I hope it trickles down to mid-range cards really fast.
  • tipoo - Friday, May 22, 2015 - link

    *I mean, that would be to scale at 4GB, whereas the GDDR5 would have to spread out more.
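Taking the per-gigabyte figures quoted above at face value (a rough linear sketch; real HBM stacks vertically rather than spreading out, so this understates its advantage if anything), the board-area saving at 4 GB works out to:

```python
# Footprint comparison using the per-GB figures quoted in this thread
# (DRAM area only; interposer overhead ignored).
GDDR5_MM2_PER_GB = 672
HBM_MM2_PER_GB = 35

def board_area_mm2(gb, mm2_per_gb):
    """Naive linear scaling of DRAM footprint with capacity."""
    return gb * mm2_per_gb

savings = board_area_mm2(4, GDDR5_MM2_PER_GB) - board_area_mm2(4, HBM_MM2_PER_GB)
print(board_area_mm2(4, GDDR5_MM2_PER_GB))  # 2688 mm^2 of GDDR5
print(savings)                              # 2548 mm^2 freed at 4 GB
```

Roughly 25 cm² of board space freed up, which is consistent with the noticeably shorter PCB in the teaser shot.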
  • heffeque - Saturday, May 23, 2015 - link

    Cooling 35mm² with 4 times the height will be a lot more complicated than cooling 672mm².
  • MrSpadge - Saturday, May 23, 2015 - link

    Cooling memory has never been a real issue. And won't be with HBM since the actual memory chips are clocked far lower than for GDDR5.
  • testbug00 - Saturday, May 23, 2015 - link

    You still increase heat density overall, which requires more/better cooling. Or just running hotter with the same cooling.
  • meacupla - Friday, May 22, 2015 - link

    I'd love to see good gaming power from half height and/or single slot cards.

    Call it a niche, but the current top tier for this is 750Ti (half height, double slot) and R7 250 (half height, single slot).
    It would be nice to have more power than that.
  • just4U - Friday, May 22, 2015 - link

    The 960/970 have some nice minis, and supposedly the design allows for what you're asking for... Since they're out there, it's probably likely that AMD's partners will release some equivalent.
  • meacupla - Saturday, May 23, 2015 - link

    Those cards you speak of are 'short' cards; as in, they are still 'full height', 'double slot' solutions, but with shorter PCBs.

    Apparently, the current limit is the 750 Ti, for both thermals and component density.
  • Morawka - Saturday, May 23, 2015 - link

    Yeah, but HBM has to fit on the GPU die; regular GDDR5 doesn't, and has a plethora of placement options.
  • dragonsqrrl - Saturday, May 23, 2015 - link

    No it doesn't. The stacked modules share an interposer with the GPU die.
  • Pwnstar - Saturday, May 23, 2015 - link

    I would say you need to double-check your "facts"; HBM is not on the GPU die.
  • extide - Friday, May 22, 2015 - link

    Just release the thing already sheesh!!
  • chubbyfatazn - Friday, May 22, 2015 - link

    In before chizow says it'll suck
  • Ryan Smith - Friday, May 22, 2015 - link

    Hey now, let's not get personal.
  • at80eighty - Saturday, May 23, 2015 - link

    haha.
    he speaks truth all the same
  • Pwnstar - Saturday, May 23, 2015 - link

    Chizow hates AMD.
  • Mr Perfect - Saturday, May 23, 2015 - link

    Hey, maybe we should send someone to check on Chizow, he should have 10+ borderline troll posts in this topic by now, but he hasn't even shown yet. Hopefully he's just enjoying Memorial Day weekend early.
  • Michael Bay - Saturday, May 23, 2015 - link

    He doesn't need to visit if the AMD drones are already assblasted.
  • Oxford Guy - Friday, May 22, 2015 - link

    well, since it apparently won't have a blower...
  • just4U - Friday, May 22, 2015 - link

    Interesting.. It looks like they may have finally upped their game on stock reference cooling.....
  • SirPerro - Saturday, May 23, 2015 - link

    Well, it looks like 2 interposers with a total of 8GB will be on board in an X2 card.

    Liquid cooling for just a single GPU sounds rather extreme.
  • Pwnstar - Saturday, May 23, 2015 - link

    I would say it is misleading to say dual GPUs have double the RAM.
  • looncraz - Saturday, May 23, 2015 - link

    Not if you can actually use all that RAM, which is supposedly the case with DirectX 12/Windows 10, and is something that could, in theory, happen with some crafty driver modifications (with supporting hardware mods).
  • CiccioB - Sunday, May 24, 2015 - link

    And what are you connecting those separate memory pools with? PCIe 3.0 at 16GB/s?
    Good luck doing anything useful in a 3D game.
  • ES_Revenge - Tuesday, May 26, 2015 - link

    If you're using AFR (which I think is the most common way SLI and Crossfire are implemented these days) then no you don't really have double the RAM *effectively* as each GPU only has access to its own VRAM (3GB x 2 or what have you). In SFR though I think it works out that you have more VRAM because instead of each GPU rendering each entire [alternating] frame like in AFR, each GPU is only rendering half the frame. So it's only needing to use half the VRAM it would otherwise (approximately, I mean if one is rendering the sky and the other a lot of textures then it's not exactly 50/50)... So I *think* it is possible to have "double the VRAM" but since AFR is the most common method, it doesn't work like that in reality. And yeah PCIe is far too slow to effectively share RAM over, in a pool.
  • Alexvrb - Tuesday, May 26, 2015 - link

    I really do not care for AFR. I always wanted to see widespread use of a tiled multi-GPU rendering implementation - it would divide work up more evenly. Oh well... at least with the new lower-level APIs there might be some creative use of multiple GPUs that doesn't rely on AFR.
  • CiccioB - Sunday, May 24, 2015 - link

    AMD is already very late with this GPU; I expect the X2 will come too close to the 14/16nm GPU releases to have any appeal.
  • RussianSensation - Thursday, May 28, 2015 - link

    They are not really that late in the context of the competition. The Titan X launched for $1K in mid-March 2015, and the 980 Ti is only being launched June 2nd, 2015. Fiji XT is launching this June by all accounts, which means it's barely behind the 980 Ti. Where AMD is late is with GTX 970/980 competitors, but one could argue that the R9 290/290X were good enough performance-wise, and they certainly offered superior price/performance. They just got a tarnished brand image, since nearly every professional site ignored the existence of cool and quiet after-market R9 290 series cards. There are rumours that an enhanced Hawaii will just be a slightly higher-clocked version with 8GB of GDDR5 to better compete against the 970/980, possibly with some improvements in perf/watt. Other than perf/watt, the 970/980 barely moved the needle on high-end performance over the 290X, which is now approaching 1.5 years old. Where AMD really needs better cards is in the mobile dGPU space and at the >$400 level (Fiji Pro and XT).
  • CPUGPUGURU - Saturday, May 23, 2015 - link

    AMD's hot, watt-wasting, rebranded Tonga, inept at 4K gaming because it's limited to only 4GB of HBM1, is going to be beaten like a lame Llano-filled pinata by the highly overclocked performance-per-watt champion Maxwell GTX 980, Titan X, and upcoming GTX 980 Ti.

    Read, weep, and cry me an Amazon river:

    The water cooled nature of this card has other tertiary benefits as well. While EVGA doesn’t unlock anything spectacular for overclockers (+87mV and +25% for the voltage and Power Limit respectively), our sample hit some impressive levels, running at 1620MHz for hours on end without the smallest hiccup. As you can imagine, the card is also very, very quiet due to the 120mm fan running alongside the low-RPM blower.

    There are so many positive points about the EVGA GTX 980 Hybrid that its potential downfalls may be overlooked. From a compatibility standpoint, there shouldn’t be any problem getting this card to fit into nearly any case on the market. However, any system that’s already been equipped with an All in One cooler may find itself with some limitations when installing the Hybrid since there are only so many accessible 120mm ports on some cases.

    With the GTX 980 Hybrid, EVGA has created an awesome graphics card that performs at an extremely high level, runs cool, grants acoustic-minded individuals a quiet environment, and provides an impressive amount of overclocking headroom. It may not be able to perform up to the level of a Titan X, but EVGA's iteration costs significantly less while still delivering some of the highest framerates around. What more is there to ask for?
  • formulav8 - Saturday, May 23, 2015 - link

    Are you posting on the correct article? Your post looks to be a bunch of gibberish.
  • CPUGPUGURU - Saturday, May 23, 2015 - link

    AMD's hot, watt-wasting, rebranded Tonga, inept at 4K gaming because it's limited to only 4GB of HBM1 = the AMD HBM1 390. I think that's what's pictured, so it's the correct article. But I do wish for an edit option, as my point is that the 390 is crippled for 4K gaming: a rebranded Tonga GPU that the highly overclocked Maxwell 980, 980 Ti, and Titan X will be beating like a pinata.

    The AMD 390 is too late, too lame (hot, watt-wasting Tonga), with too little memory (only 4GB) to game at 4K.

    We will see if I'm right. AMD has already lost massive market share to the performance-per-watt champion Maxwell, and since all of AMD's 3xx series GPUs are rebrands, with one glued to HBM1, the market share beat will go on and on.

    Have a great Memorial Day weekend
  • testbug00 - Sunday, May 24, 2015 - link

    From an engineering standpoint, you can have over 1GB of HBM1 per 1024-bit bus. It's just very expensive, and an engineering challenge, but AMD has plenty of engineers who can do the job.

    The extra projected revenue from it is likely lower than the cost of doing it. And memory usage depends on the game, settings, and resolution, not the resolution alone. TWIII, for example...

    Not to mention that, due to moving things in and out of memory, you can have a 4GB and a 12GB card at the same settings showing different RAM utilization. It becomes a question of how much RAM the game actually needs if it only keeps what it needs in RAM. That amount will vary drastically from game to game, in all likelihood.
  • Hicks12 - Sunday, May 24, 2015 - link

    People keep banging on about 4GB not being enough, but AMD has thrown several developers at this issue; compression will most likely be implemented and ensure less is more :P.

    Wait and see, should be interesting to see how it pans out !
  • ES_Revenge - Saturday, May 30, 2015 - link

    LOL, you posted the same first paragraph twice, despite someone else telling you it was gibberish. "Hot watt wasting" was lulz too. Yes, it's true AMD's GPUs are nowhere near as power-efficient as Nvidia's current offerings, but don't pretend the situation hasn't been the other way around before.

    You're way out in left field for most of your gibberish.

    1. As for the 3xx series, we already know they are essentially all rebrands and it's just an OEM line like HD 8xxx were.

    2. Tonga will never bear the x90 moniker either; it's the 380 and may not even be in the 4xx series unless Trinidad is just a rebranded Tonga (like Curacao/Pitcairn), but who knows.

    3. Tonga will never be sold under the pretence that it will be capable of 4K gaming. AMD knows it can't and won't advertise it as such either.

    When Fiji drops it's more than likely going to be 4xx series. Sure most of the 4xx will probably be rebrands as well but *other than* power efficiency, there's really nothing wrong with Bonaire, Tonga, and Hawaii. In fact there's nothing wrong with Tahiti either though it is missing features like TrueAudio and 4K video decoding and it's "old" so it's almost certain AMD will not reuse it again.

    Most people buying desktop cards aren't *that* concerned with power efficiency BTW. It's great that 980, 980 Ti and Titan X (Titan X really? It's over $1k) are and always will be faster than Tonga. But they're not even in the same realm of pricing. You'd *expect* that GPUs costing 2-4 TIMES as much would be faster, no? Honestly comparing Tonga to a Titan X is so much nonsense I'm not even going to bother saying anything else.
  • silverblue - Monday, May 25, 2015 - link

    It's not quite drashek though.
  • D. Lister - Saturday, May 23, 2015 - link

    Teasing? They must realize they are already much too late to the party. Enough with the bloody teasing, show us some review samples and benchmarks already.
  • Gunbuster - Sunday, May 24, 2015 - link

    Enough with the teases. Come out with some terrible box art and marketing. Ruby in a leather thong, please!
  • rupaniii - Thursday, May 28, 2015 - link

    That looks refreshingly short, even if I do have an EATX case, lol. Okay, I have 2 EATX cases ;) But a smaller card can't hurt if it's HBM
