17 Comments
CajunArson - Tuesday, June 5, 2018 - link
Yeah, 10X as many "gamers" are running Vega compared to the "previous generation" (whatever the hell that was). I'm assuming that in AMD-world cryptocoin mining counts as a game of Dwarf Fortress, and clearly the only GPU that can give you ASCII fortress love is Vega.
Dragonstongue - Tuesday, June 5, 2018 - link
Putz, lol. They count every desktop GPU-class product sold as a sale, just like a car maker would count every car sold as a sale. They don't count racing cars as normal cars, and AMD doesn't count "pro" cards as a "normal" sale. Notice the part where it says PC and console; did it say workstation, server, etc.? LOL.
NV would count every single graphics "core" as one sold, no matter what product it ends up in (car, laptop, Tegra, Nintendo Switch, pro workstation, etc.).
As far as "previous generation" goes, that would be the RX 400/RX 500 series (really the same thing apart from clock-speed bumps); they have basically switched to Vega top to bottom. I honestly hope they at least try to cater to more mainstream customers who have no need, or desire, to shell out many hundreds of dollars for something they can't effectively use (burning power for nothing), but who also don't want to be hamstrung by a lack of performance when they need the oomph.
Baseline example: the performance and power difference from RX 550 to RX 560 is not much extra power for quite a bit of extra performance (and a ~$70 jump in price, thanks to greedy AIBs and resellers), but the 560 is held back by a 128-bit memory bus.
RX 560 to RX 570: quite a bit more money and a noticeable jump in average power consumption, but at least the RX 570 isn't hamstrung by a measly 128-bit bus.
RX 570 to RX 580: a fair jump in price and power consumption (under average use, and much more when clocked up or in poorly optimized games/apps), because these cards are absolutely not being sold near MSRP; neither is the RX 560, and often not even the dirt-cheap 550. The RX 580 really isn't much of a performance jump compared to the 560-to-570 step, and of course resellers often price the 570 identically to the 580... like buying a V6 for the same price as a V8 >:(
An overclocked RX 570 lands within a few percent of an RX 580; going from there to Vega 56 is quite a jump in power as well as price for a so-so jump in performance, not enough to call it "a full-on 1440p graphics card."
Vega 56 to Vega 64: IMO absolutely not worth it unless your wallet is bigger than your brains, since with the 56 you can take a moment to optimize the voltages and it will run all day long at full boost, whereas the 64 will usually throttle constantly (too many transistors under the same cooler, or something).
-----------------------------------
I honestly wish AMD would look back at their own history a touch. The 4870 caught NV by surprise big time; the 5870 was a further push and went neck and neck with NV (more power, but the performance to back it up); the 68/69xx generation was a further optimization of, and push beyond, the 5xxx generation; and the 77/78/79xx cards were quite awesome for what they were (IMO).
Back then they decided on a "sweet spot strategy": design for the sweet spot, then cut it down or beef it up to address the rest of the market (this is what let the 4xxx series rain on NV's parade for ~9 months and forced NV to cut prices to compete). It was working quite well for them, but they seem to have stopped really doing it, at least since the 7xxx generation. In the "budget" and "mainstream" segments they now push parts that are either "not quite good enough" or "chew too much power for little extra performance": not quite potent enough, or an engine too large for the job.
720p-1080p-1200p-1440p-4K: I'm sure if they did some homework they'd see that this is what folks actually "need". A 720p-class card can turn a few things down to play at 1080p, a 1080p card can turn down for 1200p, a 1200p card for 1440p, whereas 4K is pretty much a league of its own.
They really need to sort out that the "budget" machines (under $170) are generally just fine with a 128-bit bus, while the mainstream ones ($200-$300) pretty much have to have a 256-bit bus or performance is really held back. For the performance-at-any-price folks (high resolution, i.e. 1440p/3K-4K etc.), that is where they need to really focus: proper voltage optimization so the card doesn't throttle, and/or a stupidly wide memory bus ^.^
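As a rough illustration of why bus width matters so much here, a quick back-of-the-envelope sketch; the data rates are ballpark GDDR5 assumptions, not exact specs for any particular card:

```python
# Theoretical peak memory bandwidth: (bus width in bytes) x (effective data rate)
def bandwidth_gbps(bus_width_bits, data_rate_gtps):
    """Peak bandwidth in GB/s for a given bus width and effective data rate."""
    return (bus_width_bits / 8) * data_rate_gtps

# Ballpark GDDR5 configurations (assumed figures, not exact card specs)
configs = [
    ("128-bit @ 7 GT/s (RX 560-class)", 128, 7.0),
    ("256-bit @ 7 GT/s (RX 570-class)", 256, 7.0),
    ("256-bit @ 8 GT/s (RX 580-class)", 256, 8.0),
]

for name, width, rate in configs:
    print(f"{name}: {bandwidth_gbps(width, rate):.0f} GB/s")
# Doubling the bus width doubles peak bandwidth at the same data rate,
# which is why 128-bit parts run out of headroom above ~1080p.
```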
Back to the drawing board: make every watt count, shut off what isn't being used, and let us, the USERS, go over voltages and clocks with a much finer-toothed comb, etc.
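And on the "make every watt count" point, a tiny sketch of the usual first-order dynamic-power approximation (power scales roughly with frequency times voltage squared); the clocks and voltages below are illustrative assumptions, not measured Vega numbers:

```python
# First-order dynamic power: P ~ f * V^2 (ignores static/leakage power)
def relative_power(freq_mhz, volt_mv, ref_freq_mhz, ref_volt_mv):
    """Power at (freq, volt) relative to a reference operating point."""
    return (freq_mhz / ref_freq_mhz) * (volt_mv / ref_volt_mv) ** 2

# Illustrative operating points only (not measured Vega values)
stock_mhz, stock_mv = 1590, 1200
undervolt_mhz, undervolt_mv = 1590, 1050   # same clock, lower voltage

ratio = relative_power(undervolt_mhz, undervolt_mv, stock_mhz, stock_mv)
print(f"Same clock at lower voltage: ~{ratio:.0%} of stock dynamic power")
# Roughly 77% here, which is why an undervolted card can hold its boost
# clock all day while a stock card keeps bumping into its power limit.
```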
Anyway, I personally think the price-to-performance ratios for both NV and AMD have been out to lunch for the last two years. Below $200, performance is really not that good for the power being used; in the ~$225-$350+ range there is a boatload of extra power consumption and a pretty large "core difference" in shader, TMU, and ROP counts, memory bus width, etc.
I quite liked the Radeon 4xxx-5xxx-6xxx-7xxx generations because there was a decent stepping stone no matter the size of your wallet: esports, high-end gaming, elite gaming, all-out enthusiast. Now we basically have two "budget" points, a mainstream tier, and then a jump straight to either "wow, that is pricey" or "I am loaded with nothing better to do with it."
wumpus - Sunday, June 10, 2018 - link
The "sweet spot" strategy made all kinds of sense back when Moore's law meant that you could regularly replace your video card and by staying "n month/years" behind "state of the art" (and replacing at a similar interval) you could keep having great cards at a small fraction of "state of the art". Now we have to deal with post-Moore's law where a <7nm process is a *long* way off.Another big change was that the "sweetspot" could concentrate on powering 1080 monitors. Vega couldn't quite handle 4x, and newer GPUs should be expected to run 4x@60Hz (especially if/when Samsung ships 4x TVs with freesync).
Granted, this is all from a desktop viewpoint, and the desktop is all but dead (from a sales perspective). The "sweet spot" probably still sells well for notebooks, but AMD is still getting creamed on power efficiency by NVIDIA.
Finally, if AMD even thinks about selling the "sweet spot" to desktop users (who refuse to replace their trusty monitors), they can expect a flood of competition from cryptominers dumping their mining cards. I'd avoid the "desktop sweet spot" like the plague if I were designing a GPU. And to be honest, the most exciting parts (from an "if I were AMD I'd love to sell that" angle) are the Raven Ridge APUs; any work on graphics is likely to be required to work well in an APU.
sonichedgehog360@yahoo.com - Tuesday, June 5, 2018 - link
Acer: Uh, uh, uh...
CajunArson - Tuesday, June 5, 2018 - link
Regarding that bloated "Predator" notebook running a desktop chip and GPU: I saw the same thing last year and it never shipped. A minor tweak from a 1-series Ryzen to a 2-series Ryzen ain't doing it.
Targon - Thursday, June 7, 2018 - link
Actually, the Asus ROG laptop with the Ryzen 7 1700 shipped and has been available from many places: https://www.bestbuy.com/site/asus-rog-strix-gl702z...
.vodka - Tuesday, June 5, 2018 - link
That ASUS guy casually mentioning nV graphics in their Ryzen notebook... LOL
CajunArson - Tuesday, June 5, 2018 - link
AMD said that Ryzen's IGP was more powerful than some discrete graphics. Just not *those* discrete graphics, apparently.
tamalero - Tuesday, June 5, 2018 - link
LOL at that burn where AMD mocked the HEDT processor demo that needed a chiller and custom cooling.
Hxx - Wednesday, June 6, 2018 - link
Wow, AMD is on fire. Threadripper releases in Q3, the Vega refresh releases in Q4 (I mean 2H), and wow, it's gonna be a great time to be a PC gamer again starting in Q3. I am almost certain Intel and NVidia will respond... can't wait.
eva02langley - Wednesday, June 6, 2018 - link
AMD just stole Nvidia's thunder big time. I am sure they were not expecting Vega on 7nm this year. If the rumors are true, 7nm Vega is between 55-70% more powerful than 14nm Vega.
I was not expecting AMD to have a good hand against Tesla, but Vega might finally be what it was supposed to be.
I also hope AMD will introduce an HBM-free version to cut prices for the mid-range. Vega needs to go mainstream.
FreckledTrout - Wednesday, June 6, 2018 - link
Vega is just a stepping stone to Navi, although it's sounding like a nice bump. They needed HBM2 memory and 7nm before Navi is possible. I'm reading into it a bit, but I suspect that if you're going to "glue" chips together like Zen did, you need shared memory, which means very high bandwidth (i.e. HBM2), and you need to be able to cool a multi-chip GPU, which is where 7nm comes in.
Rudde - Wednesday, June 6, 2018 - link
AMD's slides mention a >35% performance increase for 7nm. I wouldn't expect consumer cards before 2019 (professional cards this year). They should launch the 600 series in a similar timeframe. I too hope we'll see GDDR5 on budget models.
FullmetalTitan - Thursday, June 7, 2018 - link
I'm not sure what the price handicap for using HBM2 will be by next summer. The high-volume memory producers (SK Hynix/Samsung) are making a ton of HBM2 with pretty good yields now (packaging was the tough part in the first generation), and they aren't really increasing GDDR5X production.
I don't think prices will be at parity by any stretch, but cost per GB/s of bandwidth might be. If you match THAT metric, HBM2 is a no-brainer, since it wins in a lot of other critical areas (board real estate, power consumption, heat output).
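For anyone who wants to see how that cost-per-bandwidth comparison shakes out, a minimal sketch; every price and spec below is a made-up placeholder for illustration, not actual 2018 memory pricing:

```python
# Cost per GB/s of bandwidth = total memory cost / total peak bandwidth.
# All figures are hypothetical placeholders, not real pricing data.
def cost_per_gbps(total_cost_usd, bandwidth_gbps):
    return total_cost_usd / bandwidth_gbps

# Hypothetical 8 GB configurations
memory_options = {
    "GDDR5X (256-bit)":             {"cost": 60.0,  "bandwidth": 320.0},
    "HBM2 (2 stacks + interposer)": {"cost": 120.0, "bandwidth": 484.0},
}

for name, cfg in memory_options.items():
    per_gb = cfg["cost"] / 8
    per_gbps = cost_per_gbps(cfg["cost"], cfg["bandwidth"])
    print(f"{name}: ${per_gb:.2f}/GB capacity, ${per_gbps:.2f} per GB/s")
# With these placeholder numbers HBM2 costs 2x per GB of capacity but only
# ~1.3x per GB/s of bandwidth, before counting its power and board-area wins.
```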
zodiacfml - Thursday, June 7, 2018 - link
Looks like no new graphics this year.
nathanddrews - Thursday, June 7, 2018 - link
FreeSync in Samsung TVs. To quote POTUS: "This is a 'uuuuuuge' deal!" The more FreeSync we get into ALL displays, the more absurd NVIDIA's BFGD (and G-Sync in general) will appear. It's all a precursor to HDMI 2.1's VRR Game Mode. Can't come soon enough.