124 Comments

  • mdw9604 - Thursday, August 18, 2016 - link

    They should have just called this a 1030. Come on.
  • Jon Tseng - Friday, August 19, 2016 - link

    GTX 1050 Ti!
  • nwarawa - Friday, August 19, 2016 - link

    100% agree. There is no reason this should not have been called a "GTX 1050 Ti". Frankly, Nvidia needs another class-action against them... apparently they didn't learn their lesson with the GTX 970. False advertising with spec-obsessed geeks is never going to go over well.
  • smilingcrow - Saturday, August 20, 2016 - link

    There's no false advertising here any more than there is with an RX 480 4GB with lower memory speeds than the 8GB model.
    Although I think it's dumb of them, it's not false. Caveat emptor
  • futurepastnow - Friday, August 19, 2016 - link

    Or even just "GTS 1060"
  • Murloc - Friday, August 19, 2016 - link

    1050, it's not that much weaker.
  • mdw9604 - Friday, August 19, 2016 - link

    Half the RAM, a disabled SM. It's a 1030.
  • Ro_Ja - Friday, August 19, 2016 - link

    1030 is too much, people would be very disappointed. Just call it GTX 1050 Ti.
  • mdw9604 - Sunday, August 21, 2016 - link

    They will be really disappointed in Xmas 2017 when BF6 comes out and needs 4GB to play at full detail and this $200 1060 won't cut it.
  • TheinsanegamerN - Monday, August 22, 2016 - link

    If Battlefield needs that much VRAM to run at full detail, I doubt the 1060 would be powerful enough to run it fully maxed out with acceptable frames anyway.
  • kpb321 - Thursday, August 18, 2016 - link

    I expected NVIDIA to release a card using a partially enabled GP106 as it just makes sense to do that. I HATE HATE HATE the fact that they still called it a 1060. IMO they should NEVER be using the same major model number for a card with significant differences like that. Yeah it's only a 10% difference but that is still a reasonable difference. Call it a 1050 or a 1055 or a 1060SE if you have to but don't call it a 1060.

    It is as bad as some of the OEM cards or laptop chips, which could have anything from DDR3 to GDDR5, or different memory widths, or different shader counts for the same model card.
  • Friendly0Fire - Thursday, August 18, 2016 - link

    Entirely agreed. It's not the first time they've used cut-down silicon; I don't know why they decided to use such a confusing naming scheme this time. I understand that they most likely have a separate 1050 in the pipeline and want this to be seen as a slightly slower 1060 rather than a beefed-up 1050, but you can still do better than this, Nvidia. There has to be an acceptable marketing choice that isn't unnecessarily misleading.
  • coinopshot - Thursday, August 18, 2016 - link

    They've done it before with the 8800 GTS 320 MB and 8800 GTS 640 MB. Not like it's new or anything.
  • e36Jeff - Thursday, August 18, 2016 - link

    Don't forget the 8800GTS 512MB. They actually had 3 different cards under the 8800 GTS nameplate.
  • AnnonymousCoward - Friday, August 19, 2016 - link

    Don't forget the 8800GT!
  • Gigaplex - Saturday, August 20, 2016 - link

    At least that one has a different name variation.
  • kpb321 - Thursday, August 18, 2016 - link

    Just because it isn't new doesn't mean we have to like it or that it is a good practice.
  • Flunk - Thursday, August 18, 2016 - link

    And the GTX 460 768MB and 1GB. They should have called this card the 1050.
  • Samus - Friday, August 19, 2016 - link

    There were at least FOUR versions of the GTX460 if I remember correctly.
  • AndrewJacksonZA - Friday, August 19, 2016 - link

    Confusing naming scheme = possible confusion from uneducated or ignorant buyers = more sales because "It's a cheaper 1060!"

    Nvidia lost some respect from me in trying to deceive people. I don't like it when companies lie and are dishonest and unethical. (And people complain about AMD's marketing department. Sheesh... *smh*)
  • Murloc - Friday, August 19, 2016 - link

    except that ignorant buyers see 3GB and think it has half the performance of the 6GB one, so I'm not sure about that.

    It's bad to name it like this, but I'm not sure they can expect a profit from this.
  • wifiwolf - Friday, August 19, 2016 - link

    There are many levels of ignorance. I see where you're coming from, but some of them will just see a cheap 1060. It's just like people buying cheap iPads not knowing they're 3 or 4 generations behind.
  • HollyDOL - Friday, August 19, 2016 - link

    Tbh, marketing people should be banned from IT :-)
    Engineers would just put up a specs table with a deterministic version and that would be it.
  • DanNeely - Friday, August 19, 2016 - link

    Should've been the GTX 1060 and GTX 1060 Ti.
  • Alexvrb - Friday, August 19, 2016 - link

    This is *EXACTLY* what I was thinking when I read the article. Unfortunately, as you know, it's too late for that. They made the decision to leave the Ti off the fully enabled version, and the resulting mess is this. Personally I don't think I'd buy anything with less than 4GB of VRAM in this price range, unless you planned on dumping it in a year. Even 4GB is a little meh; spend an extra $40-60 and get a 6/8GB model if you want a couple years of lifespan. Or if you want to run Doom maxed now. :P
  • Morawka - Thursday, August 18, 2016 - link

    Uses too much power to call it a 1050. Especially if they are gonna do a 1050M using a slightly binned (for a lower power profile) desktop GPU die.

    This product was probably created by NVIDIA as a response to the $199 RX 480. I bet it still walks all over the 480 4GB.
  • just4U - Friday, August 19, 2016 - link

    Uh.. didn't the 1060 6G need to walk all over the RX480 for this lesser product to stand a chance? It didn't so... wth is up with your comment?
  • Morawka - Friday, August 19, 2016 - link

    Yeah it did, unless you're clinging to a couple of Vulkan-only benchmarks. It won considerably in DX11 titles, by up to 30%.
  • Morawka - Friday, August 19, 2016 - link

    it won, ***
  • Alexvrb - Saturday, August 20, 2016 - link

    Depends on the title and the settings. Even with the fully enabled 6GB version in some DX11 titles they were neck and neck. It also depends on resolution. For example, I've seen Rise of the Tomb Raider DX11 mode 1440p tests that put the 1060 ahead by like... 2%. At 1080p it was a whopping 8%. Witcher 3, 7-8%. Unless you crank Hairworks way up which murders your performance anyway on high settings if you're using a "lowly" 1060. The Division also has them running neck and neck. That's DX11.

    Even less-favorable test settings for the 480 have it losing by 15% on average. I've read a good few reviews and YMMV. There are games like Battlefield or GTA V where 30% is realistic. But typically the difference is lower, and if you look at a wide range of games and settings the 3GB 1060 isn't exactly going to murder the 4GB 480.

    Then there's DX12 and Vulkan, which if utilized well (for example, titles where you actually see a speed-up on both vendors) seem to do pretty decently on the 480. Even titles like Hitman where the DX12 path was added later do better on the latest GCN. But perhaps DX12 and Vulkan are a silly fad and developers are going to abandon these efforts... they'll bring back Glide instead. Or just abandon the GPU nonsense altogether and go back to software rendering!
  • Alexvrb - Saturday, August 20, 2016 - link

    And yes, there are some reviews that use different settings and/or configurations that have Witcher 3 winning by a larger margin (again, 10-15%), but there are plenty of other examples to show the average is a lot lower than 30%. For Tomb Raider I've seen several reviews that have the 480 either losing or tying at 1440p. I've seen GTA V benches with the 480 much closer. Arkham results with them tying or the 480 slightly ahead. The 480 wins in Black Ops III, ties in Just Cause 3.

    Anyway the point is that the 30% number is disingenuous at best, and ignoring the latest APIs is downright silly.
  • Sttm - Thursday, August 18, 2016 - link

    They are cutting down the silicon, but they are also cutting down the price. If they had released it with the same performance, but from lower clocks, you wouldn't have even cared. At the end of the day a $200 1060 won't be as fast as a $250 1060... what difference does it make if that is because of clockspeed or CUDA cores?
  • ragenalien - Thursday, August 18, 2016 - link

    Because half the people on here are whiny babies for consumer protection for some reason. Honestly, they make it pretty obvious on the box why it's cheaper. Half the memory should tell an uninformed person that it has less performance than the more expensive models. It's better than the old practice of selling 4GB DDR3 cards for more than the 2GB GDDR5 cards; people bought them because the number was bigger.
  • Cygni - Friday, August 19, 2016 - link

    We are "whiny babies for consumer protection" because we ARE the consumers. Also yea, I'm sure the partners will make it "obvious on the box" that this 1060 has not just less memory, but also a cut down, less functional version of the chip itself when compared to other 1060s sitting next to it on the shelf. Uh huh. Sure.
  • osxandwindows - Thursday, August 18, 2016 - link

    Smart move.
  • Brian Z - Thursday, August 18, 2016 - link

    Typo under the GPU layout diagram: "the GPU used in the GTX 1060 3GB ships with 1 of the 10 SMs enabled. This leaves 9 SMs enabled,"
  • TheinsanegamerN - Monday, August 22, 2016 - link

    What's confusing about that? It uses all 11 of the 10 SMs.
  • shabby - Thursday, August 18, 2016 - link

    No founders edition?
  • RaistlinZ - Thursday, August 18, 2016 - link

    More like Fool's Edition.
  • Michael Bay - Saturday, August 20, 2016 - link

    Your pain is not even funny anymore.
  • vladx - Thursday, August 18, 2016 - link

    They should've just called this GTX 1050.
  • cosmotic - Thursday, August 18, 2016 - link

    "ships with 1 of the 10 SMs disabled"

    I think this implies they intentionally crippled the card to meet a price point, but it's more likely they are finding a use for chips where one of the SMs had a flaw that would otherwise prevent its use in a product that called for all 10 to be working.
  • mdw9604 - Thursday, August 18, 2016 - link

    Intel has been doing this for a couple of decades.
  • edzieba - Friday, August 19, 2016 - link

    EVERYONE who makes GPUs and CPUs has been doing this. It's standard practice.
  • MrSpadge - Friday, August 19, 2016 - link

    No, "disabled" does not imply neither crippling or die harvesting, it's just a neutral formulation. In practice it's going to be both: die harvesting first, and simply disabling weak but functional SMs to meet any further demand of those cards.
  • Alexvrb - Saturday, August 20, 2016 - link

    As Spadge said, this doesn't imply anything sinister. They all do it to sell more chips and boost the "effective" yield.
  • ToTTenTranz - Thursday, August 18, 2016 - link

    10th paragraph, second sentence:
    "The latter has been in very short supply since its launch, so at this second NVIDIA has a pretty solid grip on the $199 price point at this secnd."

    Typo on the second "second", which is also being written for a second time in this sentence.
  • Jtaylor1986 - Thursday, August 18, 2016 - link

    Whether rightly or wrongly I can almost guarantee that someone will manage to file a class action suit over this in the future.
  • zmeul - Thursday, August 18, 2016 - link

    someone(s) in nVidia's marketing department should seek new employment - the new Pascal-based Titan, also called Titan X; a new GTX 1060 that's not a GTX 1060
    oh, by the way, the new Titan X is a deep learning card - even if it has a bright green GeForce GTX label on it
    for fuck's sake, nVidia...
  • MrSpadge - Friday, August 19, 2016 - link

    "oh, by the way, the new Titan X is a deep learning card"

    No, it's a graphics card which is suited very well to deep learning, among many other things.
  • zmeul - Friday, August 19, 2016 - link

    Quadros are graphics cards too, and yet they don't have the GeForce GTX logo on the side.
  • LarsBars - Thursday, August 18, 2016 - link

    Since we dislike this move by Nvidia so much, let's make sure we don't buy the card, to send a message to them.
  • LordanSS - Thursday, August 18, 2016 - link

    Even though I own a 980GTX, this is my last nVidia card.

    I got bitten by the 8800GTS, which was my last nVidia card until this..

    Fool me once, shame on me. Fool me twice, well.... I do not condone these practices. And I shall vote with my wallet.
  • Michael Bay - Friday, August 19, 2016 - link

    So what will your next last nV card be?
  • LordanSS - Friday, August 19, 2016 - link

    My next card will be a Vega. And if the DX12 multiGPU features manage to finally get rid of the SLI/Crossfire issues, I'll have another.

    Not touching team green again, but you're free to troll as you will.
  • D. Lister - Saturday, August 20, 2016 - link

    @LordanSS

    That's good, AMD certainly could use your money. One more customer would probably raise their quarterly profit by a couple of percents.

    Incidentally, you gotta love it when some random poster comes out of nowhere to make a lopsided remark and then when a site regular points to a gap in his logic with a harmless quip, he calls HIM a "troll". I would say, "well played", but that would be unreasonably generous.
  • sorten - Thursday, August 18, 2016 - link

    No big deal. One of the rare times that uninformed consumers will be getting the better deal. It's a 20% drop in price for a 5% hit in performance.
  • Nagorak - Thursday, August 18, 2016 - link

    But a real hit to potential longevity due to only 3 GB, and people buying a card at this price point generally aren't the type who upgrade every year.
  • Michael Bay - Friday, August 19, 2016 - link

    It should hold its own for a year or two if you consider that target audience is on 1080p and probably not much into demanding titles.
  • Jon Tseng - Friday, August 19, 2016 - link

    Yeah the GTX950 was aimed at Chinese free to play MMORPG gamers. If this is hitting the same segment (maybe a tad more expensive?) then 3GB is enough.

    F2P games never push the graphics limit because the business model is predicated on having a big installed base to monetise - setting the specs bar too high would be counterproductive.
  • MrSpadge - Friday, August 19, 2016 - link

    Well, it's got the "3 GB" sticker right in the name. So don't blame nVidia for people not noticing. And if the VRAM becomes a limit, the 5% computing performance difference won't matter at all.
  • Nagorak - Thursday, August 18, 2016 - link

    A 5% reduction in speed puts it less ahead of the RX 480 in DX11 (the cards were almost neck and neck before, with the 1060 just a hair ahead) and way behind in DX12. Not to mention the extra 1GB of memory pushes back the day when that becomes a problem. I think the 4GB RX 480 is honestly the better buy between these two. It can play current games just fine, and it will do better with future games.
  • marees - Friday, August 19, 2016 - link

    Considering that the RX 480 4GB was a virtual launch (it had 8GB of physical memory), you should compare it against the RX 470 4GB, in which case the 1060 comes out stronger.
  • Dark Man - Friday, August 19, 2016 - link

    Is '1050' a bad name?
  • JamsCB - Friday, August 19, 2016 - link

    Naming issues aside, this is almost certainly about yields, and being able to still sell the part rather than trash it. They obviously had a noticeable percentage of GP106 chips in which one SM failed during testing. If they'd had a higher percentage of units with two SMs failing, that would have been what they were releasing. And as previously pointed out, this is a fairly common practice; Intel's been doing it for decades.
  • webdoctors - Friday, August 19, 2016 - link

    I woulda called it the 1060SE, or 1060 VE (value edition) or 1060 CE (cheap edition) or just 1055GTX, but that's just me...
  • benzosaurus - Friday, August 19, 2016 - link

    I give it about five minutes before manufacturers start taking advantage of this to sell cut-down 6GB cards, so no one will ever be able to find out how fast the card they're buying is.

    Still not as bad as the 850M, where you couldn't even tell whether the one you were getting was Kepler or Maxwell.
  • MrSpadge - Friday, August 19, 2016 - link

    No, nVidia actually maintains tight restrictions on what their partners can and can't do - if they choose to do so. Example: no custom Titan X, no 4GB GTX 1070 & 1080.
  • benzosaurus - Friday, August 19, 2016 - link

    Hmm. Good to know. Maybe there is hope.
  • D. Lister - Friday, August 19, 2016 - link

    This is very reminiscent of the GTX 260 launch, when Nvidia eventually released two different pieces of hardware under the same name. Nonetheless it is fairly asinine to needlessly create such confusion.
  • silverblue - Friday, August 19, 2016 - link

    Good memory. The 260 Core 216 replaced the vanilla 260 but I'm sure there would've been older cards still floating around.
  • Michael Bay - Friday, August 19, 2016 - link

    Would have been nice as a HTPC buy if not for that TDP.
  • Murloc - Friday, August 19, 2016 - link

    Maybe a 1050 is coming with a lower TDP.
  • MrSpadge - Friday, August 19, 2016 - link

    Lower the power target to ~80 W and you should lose about 10% of peak performance. It's a simple setting in any of the typical OC utilities.
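
    For reference, a rough sketch of the same adjustment done from a script instead of an OC utility, assuming the pynvml Python bindings are installed, the driver permits changing the limit, and the script runs with admin rights (NVML works in milliwatts):

        import pynvml

        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
        # Respect the card's allowed range; the minimum limit varies by board.
        min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
        target_mw = max(min_mw, 80 * 1000)             # aim for ~80 W
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
        print("Power limit set to %.0f W" % (target_mw / 1000))
        pynvml.nvmlShutdown()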
  • Michael Bay - Friday, August 19, 2016 - link

    I don't want an additional power plug in an already tiny case.
  • TheinsanegamerN - Monday, August 22, 2016 - link

    Agree. The 1050 is our best hope.
  • yannigr2 - Friday, August 19, 2016 - link

    @Ryan Smith

    So this GTX 1060 is not equal to the other GTX 1060.

    Then how are these cards equal?

    http://www.geforce.com/hardware/desktop-gpus/gefor...

    I have never seen the press this upset about two cards sharing the same name while having only small differences in their specs. I mean, in the case of the GT 730 you have three totally different cards, I mean TOTALLY, and no tech site said anything. Now every publication is upset.

    Well, while I do like seeing Nvidia having to explain its business practices for once (it was always AMD that was the accused), I do find this peculiar. Until now only one tech site has tried to answer this question for me. Most sites pretend to ignore the existence of the GT 730.
  • MrSpadge - Friday, August 19, 2016 - link

    "... and no tech site said anything"

    That's wrong.
  • yannigr2 - Friday, August 19, 2016 - link

    No. You are wrong. The tech press didn't see anything strange in cards having the same name but completely different specs.
  • nightbringer57 - Friday, August 19, 2016 - link

    Such a mess in the lower-range models is quite usual, and most observers have given up trying to complain about it. Especially because those models are typically so low-end that you can barely play on them at all, so this doesn't make any real difference.
    While still not uncommon, it is rarer and more noticeable on higher-end models, where performance matters a lot more.
  • stardude82 - Friday, August 19, 2016 - link

    The Kepler GT 730 is sort of an important part as it is the best card you can find at the 25W point which is the limit for a non-PEG slot. It'll handle Source based games fine.
  • BrokenCrayons - Friday, August 19, 2016 - link

    I have a GT 730 with 1GB GDDR5. Based on my gaming experiences so far, I've had few to no problems running most modern games as long as I keep the resolution down to 1366x768 and don't turn on AA which is a highly unnecessary feature anyway. Other settings can wander upward toward the higher end. It's a pretty good GPU though I do regret not picking up the 2GB GDDR5 version for a little more cash.

    In my experience, people in the AT comments box tend to vastly overestimate the amount of GPU they "need" to be entertained and have a pretty unrealistic understanding of how good things are in the $60 GPU universe. It's not a perfect experience and I am going to forgo playing a few games until I get around to buying a newer more powerful low end GPU...though I admit that computers have become a lot more of a dissatisfaction than a source of entertainment since Microsoft went over to the Dark Side to mine data from their users so I'm heavily leaning toward Linux-ing my desktop and playing Tux Racer for the rest of my life.
  • yannigr2 - Friday, August 19, 2016 - link

    That GT 730 is a great little card for what you do. The 40GB/sec bandwidth is enough for most games at 768p resolutions and lower settings. If you were unlucky enough to choose the DDR3 Kepler version, which comes with 14GB/sec bandwidth at best, you wouldn't be able to play at any resolution without dialing all settings to low and accepting framerates between 10 and 30 fps.
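
    As a quick back-of-envelope check of those two figures (assuming the usual 64-bit bus on both GT 730 variants): bandwidth is just the bus width in bytes times the effective memory data rate.

        def mem_bandwidth_gb_s(bus_width_bits, effective_mt_s):
            # Bus width in bits, effective transfer rate in MT/s, result in GB/s.
            return (bus_width_bits / 8) * effective_mt_s / 1000

        print(mem_bandwidth_gb_s(64, 5000))  # GDDR5 GT 730: ~40 GB/s
        print(mem_bandwidth_gb_s(64, 1800))  # DDR3 GT 730: ~14.4 GB/s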
  • Icehawk - Saturday, August 20, 2016 - link

    I honestly can't remember the last time I gamed at 1366x768, sometime well before 1998 though. Personally I would find that to be a terrible experience - I am gaming at literally 2x that resolution. Nor would I accept any settings below Medium and at that point I am considering my next card upgrade. Of course I am not in the same market or price bracket and if you are happy that is great.

    I play at QHD, High or greater settings, and AA (which is necessary unless you like jaggies even at high resolution and more so at lower ones). I have a 970 and that's the minimum I would suggest for that setup unless you like a slide show. 1080p or below, and yes, the firepower needed is less, but for AAA titles I still wouldn't go below a GTX 660.

    I would venture a large percentage of AT posters play AAA titles at 1080p, High settings, and some AA at the minimum and that is why you are seeing folks like me who think a 730 is essentially worthless.

    And yes, I guess I could be "entertained" for less, I mean I used to be pretty thrilled with no color text games (hello Zork!)... but I need to get my Overwatch, No Man's Sky, Doom, etc on.
  • yannigr2 - Friday, August 19, 2016 - link

    It's the best card if it is the GDDR5 version. The DDR3+Kepler version is just a modern G210.
  • yannigr2 - Friday, August 19, 2016 - link

    Most people wanting to just defend Nvidia's business choices were responding with a "who cares, it's a low-end card". Well, unfortunately the low-end graphics card category is no longer important, which means that whatever tricks companies were doing in the sub-$100 market are coming soon to the sub-$200 market. That's why you never say "who cares about that card".

    And you are wrong. You can happily play on the GDDR5+Kepler card. It's a fast little card that will even let you enable some medium-quality options in modern games and still get nice framerates over 30fps. You can't play with the DDR3+Kepler version of the card. It comes with only 14GB/sec memory bandwidth, which means that you can play only some extremely light games that would be happy with, for example, an old GT 220. Unless you like playing at 15fps.
  • Ryan Smith - Friday, August 19, 2016 - link

    "Most sites pretend to ignore the existence of GT 730."

    The GT series is its own bag of crazy. Which is why I specifically mention the GTX series here, as these shenanigans shouldn't be happening with NVIDIA's premium cards.
  • yannigr2 - Friday, August 19, 2016 - link

    Well, APUs and Intel's integrated graphics are killing the low-end market. So this bag of crazy is coming to the lowest GTX models now.

    Nvidia brought Titan to the market because it needed to start recreating (8800 GTX Ultra, anybody?) some higher price points. So they first brought Titan and then the high-end Ti card. So, while in the past you only had one high-end model, the GTX 480 for example, now you have three: the GTX 1080, the new Titan X, and the GTX 1080 Ti that will probably come later.

    So, Nvidia is moving its products to higher price points. And it doesn't drop the GTX name on the lowest models, probably for marketing reasons. I am not expecting a GT 1040, and the 1050 is coming as a GTX. For me there is nothing strange about looking at two different GTX 1060s. I have seen it before. Yes, in cheaper cards, but as I said in the beginning there is no low-end market anymore. There could be, if the GTX 1050 and RX 460 cost $60. Well, they don't.
  • extide - Saturday, August 20, 2016 - link

    Well, unfortunately that happens all the time in those low-end GPUs, which seem to be made from whatever is lying around at the time. I specifically remember this site lambasting them for this practice in the past; not sure if it was the GT 730 scenario or not, but the point is, no, this isn't the first time they have bitched about it.
  • MrSpadge - Friday, August 19, 2016 - link

    "Clockspeeds are also unchanged... Consequently the total performance hit to the GTX 1060 3GB as compared to the original GTX 1060 6GB will be a combination of the reduced memory capacity and the loss of 10% of the shading/texturing/geometry resources."

    The disabled SM reduces the power consumption (everything else kept similar), so the card can boost higher for a given power target. That's likely the main reason why 10% fewer shading/texturing/geometry resources only cause a ~5% performance drop.
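
    Purely illustrative back-of-envelope version of that reasoning: if performance scaled only with enabled SMs, 9 of 10 SMs would give ~90%, and a modest sustained-boost uplift from the freed power headroom (the ~5% figure below is an assumption, not a measured number) narrows the gap toward ~95%.

        sm_fraction = 9 / 10      # 1 of 10 SMs disabled
        clock_uplift = 1.05       # hypothetical ~5% higher average boost clock
        relative_perf = sm_fraction * clock_uplift
        print("~%.1f%% of the full GTX 1060 6GB" % (relative_perf * 100))  # ~94.5%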
  • nunya112 - Friday, August 19, 2016 - link

    Nvidia should get into trouble for making this a 1060. It has a lower core count, which makes it a different chip.
    It should be a 1050, with fewer CUDA cores than a 6GB 1060.
  • r3loaded - Friday, August 19, 2016 - link

    Should've called it a 1060 LE to differentiate it properly. I get that they want to get a cheaper card out there with harvested GP106 cores, but this is a terrible way to do it.
  • NamelessTed - Friday, August 19, 2016 - link

    I can understand some of the issues with the naming scheme. It's maybe not the best but I don't feel that it's misleading or confusing if they are labeling 3GB vs 6GB.

    Ultimately, 20% cheaper for a likely 5-10% performance difference sounds like a great value. The loss of extra VRAM is mostly a non-issue for a card in this range, IMO. It might be the difference of 4-5fps at max settings and low frame rates, but closer to a 1-2fps difference with settings adjusted to get into a 60-70fps range.
  • extide - Saturday, August 20, 2016 - link

    Having more or less RAM on the card doesn't really affect frame rate like that -- you either have enough memory or you don't, and if you have enough memory then adding more doesn't help any. If you don't have enough then there is a very large performance drop-off.
  • Shadowmaster625 - Friday, August 19, 2016 - link

    How are you supposed to find these cards on Newegg? A power search of ALL video cards with 3GB capacity yields only 3 results, none of which are GTX 1060s.
  • damianrobertjones - Friday, August 19, 2016 - link

    'Yet Not'.

    ?
  • Leyawiin - Friday, August 19, 2016 - link

    This is exactly like the GTX 460 768MB vs. the GTX 460 1GB. In fact it's less different, given the two GTX 460 models had different memory bandwidth. No one was soiling their pants over those two back in the day. Then again, it would have been less confusing to have named it GTX 1055 or something.
  • neblogai - Friday, August 19, 2016 - link

    People in this price range do not change cards every year like those who have to own the latest and fastest all the time. They (and I) buy cards to last ~3-5 years. 3GB of memory is definitely not enough. It must be suffering bad frame times in some games already.
  • TheinsanegamerN - Monday, August 22, 2016 - link

    If you were in that bracket, you would also know they are not pushing ultra@1080p. Many are playing at lower resolutions, with lower details, with less demanding games. Even 2GB is more than enough for that.
  • Dangerous_Dave - Friday, August 19, 2016 - link

    Similar differences here as between the RX470 and the RX480. Why on earth did Nvidia give it the same name?
  • shadowbearer - Saturday, August 20, 2016 - link

    Idk, I've been looking in the $200 range and the RX 480 has been sold out on Newegg for over 2 weeks. I saw that this just came out and put my order in for one. I don't play at over 1080p; my monitor is 1050p and I'm upgrading from an HD 7950 Twin Frozr. From what I've heard, at 1080p the 3GB will work just fine. Someone looking at a $200-250 card probably can't afford a monitor above that resolution to start with. If I wanted to go up from that resolution, though, I'd start to see the negative effects of only having 3GB to work with. Worst case I can flip the card on eBay, maybe even for more than the $209 I paid for it with shipping and handling included. I'm excited for it though.
  • m1ngky - Saturday, August 20, 2016 - link

    So this is why my 1060 is labeled as a GTX 1060 6GB in Device Manager.
  • versesuvius - Saturday, August 20, 2016 - link

    NVidia knows that it has lost this round to AMD, and is trying to use the opportunity that the lack of availability of AMD cards has provided to make as much money as it can, by resorting to dishonest practices such as labeling a disabled, dumbed down 1060 as a genuine 1060. This kind of dishonesty cannot exist in isolation within the company. It is a culture of dishonesty and cheating within NVidia.
  • hojnikb - Saturday, August 20, 2016 - link

    But when the red team does it with laptop chips (and there is more than one instance where a single model covers two different chips), no one bats an eye.
  • versesuvius - Saturday, August 20, 2016 - link

    It is also dishonest to compare laptops with desktop discrete GPUs that come in shiny boxes with GTX 1060 prominently displayed on them. Rest assured that 9 out of 10 (and even that is a conservative estimate) laptop owners do not know or even care what graphics system is driving their displays as long as they can read their emails and do some browsing or typing on the move.
  • hojnikb - Saturday, August 20, 2016 - link

    It's the same thing though. The GPU type is advertised on the laptop, and because of this you don't know what you're getting.
    So both companies engage in dishonest practices.

    At least with a desktop GPU, you know what you're getting, if you check the box.
  • D. Lister - Saturday, August 20, 2016 - link

    "NVidia knows that it has lost this round to AMD,"

    Hey, I wanna play "opposite day" too. Okay my turn, "you sound very well-informed and neutral".

    "by resorting to dishonest practices such as labeling a disabled, dumbed down 1060 as a genuine 1060."

    Now if only Nvidia started selling this new one at the same price as a regular 1060, your comment would actually be worth a penny.

    "This kind of dishonesty cannot exist in isolation within the company. It is a culture of dishonesty and cheating within NVidia."

    Exactly, like that time they said their GPU was an overclocker's dream, and it couldn't even run on default clocks without water-cooling. Yeah, TOTALLY dishonest.
  • versesuvius - Sunday, August 21, 2016 - link

    To point out AMD's claims or propaganda, not specs, as a sign of dishonesty just confirms the fact that even supporters of NVidia acknowledge the systematic dishonesty of NVidia. Nvidia is not called the "Dark One" for nothing. Though to be honest, one envies the users of NVidia cards for the monies that they'll be getting back from class-action suits against NVidia for the foreseeable future.
  • D. Lister - Sunday, August 21, 2016 - link

    "Nvidia is not called the "Dark One" for nothing."

    lol, some people say the inevitable demise of the AMD brand would be a bad thing for the market. I think the bigger loss would be the AMD fanboys not being able to indulge with such amusingly ridiculous hyperboles. Simple minds live in a simple world, and it doesn't get any simpler than "black and white".
  • Michael Bay - Monday, August 22, 2016 - link

    No, they'll go into total overdrive. After all, with AMD being gone, they are the trve vnderdogs now, fighting The Man and all.
  • Michael Bay - Monday, August 22, 2016 - link

    >The Dark One

    You can't be serious.
  • versesuvius - Monday, August 22, 2016 - link

    I am quite serious. I do envy NVidia Adherents for the monies that they will be getting back from NVidia. It is win-win-win all the way for them. They buy their DarkOne cards. They go out and buy some cheap DirectX 11 games and enjoy them while the DarkOne agrees on the next class-action settlement and pays the Adherents their share of the loot back, and by then the DirectX 12 games are cheap and prevalent enough that they can go and buy some DirectX 12 games and keep on enjoying themselves. It was $30 for the GTX 970. It may be higher for the 10 series. That could buy the Adherents as many as three DX12 games. On the other hand, AMD card owners will get no money back. Nothing. Absolutely nothing. Somebody should take over that AMD, and fast.
  • Achaios - Saturday, August 20, 2016 - link

    The year is 2002. February 6. NVIDIA introduce the GeForce4 line "GeForce4 Ti 4600, 4400, 4200" based on the NV25 chip.

    The same day NVIDIA introduce their GeForce4 MX line, the GeForce4 MX 460, 440, 420, which are based... on the GeForce2 (NV17) cards, although labelled as GeForce4 cards.

    So yeah, we've seen all that dishonesty and underhanded practices from NVIDIA before.
  • D. Lister - Sunday, August 21, 2016 - link

    Okay, so basically you're saying that "rebranding" is dishonest, right?
  • D. Lister - Sunday, August 21, 2016 - link

    The year is "last year", the date doesn't matter, AMD introduce the 3xx line...
  • Michael Bay - Monday, August 22, 2016 - link

    Funny that you should mention it. I was thinking about getting a cheap HP laptop (their "15" line, it doesn't have a brand name), and the only difference between the 2015 and 2016 models, outside of the Windows version, is, I kid you not, the presence of the R5 M330. I guess the stale inventory got so big, AMD is just giving those away.
  • D. Lister - Monday, August 22, 2016 - link

    The GPU may be free-ish, but the drivers will "cost you". :p
  • Michael Bay - Monday, August 22, 2016 - link

    Well, if it can do something for Photoshop and movies, I'm fine. Not expecting to play on it or anything.
  • crashtech - Saturday, August 20, 2016 - link

    It's amazing that they are calling this cut-down card a 1060, but of course they will get away with it.
  • Hxx - Sunday, August 21, 2016 - link

    The naming will just confuse people. Maybe that's their goal. I can totally see folks going for a 1060 "on sale" and later finding out that it's the cut-down version.
  • MarkieGcolor - Monday, August 22, 2016 - link

    How much for the founders edition?
  • MarkieGcolor - Monday, August 22, 2016 - link

    Shenanigans!

    You would think they would learn their lesson after selling 3.5GB cards... When do I get my $60 back for my 970 SLI?
  • Mugur - Wednesday, August 24, 2016 - link

    Well, I have to admit that NVIDIA will "get away with it" if the performance is close enough to the 6GB 1060 at 1080p. What I don't like is the price point. It means that when the 1050 appears (128-bit bus, 2GB GDDR5), it will be priced at $150 or more. What about the 1050 with 4GB that I'm sure will make an appearance? :-)

    So, although we saw a "democratization" of "VR" level performance (whatever this means, since I bet the owners of the 2 headsets are no more than 1% of the total owners of the cards that can support them), which is great overall, we won't see it from top to bottom, but only in the mid-range.

    And I believe that the 1050 will destroy the RX 460 (higher clock speeds and better compression, combined with a 128-bit bus for both). Not good for AMD. It's funny that even though they have a good lineup (460, 470, 480), they are beaten (not by much, but anyways) in every category. Damn GloFo and its manufacturing processes... They managed to screw up every bit of innovation AMD had in the past 10 years. First CPUs and now GPUs.
  • farturd - Thursday, August 25, 2016 - link

    The RX 470's reference model is actually $180.
