In what looks like it may be building momentum, AMD has released another new driver update. As this is a point release the changes aren't immense, but AMD has pushed through some stability fixes along with other improvements to prepare its drivers for some forthcoming game releases. As always, this is all welcome news.

First off there are a few notes on Ashes of the Singularity. Along with performance and quality optimizations, there is also a fix for a 'Driver has stopped responding' error that could show up while playing in DirectX 12 mode. AMD does note, however, that issues remain with the game crashing on some AMD 300 series GPUs, and that Ashes of the Singularity may fail to launch on some 2GB cards.

For Star Wars: Battlefront, high-performance graphics can now be used on devices running switchable graphics. AMD is also aware of a small number of users experiencing crashes in GTA V on some AMD Radeon R9 390X GPUs. Changes have been made that should resolve the issue, and AMD will continue to monitor user feedback on the problem.

On the stability front, AMD has fixed a couple of TDR errors that caused a crash when toggling between minimized and maximized mode while viewing 4K YouTube content or running the Unreal Engine 4 DirectX benchmark. Additionally, MPEG2 playback issues and intermittent playback issues in CyberLink PowerDVD when connected to a 3D display through HDMI have both been resolved. Lastly, a problem with driver installation halting on some configurations has been fixed.

Those interested in reading more or installing the drivers for AMD's desktop, mobile, and integrated GPUs can find them on AMD's Catalyst beta download page.

Source: AMD

Comments

  • Morawka - Thursday, October 15, 2015 - link

    The fact that you had to bring up all that irrelevant stuff makes your argument look even weaker.

    My point was, When Indie, AA and AAA Games are released, Nvidia has Day 1 Drivers ready AND HAS SLI Profiles, AND Geforce Experience Settings... oh did i mention they do all of this and still make WHQL Certification?

    DirectX12 is not used in any games so that's irrelevant.
    Price, Of course AMD will win Performance Per Dollar, That's because nobody's buying them. Nvidia has 75% GPU market share for crying out loud.

    The whole windows 10 thing is blown out of proportion and i have yet to have 1 single problem. You wanna talk about bad windows 10 gpu driver support, start talking about Intel.
  • RussianSensation - Friday, October 16, 2015 - link

    I don't understand the point of your post. AMD's drivers for single GPUs have not only allowed GCN to wipe out all the advantage Kepler GPUs have had over the last 3 years, but cards like the 380 and 390 are now beating their direct competitors, the 960 and 970. Again, if AMD's drivers were so "terribad", how is that possible?

    "My point was, When Indie, AA and AAA Games are released, Nvidia has Day 1 Drivers ready AND HAS SLI Profiles, AND Geforce Experience Settings"

    So now we are bringing SLI vs. CF into the mix? Roughly 300,000 gamers worldwide have SLI/CF and Steam has over 125 million users. Moving on then. Even on this topic, you ignore hard scientific data. All I read from you is an opinion:

    "Where the Fury X Crossfire setup won big was in Thief where it was 50% faster and Total War: Attila where it was 36% faster. Removing Thief's result sees the Fury X cards losing to the GTX 980 Tis overall by 1%. Now for the interesting part, typically we expect Nvidia to have the edge when looking at frame time (99th percentile) performance, but this wasn't the case here. The R9 Fury X Crossfire cards were on average 22% faster when comparing the 99th percentile data."
    http://www.techspot.com/review/1033-gtx-980-ti-sli...

    See, that's how facts actually work. Your point that AMD's drivers are shit is hilarious considering they are winning on price/performance across the $80-550 range, and Fury X CF loses to 980Ti SLI only when the 980Ti SLI is overclocked. That's pretty good for a claim that "their drivers are crap."

    "oh did i "mention they do all of this and still make WHQL Certification?"

    WHQL certification means nothing to me as an Intel/AMD/NV user. What do I care if the drivers are WHQL certified or not? In the past AMD had WHQL certified drivers and they were worse than what they've shipped in the last 4 years. So if you just want a badge of WHQL, knock yourself out.

    "Geforce Experience Settings"

    1. AMD has Raptr.
    2. I am old enough to know how to adjust settings on my own. If you need console-like hand-holding to tune the settings for you, that's your choice and there's nothing wrong with that, but to say that GeForce Experience means NV has better drivers is laughable. GeForce Experience has nothing to do with the stability or performance of actual games for users who own AMD/NV cards and adjust settings on their own.

    "DirectX12 is not used in any games so that's irrelavant.."

    Sure, it's 100% relevant because you are generalizing that all AMD drivers are garbage, and yet AMD is winning in 2/2 DX12 games in every price segment besides 980Ti vs. Fury X. So DX12 driver performance doesn't matter now? Ok, we'll revisit in 2016 and beyond.

    "Price, Of course AMD will win Performance Per Dollar, That's because nobody's buying them. Nvidia has 75% GPU market share for crying out loud."

    What a crazy argument. So you are a hardcore NV user and if AMD offers amazing price/performance it's because they make garbage products? I hope you are not even 18 years old as that's forgivable for someone with so little PC gaming experience. Just a reminder, NV's GeForce 3 Ti 200, GeForce 4 Ti 4200, GeForce 5900XT, GeForce 6800 non-U unlocked, 6800GT, 6600GT, 7800GT/7950GT, 8800GT, GTX460/470, 560Ti, and in recent era GTX970 offered amazing price/performance. Does that mean NV is desperate cuz no one was buying them? Try to make a coherent argument if you are going to make a point.

    Let's try this - single GPUs:

    Sept 2014
    780Ti $699 = beats 290X by 14% at 1080P, 9% at 2560x1600
    https://www.techpowerup.com/reviews/NVIDIA/GeForce...

    Sept 2015
    780Ti = beats 290X by only 4.5% at 1080P, 0% at 2560x1600
    https://www.techpowerup.com/reviews/Colorful/iGame...

    780Ti cost $699 while 290X cost $549. Your argument continues to fail while AMD's drivers for GCN are far superior to whatever NV has been able to produce for Kepler. This looks even worse considering 780Ti cost more and is no better than a cheaper 290X today.
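
    A quick back-of-envelope sketch of what those cited figures work out to in performance-per-dollar terms. This is just illustrative arithmetic assuming only the launch prices and percentage deltas quoted above, not data taken from the linked reviews:

    ```python
    # Perf-per-dollar from the figures above: 780 Ti ($699) leads the 290X ($549)
    # by 4.5% at 1080p and 0% at 2560x1600 in the Sept 2015 numbers.
    # The 290X is treated as the 100-point baseline at each resolution.
    PRICES = {"780 Ti": 699, "290X": 549}
    PERF = {
        "1080p": {"780 Ti": 104.5, "290X": 100.0},
        "2560x1600": {"780 Ti": 100.0, "290X": 100.0},
    }

    for resolution, scores in PERF.items():
        perf_per_dollar = {card: scores[card] / PRICES[card] for card in PRICES}
        lead = (perf_per_dollar["290X"] / perf_per_dollar["780 Ti"] - 1) * 100
        print(f"{resolution}: 290X delivers ~{lead:.0f}% more performance per dollar")
    ```

    Run as-is, that prints roughly a 22% per-dollar advantage for the 290X at 1080p and about 27% at 2560x1600.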

    Going back to your SLI vs. CF argument, it falls apart even more.

    290X CF beats 780Ti SLI and 970 SLI at 1440P:
    http://www.sweclockers.com/test/20216-nvidia-gefor...

    See, those are what I call facts. I am providing real data that disproves your point and all you do is keep repeating the same thing how AMD drivers are crap.

    "The whole windows 10 thing is blown out of proportion and i have yet to have 1 single problem. You wanna talk about bad windows 10 gpu driver support, start talking about Intel."

    Google - nvidia driver issues windows 10

    Way to ignore how NV also had many driver issues - destroyed GPUs in laptops due to broken fan profiles in NV's control panel, completely broken full RGB mode over HDMI for more than a decade, and insanely horrendous blurry LOD under SSAA mode all the way until October 2012 -- proof:
    http://www.computerbase.de/2012-10/nvidia-geforce-...

    AMD's VSR still has superior IQ to NV's DSR - tested scientifically in games:
    http://www.overclock.net/t/1529509/computerbase-de...

    If you are going to have an opinion as strong as "AMD's drivers are crap", you better have data to bring to the table or you will get called on it.

    But considering it seems you are emotionally invested in NV, it sounds to me like nothing anyone will post will change your mind to an objective state regardless. Just my opinion.
  • BurntMyBacon - Friday, October 16, 2015 - link

    Ouch. And I thought my bacon was burnt.

    I personally use ATi exclusively for HTPC applications. That said, I favor nVidia for multi-GPU builds. More importantly, I favor single-GPU builds over multi-GPU builds. It is common knowledge that nVidia generally has the edge over ATi in multi-GPU setups. Here are a few recent articles I spent 30 seconds finding, dating from current back to 2013:
    http://www.hardocp.com/article/2015/10/06/amd_rade...
    http://www.hardocp.com/article/2015/09/28/asus_str...
    http://techreport.com/review/21516/inside-the-seco...
    http://www.pcper.com/reviews/Graphics-Cards/Frame-...

    Feel free to look around. There are more to be had. However, as you stated and linked earlier, nVidia is not immune to issues either. I prefer to just stay out of such a volatile and update dependent setup.

    My personal experience with nVidia and ATi is that they have both had their quirks for single card setups, and not very many have been insurmountable for either team. ATi has generally offered more performance at the same cost, with a number of standout cards from nVidia as exceptions (I believe you mentioned some earlier). Until recently, nVidia has been much better about getting developer support, which gives them a bit of an advantage with day 1 drivers. ATi has been making up ground here recently.

    For the time being, I'll stick with ATi for HTPC, nVidia for multi-GPU, and the best deal for my single card needs. The situation could obviously change at any point and I'm waiting to see how the "Sync" wars play out.
  • Chaser - Thursday, October 15, 2015 - link

    If you prefer inefficient, power-sucking room heaters for video cards, and CPUs that are slower, then AMD is a great option.
  • RussianSensation - Friday, October 16, 2015 - link

    What do CPUs have to do with this discussion? I buy either NV/AMD depending on what's better at the time I upgrade.

    Sapphire Fury runs quieter at max load than a 980Ti does at idle:
    http://www.anandtech.com/show/9421/the-amd-radeon-...

    How loud and hot a videocard runs is largely a function of its cooling system. It's possible for a 250W TDP card to be quieter than a 180W TDP one. If you don't do your own research, then sure, you'll make stupid claims about how AMD's cards run hot and loud.

    As far as power efficiency goes, since HD4800 days, AMD cards have made me tens of thousands of dollars due to bitcoin mining. As a tech enthusiast I will take tens of thousands of US dollars over saving $3 a month in my electric bill. Thanks though.
  • The_Countess - Tuesday, October 27, 2015 - link

    ah yes, a 30 watt difference in game between a 980Ti and a Fury X makes one a space heater while the other isn't.

    that makes perfect sense obviously.
  • kurahk7 - Wednesday, October 14, 2015 - link

    That's funny. AMD released their SWB drivers over a week before NVidia did. http://www.anandtech.com/show/9667/amd-releases-ca...
  • Morawka - Thursday, October 15, 2015 - link

    That is BS. Read the article and the patch notes. AMD specifically said this driver introduces flickering in SWBF3 in a CrossFire X setup. That's the entirety of what was said about Star Wars BF.
  • D. Lister - Thursday, October 15, 2015 - link

    @kurahk7

    Err... wasn't 15.9 rather hurriedly rolled back by AMD because of a crippling memory leak?
  • silverblue - Thursday, October 15, 2015 - link

    ...with an update, 15.9.1, following a few days later.
