Dear ATI/AMD, Please fix your video drivers!

I just recently finished a whole lot of code whereby the canvas in paint.net 4.0 now renders via Direct2D. This has numerous advantages: layer composition is finally off the UI thread, it’s faster, CPU usage is lower, and I even added back those darned “dancing ants” that everyone seems to be so fond of from ye ol’ version 3.36. Not to mention the next set of changes … which I won’t mention yet. Ok, I will: I’m planning to move all rendering off the UI thread. I’m maybe 1/4th of the way there, since layer composition is off of it; now I have to get all of the tools to do the same. (Much of the rendering code in paint.net is multithreaded, but it still forks/joins/blocks on the UI thread, so the logic is still inherently single threaded.)
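For the curious, here’s the general shape of that refactoring, as a minimal sketch and nothing more. Paint.net is C#, this is C++, and every name in it is made up; the point is just the pattern: the UI thread hands off a dirty region and returns immediately, instead of forking worker threads and then joining on them before it can continue.

```cpp
// Minimal sketch (not paint.net's actual code): one way to get composition
// off the UI thread. The UI thread posts dirty regions and never blocks;
// a dedicated render thread does the compositing.
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

struct DirtyRect { int left, top, right, bottom; };

class RenderThread {
public:
    RenderThread() : worker_(&RenderThread::Run, this) {}

    ~RenderThread() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            quit_ = true;  // any still-pending work is dropped; fine for a sketch
        }
        wake_.notify_one();
        worker_.join();
    }

    // Called from the UI thread. Never blocks on rendering -- that's the
    // whole point; the old fork/join model would wait here instead.
    void Invalidate(DirtyRect rect) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            pending_.push(rect);
        }
        wake_.notify_one();
    }

private:
    void Run() {
        for (;;) {
            DirtyRect rect;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                wake_.wait(lock, [this] { return quit_ || !pending_.empty(); });
                if (quit_) return;
                rect = pending_.front();
                pending_.pop();
            }
            CompositeLayers(rect);  // all drawing happens off the UI thread
        }
    }

    void CompositeLayers(DirtyRect) {
        // Blend the layer stack for this region and present it.
        // (In the real app, this is where the Direct2D drawing would go.)
    }

    std::mutex mutex_;
    std::condition_variable wake_;
    std::queue<DirtyRect> pending_;
    bool quit_ = false;
    std::thread worker_;  // declared last: started only after the members above exist
};

int main() {
    RenderThread renderer;
    renderer.Invalidate({0, 0, 800, 600});  // the UI thread returns here immediately
}
```

The fork/join model I’m moving away from would sit inside Invalidate until compositing finished; this one never does, which is what finally lets the UI stay responsive while rendering grinds away.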

Anyway.

It’s neat when ATI and NVIDIA keep one-upping each other by adding fancy antialiasing and other gizmo settings to their control panels. It’s fun to download a new game and then shoehorn in whatever antialiasing and anisotropic filtering settings you want to use with it. Booting up the original Deus Ex and forcing it to use 131,072x antialiasing is a treat (for about 5 minutes, anyway). However, it would be smart of them not to apply those settings to regular desktop applications.

Case in point: turn on “morphological filtering” for an ATI Radeon 5850, with the absolute latest drivers downloaded just yesterday from their website, and here’s what you get:

[screenshot: the paint.net 4.0 canvas, with its text reduced to animated mush by morphological filtering]

I wish I had written a filter that made the text look that way. It’s supposed to be normal Calibri, but this “morphological filtering” makes the canvas look like an oil painting that’s being rained on. It’s fully animated and very trippy (hmm, I have some friends who may be interested in this). This also happens with other Direct2D-based applications, such as Windows Live Messenger 2011, so I’m quite sure it’s not some bit of wonky code I checked into paint.net 4.0.

So, here’s my plea to ATI: please fix your drivers so that this doesn’t happen with Direct2D-based desktop applications. I don’t want to add code that detects an ATI video card, snoops the registry (or wherever it is you store configuration), checks for this morpho-whatever thing, and then forces software rendering, which would look something like the sketch below. From what I can tell, all this setting really does is make games blurry and slow, and it’s worse than that “Quincunx” stuff that NVIDIA had on the GeForce 3/4.
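If I did have to write it, the last step at least is straightforward, because Direct2D lets you ask for a software render target. What follows is a rough sketch of that fallback, not actual paint.net code: the vendor check via EnumDisplayDevices is a crude heuristic of my own, and I’m deliberately not guessing at where the morphological filtering setting lives in the registry, since that part is undocumented.

```cpp
// Sketch only, not paint.net code: the workaround I'd rather not ship.
// Detect an ATI/AMD adapter with a crude string match, then ask Direct2D
// for a software render target instead of a hardware one. The registry
// snooping for the "morphological filtering" setting itself is omitted,
// because wherever that lives is undocumented.
#include <windows.h>
#include <d2d1.h>
#include <d2d1helper.h>
#include <string>
#pragma comment(lib, "d2d1")

// Crude heuristic of my own, not an official API: check the primary display
// adapter's description string for the vendor name.
bool IsAtiAdapter() {
    DISPLAY_DEVICEW dd = { sizeof(dd) };
    for (DWORD i = 0; EnumDisplayDevicesW(nullptr, i, &dd, 0); ++i) {
        if (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE) {
            std::wstring desc(dd.DeviceString);
            return desc.find(L"ATI") != std::wstring::npos ||
                   desc.find(L"AMD") != std::wstring::npos;
        }
    }
    return false;
}

HRESULT CreateCanvasRenderTarget(ID2D1Factory* factory, HWND hwnd,
                                 ID2D1HwndRenderTarget** rt) {
    // D2D1_RENDER_TARGET_TYPE_SOFTWARE makes Direct2D rasterize on the CPU,
    // which should take the driver's fancy filtering out of the picture.
    D2D1_RENDER_TARGET_TYPE type = IsAtiAdapter()
        ? D2D1_RENDER_TARGET_TYPE_SOFTWARE
        : D2D1_RENDER_TARGET_TYPE_DEFAULT;

    return factory->CreateHwndRenderTarget(
        D2D1::RenderTargetProperties(type),
        D2D1::HwndRenderTargetProperties(hwnd),
        rt);
}
```

And even that is ugly: string-matching the adapter name is fragile, and it punishes every Radeon owner with slow software rendering because of one checkbox. Hence the plea.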

And NVIDIA, I’m keeping my eye on you too …

Anyway, this Radeon has got to go. My workstation had a GeForce GTX 260 Core 216 in it, and I’ll be reverting to it. I upgraded my gaming PC from the Radeon 5850 to a GeForce GTX 570, and figured: why not put the Radeon in the workstation? (Which has a Core i7-980x at 4.0GHz, btw. Mwahaha.) Benchmarks show the Radeon is quite a bit faster than the GTX 260, but unfortunately this card is just glitchy, for reasons beyond the LSD setting shown above. I don’t know if it’s because I’m running dual monitors in a weird configuration (one landscape, one portrait), or if I’m just unlucky. It never had any problems playing games. Oh well.