How to fix: Paint.NET "breaks" with Vista SP2 Beta

I’ve had some reports that installing the Windows Vista SP2 beta (or “CPP”) breaks Paint.NET v3.36.

You’ll get an error message claiming that the .NET Framework 3.5 SP1 is required.

Contrary to the error, Paint.NET v3.36 does not require .NET Framework 3.5 SP1.

There are two ways to fix this:

1. Install .NET Framework 3.5 SP1. I recommend doing this anyway, because it has numerous fixes and performance improvements that make Paint.NET happy.

2. Go to the directory where you installed Paint.NET, and remove all the files with the “.exe.config” extension. This will un-confuse the .NET loader.
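
For example, if you installed to the default location, this does it from a Command Prompt (the path below is an assumption; adjust it to wherever you installed Paint.NET):

cd /d "C:\Program Files\Paint.NET"
del *.exe.config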

This seems to be something related to the .NET Client Profile, although I’m not sure what the root cause is. I’ll be reporting this bug to the right people, so that it can be fixed.


Installing .NET 3.5 SP1: Please wait … Forever!

The very cool thing about Paint.NET v3.5 is that it installs quite fast on a fresh Windows XP SP2 machine. And that includes the installation of prerequisites like Windows Installer 3.1 and the Client Profile version of the .NET Framework 3.5 SP1. Even on my new little Atom 330 box* it is kind of pleasantly fast. I’d even say it’s fun. (The unfortunate thing is that Paint.NET v3.5 is not yet out of “pre-alpha” …)

Intel BOXD945GCLF2 Atom 330 Intel 945GC Mini ITX Motherboard/CPU Combo**


Once you jump over to Windows Vista, the story becomes very very very very dire. It took a full hour to install .NET 3.5 SP1. The hard drive was thrashing and yelling the entire time, and CPU usage was quite high. In the middle of this, a Windows Update dialog popped up in the corner telling me I needed to restart. That sounds like a bad idea since I’m still in the middle of installing a new system component! This paints a very bleak picture for getting .NET 3.5 SP1 and Paint.NET v3.5 successfully deployed to the large userbase that I have currently sitting on .NET 2.0 and Paint.NET v3.36. I’m afraid that most users will see the .NET installer “hanging” at 40% and just restart their computer, or cancel it, or kill it using Task Manager. How fun will it be for users to click on “Install Update” only to have to wait an hour before they can use their computer again, let alone Paint.NET?

I honestly don’t think it’s worth 1 hour to install a 2 MB program. Even Adobe Photoshop and Mathematica 7.0 install in minutes, and they are hundreds of megabytes.

This isn’t a random or one-off occurrence. Almost every time I’ve installed .NET 3.5 SP1 onto any system, whether it’s mine or someone else’s, the same thing happens. It doesn’t matter if it’s an Atom or a brand new 3.0GHz Core 2 Duo; it still takes one full hour. Sometimes you can actually get the installation to complete quickly if you first make sure that Windows Update is completely caught up. Even then, you can never be completely sure. Any system that isn’t used 8+ hours/day by a computer-industry professional like myself is likely to be at least 1 update behind. (I’ll bet a Core i7 965 could do it in 45 minutes though :))

This is very frustrating, to say the least. On the positive side I know some of the people who work on this stuff, and they’re all great people who want things to be awesome. You can be sure I’ll be e-mailing them soon 🙂 And with any luck, the “GDR” update that’s coming (soon?) will have already fixed this. Cross your fingers.

Performance of the Atom 330 is actually surprisingly good. The results of 32-bit PdnBench are almost exactly the same as a Pentium 4 3.0 GHz “E” Prescott chip — about 180 seconds for completion — which is impressive to say the least. Back in the day (2004) that P4 chip consumed so much power that some reviewers melted their motherboards, whereas this Atom barely even needs a heatsink. In 64-bit mode, the Atom 330 pulls ahead to 155 seconds. Those results use 2 threads on the P4 (single core w/ HyperThreading), and 4 on the Atom (dual core w/ HyperThreading).

* Actually it’s not really a box. It’s small, and not inside of a case. Maybe “kit” would be a better term?

** Yes, I’m testing out some newegg.com affiliate stuff. If you’re interested in the Atom 330 board listed above, then please click through the product link above. Just like Amazon affiliate links, if you buy it via that link then I get a tiny amount of the purchase price. It doesn’t cost you anything extra. It’s another way to support Paint.NET 🙂

Goodbye Pentium 4, Hello Atom

Sadly, I fried my Pentium 4 test system a few days ago, which had proven invaluable in my performance testing of Paint.NET v3.5. I went to turn it on* and the screwdriver missed by a few millimeters, shorted the wrong pins, and … bzzzt. No more P4.

* Since this system was “bread boxed,” meaning that it wasn’t inside of a case or anything, turning it on involved shorting the two pins that the power button is normally wired directly to.

Fortunately I have one of these on the way from newegg. Along with twenty dollars’ worth of RAM (2 GB), I will soon have a new performance test bed.

It’s a motherboard with a soldered-on Intel Atom 330 CPU for $80. It’s dual-core, supports 64-bit, and has HyperThreading. And it runs in a small 8W power envelope (well, the CPU itself anyway).

Think about it: for $80 you can get started with a system that supports 4 hardware threads! I will probably disable the second core and HyperThreading, as my primary purpose is low-end, single-core performance testing. It will be interesting to see how the Atom scales with HyperThreading and the second core turned on.

My main complaint is that this motherboard only has VGA output: DVI is not an option. For what I’m using it for, this won’t matter, but it certainly prevents me from recommending it to others, especially for HTPC / Media Center systems.

Maybe in a few months I’ll be able to purchase a Dual Xeon based on the Nehalem/Core i7 architecture. 2 chips, 8 cores, 16 threads … we’ll pit it against the Atom and see who wins 😉

Paint.NET v3.5: "Improved rendering quality when zoomed in"

Brad Wolff recently wrote a comment on my earlier post, “Change of plans – here comes Paint.NET v3.5”:

“Rick – You mentioned that 3.5 will have ‘Improved rendering quality when zoomed in’. Can you elaborate on this? My fear is that we will end up having to look at the blurred mess that Windows Picture Viewer displays when zoomed in. Please tell me I am wrong!” — Brad Wolff

Brad, you’re wrong 🙂 And in a good way. Paint.NET v3.5 does not use bilinear or bicubic resampling when zooming in, which is the cause of the blurred mess you mention in Windows Picture Viewer. In fact, it now uses the same resampling algorithm for zooming in that has been employed for zooming out: rotated grid supersampling. The old resampling method was simple nearest neighbor. It was very fast, especially when paired with a lookup table for avoiding a per-pixel division operation. The image quality problem with nearest neighbor is very apparent between 101% and 199% zoom levels: you end up with a moiré of 1-pixel-wide and 2-pixel-wide samples and it just looks awful. With supersampling, we are able to achieve a smoothed look that does not blur as you zoom in.
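
To make that concrete, here is a rough sketch of a 4-tap rotated grid supersampler for zooming in. This is my simplified illustration of the technique, not Paint.NET’s actual rendering code; the method name, the BGRA packing, and the exact sample offsets are all assumptions:

public static uint SampleRgss(uint[] src, int srcWidth, int srcHeight, int dstX, int dstY, double scale)
{
    // Classic rotated-grid subsample positions, as fractions of a pixel.
    double[] offsetsX = { 0.375, 0.875, 0.125, 0.625 };
    double[] offsetsY = { 0.125, 0.375, 0.625, 0.875 };

    int a = 0, r = 0, g = 0, b = 0;

    for (int i = 0; i < 4; ++i)
    {
        // Map a subsample position inside the destination pixel back into
        // source space, then snap to the nearest source pixel.
        int srcX = Math.Min((int)((dstX + offsetsX[i]) / scale), srcWidth - 1);
        int srcY = Math.Min((int)((dstY + offsetsY[i]) / scale), srcHeight - 1);

        uint bgra = src[(srcY * srcWidth) + srcX];

        a += (int)(bgra >> 24);
        r += (int)((bgra >> 16) & 0xff);
        g += (int)((bgra >> 8) & 0xff);
        b += (int)(bgra & 0xff);
    }

    // Average the four taps. Adjacent destination pixels blend smoothly
    // across source pixel edges instead of snapping to a single sample.
    return ((uint)(a / 4) << 24) | ((uint)(r / 4) << 16) | ((uint)(g / 4) << 8) | (uint)(b / 4);
}

At exactly 100%, all four taps land on the same source pixel, so the image stays crisp; between 100% and 200%, the taps straddle source pixel boundaries, which produces the smoothed seams described above rather than the blur of bilinear filtering.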

Here’s an example from Paint.NET v3.36, where I’ve drawn a normal circle and some scribbles with the paintbrush tool. The zoom level was then set to 120%:

Here’s the same, but in Paint.NET v3.5:

At this zoom level, each pixel from the image should be drawn as “1.2” pixels on-screen. In v3.36, this entails drawing 4 pixels at 1-pixel width, and then a fifth pixel at 2-pixel width. Put another way, every 5th pixel is doubled in size. In v3.5, each source pixel ends up with a uniform width and the overlaps are smoothed together in a much more pleasing manner. (This is done on the y-axis as well: replace ‘width’ with ‘height’ above and it’s also true.) It will still maintain a “pixelated” appearance as you continue zooming in, which is what you want, but the edges between samples will look smoother.
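
You can see the v3.36 pattern for yourself with a few lines that mimic nearest-neighbor index mapping at 120% (a toy illustration, not actual Paint.NET code):

const double scale = 1.2;

for (int dstX = 0; dstX < 12; ++dstX)
{
    // Nearest neighbor: each destination pixel snaps to exactly one source pixel.
    int srcX = (int)(dstX / scale);
    Console.WriteLine("dst {0} -> src {1}", dstX, srcX);
}

// Source pixels 0 and 5 each cover two destination pixels, while pixels 1
// through 4 cover one each: every 5th source pixel is drawn at double width.

The v3.5 supersampler smooths these doubled seams instead of leaving them as hard edges.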

This does come at a performance cost, but I believe it’s worth it. It also scales well with multiple cores, so it’s something that will be faster with each new CPU upgrade. I’ve also experimented with using bilinear and bicubic resampling — it’s fun, but too expensive and blurry. You would need an 8-core system for it to be comfortable.

A fluent approach to C# parameter validation

Fluent programming gets a bad reputation, since some developers like to write code like the following:

var time = 7.Days().Plus(4.Hours());

Barf. However, when used properly, I think it’s very powerful. Let’s look at a typical method with some parameter validation:

// Copy src[srcOffset, srcOffset + length) into dst[dstOffset, dstOffset + length)
public static void Copy<T>(T[] dst, long dstOffset, T[] src, long srcOffset, long length)
{
    if (dst == null)
        throw new ArgumentNullException("dst");

    if (src == null)
        throw new ArgumentNullException("src");

    if (dstOffset + length > dst.Length || dstOffset < 0)
        throw new ArgumentOutOfRangeException(
            "dst, dstOffset, length",
            string.Format("dst range is not within bounds, ({0} + {1}) > {2}", dstOffset, length, dst.Length));

    if (srcOffset + length > src.Length || srcOffset < 0)
        throw new ArgumentOutOfRangeException(
            "src, srcOffset, length",
            string.Format("src range is not within bounds, ({0} + {1}) > {2}", srcOffset, length, src.Length));

    if (length < 0)
        throw new ArgumentOutOfRangeException("length", "length must be >= 0, " + length.ToString());

    for (long di = dstOffset; di < dstOffset + length; ++di)
        dst[di] = src[di - dstOffset + srcOffset];
}

That’s a lot of code for parameter validation, but in a robust system it is necessary. For debugging purposes, having all the information in there with the actual parameter values is invaluable, so that you can get a stack trace that tells you, “Length was too big. It was 50, but the max was 49.”

The problem here is twofold. One, code like this gets sprinkled all over the codebase of a large application, so it becomes repetitive, tiresome, and a bug hazard. Having an off-by-1 error is many times worse if it’s in your validation code. Or, because it’s tiresome, sometimes there just won’t be any validation at all.

The second problem is actually much more subtle. Ask yourself this: if both src and dst are null, what exception does the caller get? (and subsequently, what goes into the crash log or Watson upload?) It will only tell you that dst is null. This leads to more iterations in debugging than is optimal, where you fix the problem of dst equaling null only to immediately get it crashing on you again when src is null. If the exception told you about both errors, you could have saved a lot of time.

This happens more often than I’d like when debugging issues on other people’s systems, especially ones I don’t have any direct access to (physical or remote, a la Remote Desktop). The end-user will post a Paint.NET crash log to the forum, I’ll fix it and send them a patch or new build, and then the same method will crash on the very next line of code. This is especially relevant to graphics methods that take parameters such as width, height, location, bounding box, etc. The X value may be bad, but the Y value might also be bad. I need to know about both, along with the valid ranges (and not just “out of range”).

There are times where I have fixed issues with no direct interaction with a user: if I get a bunch of crash logs for a certain issue, but I can’t reproduce it, I have often been able to fix it by incorporating a hopeful and conservative fix into the next release and then monitoring to make sure that no more crash logs come in. And yes, I’ve done that many times with Paint.NET.

Reporting an aggregated judgement like this is just not fun. To go the extra mile you need to create a StringBuilder, decide on the presiding exception type, manage concatenation of multiple parameter names (“sentence-ization”), etc. Like this …

public static void Copy<T>(T[] dst, long dstOffset, T[] src, long srcOffset, long length)
{
    StringBuilder sb = new StringBuilder();

    if (dst == null)
        sb.Append("dst. ");

    if (src == null)
        sb.Append("src. ");

    if (sb.Length > 0)
        throw new ArgumentNullException(sb.ToString());

    if (dstOffset + length > dst.Length || dstOffset < 0)
        …

    if (srcOffset + length > src.Length || srcOffset < 0)
        …

    if (length < 0)
        …

    if (sb.Length > 0)
        throw new ArgumentOutOfRangeException(sb.ToString());

    …
}

Boo. This is still tiresome, and creates extra objects, etc. Because of the extra work involved, this tends to be done reactively instead of proactively. Only the “hot” methods get the comprehensive logic.

I’ve come up with another method. Check this out:

public static void Copy<T>(T[] dst, long dstOffset, T[] src, long srcOffset, long length)
{
    Validate.Begin()
            .IsNotNull(dst, "dst")
            .IsNotNull(src, "src")
            .Check()
            .IsPositive(length, "length")
            .IsIndexInRange(dst, dstOffset, "dstOffset")
            .IsIndexInRange(dst, dstOffset + length, "dstOffset + length")
            .IsIndexInRange(src, srcOffset, "srcOffset")
            .IsIndexInRange(src, srcOffset + length, "srcOffset + length")
            .Check();

    for (long di = dstOffset; di < dstOffset + length; ++di)
        dst[di] = src[di - dstOffset + srcOffset];
}

Yow! Ok that’s much easier to read. And here’s the kicker: if no problems are found with your parameters, then no extra objects are allocated. The cost for this pattern is only in the extra method calls.

There are three classes involved here: Validate, Validation, and ValidationExtensions. Here’s the Validate class:

public static class Validate
{
    public static Validation Begin()
    {
        return null;
    }
}

That was easy. This allows us to not allocate a “Validation” object, and its enclosed fields, until we actually encounter a problem. The presiding philosophy in code that uses exception handling is to optimize for the non-exceptional code path, and that’s exactly what we’re doing here. Here’s the Validation class:

public sealed class Validation
{
    private List<Exception> exceptions;

    public IEnumerable<Exception> Exceptions
    {
        get { return this.exceptions; }
    }

    public Validation AddException(Exception ex)
    {
        lock (this.exceptions)
        {
            this.exceptions.Add(ex);
        }

        return this;
    }

    public Validation()
    {
        this.exceptions = new List<Exception>(1); // optimize for only having 1 exception
    }
}

It’s basically just a list of exceptions. AddException() returns ‘this’ to make some of the code in the ValidationExtensions class easier to write. Check it out:

public static class ValidationExtensions
{
    public static Validation IsNotNull<T>(this Validation validation, T theObject, string paramName)
        where T : class
    {
        if (theObject == null)
            return (validation ?? new Validation()).AddException(new ArgumentNullException(paramName));
        else
            return validation;
    }

    public static Validation IsPositive(this Validation validation, long value, string paramName)
    {
        if (value < 0)
            return (validation ?? new Validation()).AddException(new ArgumentOutOfRangeException(paramName, "must be >= 0, but was " + value.ToString()));
        else
            return validation;
    }

    …

    public static Validation Check(this Validation validation)
    {
        if (validation == null)
            return validation;
        else
        {
            string message = "Parameter validation failed";

            if (validation.Exceptions.Take(2).Count() == 1)
                throw new ValidationException(message, validation.Exceptions.First()); // ValidationException is just a standard Exception-derived class with the usual four constructors
            else
                throw new ValidationException(message, new MultiException(validation.Exceptions)); // implementation shown below
        }
    }
}

Together, these three classes let us write validation code in a very clean and readable format. This reduces friction for having proper validation in more (or all? :)) methods, and reduces the bug hazard of either incorrect or omitted validation code.

Missing from this implementation, and other kinks to work out:

  • Could use lots of additional methods within ValidationExtensions. (some were omitted for brevity in this blog post)
  • Calling ValidationExtensions.Check() is itself not validated. So, if you forget to put a call to it at the end of your validation expression then the exception will not be thrown. Often you’ll end up plowing into a null reference and getting a NullReferenceException, especially if you were relying on ValidationExtensions.IsNotNull(), but this isn’t guaranteed for the other validations (esp. when dealing with unmanaged data types). It would be simple to add code to Validation to ensure that its list of exceptions was “observed”, and if not then in the finalizer it could yell and scream. (A sketch of this idea appears after this list.)
  • The exception type coming out of any method that uses this will be ValidationException. This isn’t an issue for crash logs, but it is for when you call a method and want to discriminate among multiple exception types and decide what to do next (e.g., FileNotFoundException vs. UnauthorizedAccessException). I’m sure there’s a way to fix that, with better aggregation, and (hopefully) without reflection.
  • Should probably change the IEnumerable<Exception> in Validation to be Exception[].
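
Here’s a quick sketch of the “observed” guard idea from the second bullet. This is my illustration, not code from the classes above; and since throwing from a finalizer would tear the process down, this version yells via Debug.Fail instead of an exception:

public sealed class GuardedValidation
{
    private readonly List<Exception> exceptions = new List<Exception>(1);
    private bool observed;

    public GuardedValidation AddException(Exception ex)
    {
        lock (this.exceptions)
        {
            this.exceptions.Add(ex);
        }

        return this;
    }

    // Check() (or anything else that inspects the exceptions) calls this,
    // which also tells the GC there is no longer any need to finalize us.
    public void MarkObserved()
    {
        this.observed = true;
        GC.SuppressFinalize(this);
    }

    ~GuardedValidation()
    {
        // Exceptions were collected but never looked at; the caller almost
        // certainly forgot the trailing .Check().
        if (!this.observed && this.exceptions.Count > 0)
        {
            System.Diagnostics.Debug.Fail(
                "Validation exceptions were never observed. Did you forget to call Check()?");
        }
    }
}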

Here’s the implementation of MultiException, as promised in the code above. And, in fact, it’s incomplete because it does not print all of the exceptions in a ToString() type of call. Umm … how about I leave that as an exercise for the reader? 🙂

[Serializable]
public sealed class MultiException
    : Exception
{
    private Exception[] innerExceptions;

    public IEnumerable<Exception> InnerExceptions
    {
        get
        {
            if (this.innerExceptions != null)
            {
                for (int i = 0; i < this.innerExceptions.Length; ++i)
                {
                    yield return this.innerExceptions[i];
                }
            }
        }
    }

    public MultiException()
        : base()
    {
    }

    public MultiException(string message)
        : base(message)
    {
    }

    public MultiException(string message, Exception innerException)
        : base(message, innerException)
    {
        this.innerExceptions = new Exception[1] { innerException };
    }

    public MultiException(IEnumerable<Exception> innerExceptions)
        : this(null, innerExceptions)
    {
    }

    public MultiException(Exception[] innerExceptions)
        : this(null, (IEnumerable<Exception>)innerExceptions)
    {
    }

    public MultiException(string message, Exception[] innerExceptions)
        : this(message, (IEnumerable<Exception>)innerExceptions)
    {
    }

    public MultiException(string message, IEnumerable<Exception> innerExceptions)
        : base(message, innerExceptions.FirstOrDefault())
    {
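        // AnyNull() is assumed to be a small extension method (not shown in
        // this post) that returns true if any element of the sequence is null.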
        if (innerExceptions.AnyNull())
        {
            throw new ArgumentNullException();
        }

        this.innerExceptions = innerExceptions.ToArray();
    }

    private MultiException(SerializationInfo info, StreamingContext context)
        : base(info, context)
    {
    }
}

What if XP SP3 were the minimum OS?

Currently, the minimum version of Windows that Paint.NET will run on is XP SP2. Unfortunately, it’s starting to show its age, and it’s a big hassle for the installer. The issue is that a “fresh” installation of XP SP2 does not have Windows Installer 3.1, whereas XP SP3 does. I have all sorts of custom code to detect this, and special packaging rules for creating my ZIP files and self-extractors. It adds about 2 MB to the Paint.NET v3.5 download, although it greatly improves the user experience and reduces friction for getting our favorite freeware installed. I was hoping to get the .NET 3.5 Client Profile installer to auto-download Windows Installer 3.1, but unfortunately it has a hard block on this before it even starts to parse the Products.XML file, which contains the installation manifest and logic.
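
For the curious, one common way to do that kind of detection is to check the version of msi.dll in the system directory, which tracks the installed Windows Installer release. Here’s a simplified sketch of that general technique (an illustration, not my actual installer code):

public static bool IsWindowsInstaller31Present()
{
    // The version of %WINDIR%\System32\msi.dll corresponds to the installed
    // Windows Installer release; Windows Installer 3.1 ships msi.dll 3.1.x.
    string msiPath = Path.Combine(Environment.SystemDirectory, "msi.dll");

    if (!File.Exists(msiPath))
        return false;

    FileVersionInfo info = FileVersionInfo.GetVersionInfo(msiPath);

    return (info.FileMajorPart > 3) ||
           (info.FileMajorPart == 3 && info.FileMinorPart >= 1);
}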

If I were to set the minimum system requirement to be XP SP3, then it would greatly simplify things!

There’s no charge to upgrade from XP SP2 to XP SP3. So, why isn’t everyone using it yet? I have a thread over on the forum where I’m asking any XP SP2 users to reply and tell me why they haven’t upgraded to XP SP3 yet. So far the reasons are: dial-up, too busy, and “didn’t see a reason to.” (actually that last one came to me via a private message, so you won’t see it on the forum)

I’d like to extend the discussion to this blog: if you haven’t upgraded from XP SP2 to XP SP3, please post a comment and let me know why. I’m not trying to make judgements here, so please don’t be shy — I’m simply on a fact-finding mission. The sooner I can bump up the minimum requirement to XP SP3, the better things will be: the download size will go down, I can spend more time on other engineering tasks and less time testing, and I can drink more beer. All of these make someone happier.

This also brings to light the issue of prerequisite management on Windows, especially for freeware apps. First, why isn’t it easier to deal with prerequisite OS components? Second, in the eyes of a typical user, what leverage or authority does a 1.5 MB freeware app (Paint.NET) have in dictating what service pack level you should have installed? If Photoshop were to require SP3, you can bet that a user who just paid $650 would install the service pack so they could get their money’s worth! And it probably isn’t a good idea (or feasible!) for Paint.NET to auto-download and install an entire service pack. Which means that the user experience involves the trusty message box that says, “You don’t have ___insert stupid computer nerd babble here___. Click Yes to do something even more confusing, or No to go back to what you were doing before.”