Although I've never used a monitor of that size, I have compared DVI and VGA and never seen a noticeable difference.
I'm more inclined to think the problem lies with a setting on his computer. Has he definitely got the resolution set to the monitor's native resolution? If not, there's a massive drop in image quality, even when DVI is used.
The refresh rate can affect image quality too. I discovered this when my friend bought a Viewsonic monitor at Christmas: at 60 Hz the text was a bit fuzzy, and increasing it to 72 or 75 Hz (I can't remember which now) made a big difference.
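If he wants a quick way to check both of those at once, here's a rough Python sketch using the Win32 EnumDisplaySettings call through ctypes. It's Windows-only, and the 1680 x 1050 "native" figure is just my assumption about his panel, so swap in the right numbers:

```python
import ctypes
from ctypes import wintypes

# DEVMODEW from wingdi.h (220 bytes), built compactly; the field sizes
# and order are what matter here, not the printer-specific names.
class DEVMODEW(ctypes.Structure):
    _fields_ = (
        [("dmDeviceName", wintypes.WCHAR * 32)]
        + [(n, wintypes.WORD) for n in
           ("dmSpecVersion", "dmDriverVersion", "dmSize", "dmDriverExtra")]
        + [("dmFields", wintypes.DWORD)]
        + [(n, ctypes.c_short) for n in
           ("dmOrientation", "dmPaperSize", "dmPaperLength", "dmPaperWidth",
            "dmScale", "dmCopies", "dmDefaultSource", "dmPrintQuality",
            "dmColor", "dmDuplex", "dmYResolution", "dmTTOption", "dmCollate")]
        + [("dmFormName", wintypes.WCHAR * 32), ("dmLogPixels", wintypes.WORD)]
        + [(n, wintypes.DWORD) for n in
           ("dmBitsPerPel", "dmPelsWidth", "dmPelsHeight", "dmDisplayFlags",
            "dmDisplayFrequency", "dmICMMethod", "dmICMIntent", "dmMediaType",
            "dmDitherType", "dmReserved1", "dmReserved2",
            "dmPanningWidth", "dmPanningHeight")]
    )

ENUM_CURRENT_SETTINGS = -1   # wingdi.h constant: the mode in use right now
NATIVE = (1680, 1050)        # assumption: this panel's native resolution

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                          ctypes.byref(dm))

print(f"Running at {dm.dmPelsWidth}x{dm.dmPelsHeight} "
      f"@ {dm.dmDisplayFrequency} Hz")
if (dm.dmPelsWidth, dm.dmPelsHeight) != NATIVE:
    print("Not at native resolution - the LCD has to scale, so it'll look soft.")
```

If that reports something like 1280 x 1024 @ 60 Hz, the fix is in the display settings, not the cable.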
If you really want to confirm this, connect his monitor to your computer with a VGA cable and see what the result is (you might need a DVI-to-VGA converter, which is normally supplied with DVI-only graphics cards). I'll be very surprised if simply switching from VGA to DVI makes any significant difference.
There's quite a noticeable difference on mine between VGA and DVI at the native resolution; I was very pleasantly surprised. The display is pin-sharp, to the extent that some of the icons appear almost three-dimensional.
I understand his PC won't go as high as 1680 x 1050 on the onboard graphics (worth verifying with something like the sketch below), and if he's going to have to buy a card, it might as well have a DVI connector.
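Before he spends anything, one way to confirm the onboard graphics really doesn't offer 1680 x 1050 is to walk the mode list the driver reports. Another rough, Windows-only sketch; the offsets come from the DEVMODEW layout in wingdi.h rather than declaring the whole structure again:

```python
import ctypes
import struct

DM_SIZE = 220        # sizeof(DEVMODEW) per wingdi.h
OFF_DMSIZE = 68      # dmSize (WORD)
OFF_WIDTH = 172      # dmPelsWidth (DWORD); dmPelsHeight follows at 176
OFF_HZ = 184         # dmDisplayFrequency (DWORD)

modes = set()
i = 0
while True:
    buf = ctypes.create_string_buffer(DM_SIZE)
    struct.pack_into("<H", buf, OFF_DMSIZE, DM_SIZE)  # tell the API our size
    # iModeNum = 0, 1, 2, ... walks every supported mode; returns 0 when done.
    if not ctypes.windll.user32.EnumDisplaySettingsW(None, i, buf):
        break
    w, h = struct.unpack_from("<II", buf, OFF_WIDTH)
    (hz,) = struct.unpack_from("<I", buf, OFF_HZ)
    modes.add((w, h, hz))
    i += 1

matches = sorted(m for m in modes if m[:2] == (1680, 1050))
print(matches or "No 1680x1050 mode offered - a new card it is, then.")
```

If nothing at 1680 x 1050 comes back, that settles it.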
I think I'll point him in the direction of the 6200 card Gongoozler linked to; it's not much dearer than the older FX5200 I was considering.