But, for me, the question would be why bother?
A 'pure' digital signal should produce better results since it has not undergone a 'conversion' process. Once it is analogue you can't convert it back to the original digital signal; that has 'gone'. All you can do is convert it to a digital 'equivalent' of the analogue signal.
So one would go from digital in the graphics card, through a conversion to analogue, then a conversion back to digital. Each conversion would probably lower the quality, so you would end up with a '3rd generation' digital signal, probably worse than the '2nd generation' analogue one!
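Just to illustrate what I mean by generation loss, here's a little Python sketch. It's purely illustrative: the noise figure and the 8-bit levels are made-up numbers standing in for cable/converter imperfections, not a model of any real DAC or ADC in a monitor.

```python
import random

def dac(samples, noise=0.02):
    """Toy 'DAC' stage: digital levels (0-255) become analogue
    'voltages' (0.0-1.0) with a little added noise, standing in
    for converter and cable imperfections."""
    random.seed(42)  # repeatable for the sketch
    return [s / 255 + random.uniform(-noise, noise) for s in samples]

def adc(voltages):
    """Toy 'ADC' stage: analogue 'voltages' re-quantised back to
    digital levels. The rounding here is where the original bits
    are lost for good."""
    return [min(255, max(0, round(v * 255))) for v in voltages]

original = [0, 17, 128, 200, 255]   # pretend pixel values
round_trip = adc(dac(original))     # digital -> analogue -> digital
errors = [abs(a - b) for a, b in zip(original, round_trip)]

print("original:  ", original)
print("round trip:", round_trip)
print("max error: ", max(errors))
```

The round trip hands you back *a* digital signal, but not the one you started with, which is the whole point: the extra conversion can only ever lose information, never recover it.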
My LG1710B has both digital and analogue inputs and my Radeon AIW 9000 has digital out. I have tried it both as 'pure' digital and through the converter. The digital is noticeably crisper and clearer, to my eye at least!