"No signal" from DVI on LCD screen

  kainos 11:06 30 Aug 2005
Locked

I have just bought an LCD monitor with DVI. It works fine with the analogue cable but when I use the DVI cable instead the screen gives the message "no signal".
Am I missing something obvious, or is it likely a fault with the monitor or with my graphics card?

  Jackcoms 13:28 30 Aug 2005

"It works fine with the analogue cable but when I use the DVI cable instead the screen gives the message "no signal"".

Check the DVI cable.

  wee eddie 13:57 30 Aug 2005

This will depend on the graphics card you have installed. If necessary, look up the specification on the manufacturer's site.

  kainos 16:16 30 Aug 2005

Thank you wee eddie. I have checked the ATI website for the specification of my graphics card, a Radeon 8500 LE, and there is no mention of DVI.
However, the card has a socket which takes the DVI cable. Perhaps that socket is actually the TV Out, which the card does support.
If that is the case then I think this is where I have made my mistake.
Could someone tell me what the DVI-type socket on the graphics card is for, please?

  wee eddie 16:24 30 Aug 2005

This may help you

click here

  kainos 16:40 30 Aug 2005

Thanks wee eddie, that's very conclusive; I guess it answers the question.
However, I have to admit that I don't understand one piece of that review:
"ATi curtails manufacturing time and cost by removing support for DVI; the DVI output is missing and the DVI transmitter is absent between the fan and upside-down Rage Theater chipset. Although the Radeon 8500 LE lacks Digital Flat Panel monitor support, it still retains Hydravision, which is ATi's patented dual-display support using two 400 MHz RAMDACs."

What is Hydravision, and what does 'dual-display support' mean?
Should I post that question as a new thread?

Thanks for your help.

  wee eddie 17:06 30 Aug 2005

2 screens.

The Hydra, in mythology, had many heads.

Note the date of the review, "8th June 2002"; many of the advances will have been superseded or be in the dustbin by now!
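
Dual-display support just means the card can drive two screens at once, one per RAMDAC. For anyone curious, here is a minimal sketch of how you might count the screens Windows has attached to the desktop, assuming Python with ctypes on a Windows machine (nothing in it comes from this thread or from ATI's software):

    import ctypes

    SM_CMONITORS = 80  # Win32 constant: number of display monitors on the desktop

    count = ctypes.windll.user32.GetSystemMetrics(SM_CMONITORS)
    print(f"Monitors attached to the desktop: {count}")

On a Hydravision-style dual-head setup this would print 2.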

  Eric10 18:02 30 Aug 2005

You haven't said what make and model of monitor you have. I have two dual-input Dell monitors and they have a switch to change between VGA and DVI. Perhaps yours is the same.

  DieSse 18:35 30 Aug 2005

Er - if you used the DVI cable - presumably you DO have something on the back of your graphics card to plug into?

  DieSse 18:39 30 Aug 2005

Ah - you said you did - got confused after looking at the link. Maybe you have a non-ATI card (i.e. not actually made by ATI) which does have DVI output.

In which case you may need to look at the display driver settings to access the DVI output (a rough way to check what Windows can see is sketched after this post).

The DVI output socket will NOT be used as a TV out.
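
If it is a non-ATI board with a working DVI transmitter, a first step is to see whether Windows lists a second output at all. Here is a minimal sketch using the Win32 EnumDisplayDevices call, again assuming Python with ctypes on Windows; the helper name is my own invention, not part of any driver:

    import ctypes
    from ctypes import wintypes

    class DISPLAY_DEVICE(ctypes.Structure):
        # Mirrors the Win32 DISPLAY_DEVICEW structure.
        _fields_ = [
            ("cb", wintypes.DWORD),
            ("DeviceName", wintypes.WCHAR * 32),
            ("DeviceString", wintypes.WCHAR * 128),
            ("StateFlags", wintypes.DWORD),
            ("DeviceID", wintypes.WCHAR * 128),
            ("DeviceKey", wintypes.WCHAR * 128),
        ]

    DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1

    def list_display_outputs():
        # Hypothetical helper: enumerate the adapter outputs Windows knows about.
        outputs = []
        dd = DISPLAY_DEVICE()
        dd.cb = ctypes.sizeof(DISPLAY_DEVICE)
        i = 0
        while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
            active = bool(dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
            outputs.append((dd.DeviceName, dd.DeviceString, active))
            i += 1
        return outputs

    for name, desc, active in list_display_outputs():
        print(f"{name}: {desc} ({'active' if active else 'inactive'})")

If only one output shows up, the DVI head is either disabled in the driver settings or, as the review suggests, not fitted at all.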

  kainos 18:20 31 Aug 2005

DieSse - Thank you. The link provided by wee eddie (and quoted above) tells us that "ATi curtails manufacturing time and cost by removing support for DVI; the DVI output is missing and the DVI transmitter is absent between the fan and upside-down Rage Theater chipset." This would seem to answer why I get no signal from the DVI socket.

Eric10 - the monitor is Digimate L 1918 and has a menu switch between Analogue and DVI.

Wee eddie - thanks again and you are of course quite right about the date of the review. Presumably this suggests that the Radeon card was produced at a time when support for DVI was not as desirable as it is now.

Could someone please tell me whether there is a worthwhile difference in quality between analogue and digital; there doesn't seem to be in television terms.
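
On a monitor there can be a visible difference, unlike a TV viewed from across the room: with the analogue cable the card's RAMDAC converts each frame to an analogue waveform and the LCD has to sample it back at the pixel clock, so any clock or phase error smears adjacent pixels, while DVI delivers the pixel values digitally and untouched. A back-of-envelope sketch of that pixel rate, assuming the panel's native mode is 1280x1024 at 60 Hz and using the standard VESA timing totals (figures from the VESA DMT tables, not from this thread):

    # 1280x1024 at 60 Hz; the totals include the blanking intervals.
    h_total, v_total, refresh = 1688, 1066, 60
    pixel_clock_hz = h_total * v_total * refresh
    print(f"Pixel clock: {pixel_clock_hz / 1e6:.0f} MHz")  # ~108 MHz

Sampling roughly 108 million pixels a second with perfect phase is hard, which is why VGA on an LCD often needs an "auto-adjust" and can still look slightly soft next to DVI.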

This thread is now locked and cannot be replied to.
