It's really odd: depending on which cable I attach, the monitors claim to offer only a certain maximum resolution. I know the 1280*768 screen will work at that resolution because it does so with the cable from the 1920*1080 screen. However, with one of the other cables I can only get 1028*800, yet another gives only 800*600, whilst a third cable offers just 640*480. I've tried all combinations and the results stay consistent on both screens.
The cable that gives the maximum resolution has all pins in place, whereas the others each have one pin omitted. Looking at the cable pinout, the missing pin seems to be a Ground, so that shouldn't make a difference, as the cable and socket still have Ground connected when plugged in.
Sorry, I should have said: it's all standard 15-pin SVGA-type connections, the kind found on any computer and monitor.
The cables that work fully all have the complete 15-pin configuration, whereas the others have only 14 of the 15 pins (which seems to be a fairly common arrangement, omitting a supposedly redundant 'Ground').
I do note that people sell more expensive cables rated for 1920*1080 (which happens to be the Full HD standard) for connecting a computer to a large screen through the PC's and TV's SVGA connections.
I'm just confused as to why different cables of apparently similar quality produce very different results, often well below both the maximum the graphics card can output and the maximum resolution of the screen.
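For what it's worth, one plausible explanation (not confirmed by anything in this thread) is that the "missing pin" on the cheaper cables is not Ground but one of the DDC wires the monitor uses to report its supported modes (its EDID data); if the PC can't read that data, it falls back to safe defaults like 640*480 or 800*600. As an illustration only, here is a small Python sketch that decodes the "established timings" bytes of an EDID block per the VESA EDID 1.3 layout; the sample bytes are made up:

```python
# Sketch: decoding the "established timings" bytes (offsets 35-36)
# of a 128-byte EDID block, which a monitor sends over the DDC
# wires of a VGA cable. Bit meanings follow the VESA EDID 1.3
# structure; the sample EDID below is hypothetical.

# Bit -> mode tables for EDID bytes 35 and 36.
ESTABLISHED_TIMINGS = {
    35: {7: "720x400@70", 6: "720x400@88", 5: "640x480@60",
         4: "640x480@67", 3: "640x480@72", 2: "640x480@75",
         1: "800x600@56", 0: "800x600@60"},
    36: {7: "800x600@72", 6: "800x600@75", 5: "832x624@75",
         4: "1024x768@87i", 3: "1024x768@60", 2: "1024x768@70",
         1: "1024x768@75", 0: "1280x1024@75"},
}

def decode_established_timings(edid: bytes) -> list[str]:
    """Return the established-timing modes advertised in an EDID block."""
    modes = []
    for offset, table in ESTABLISHED_TIMINGS.items():
        for bit, mode in table.items():
            if edid[offset] & (1 << bit):
                modes.append(mode)
    return modes

# Fake 128-byte EDID with only the established-timings bytes filled in.
sample = bytearray(128)
sample[35] = 0b00100001   # advertises 640x480@60 and 800x600@60
sample[36] = 0b00001000   # advertises 1024x768@60
print(decode_established_timings(bytes(sample)))
# -> ['640x480@60', '800x600@60', '1024x768@60']
```

If a cable never carries this data at all, the graphics driver has no such list to work from, which would explain modes well below what the card and screen can actually do.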