Sun, 24 Dec 2006
Hassles with DVI output on an nvidia fx5500 card at 1680x1050

Well, I got myself the Proview 22 inch LCD monitor from MSY (I love their JUST THE FACTS basic web design style as well).
Nice, basic, cheap and cheerful flatscreen monitor that looks nice on the desktop. It has a little bit of "backlight bleeding" at the top and bottom, but it's only visible when the screen is nearly all black; none of it shows during my normal usage, so I don't care. The model number is FP2226w.
The main problems occurred trying to get the FX5500 to drive the monitor via DVI: the nvidia drivers refused to go higher than 1440x900, even though I could easily reach the maximum mode over VGA (which looked terrible). After some reading up and trial and error, the problem appears to be this line in Xorg.0.log:
    135.0 MHz maximum pixel clock

which appears to be a driver-enforced limit. Adding the option "NoMaxPClkCheck" to the driver section like:
    Section "Device"
        Identifier "nVidia Corporation NV34 [GeForce FX 5500]"
        Driver     "nvidia"
        Option     "ModeValidation" "NoMaxPClkCheck"
        Option     "NvAGP" "3"
    EndSection

stopped the arbitrary limit being enforced when enumerating video modes, and now I have the monitor running perfectly at its best resolution.
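For the curious, a rough back-of-the-envelope calculation shows why the 135 MHz cap was blocking this mode. Using the blanking totals from the standard CVT modeline for 1680x1050 @ 60 Hz (htotal 2240, vtotal 1089 -- assumed values, your monitor's EDID may differ), the required pixel clock comes out well above the cap:

```python
# Pixel clock = total pixels per frame (including blanking) * refresh rate.
# htotal/vtotal are the standard CVT totals for 1680x1050 (assumed here).
htotal, vtotal, refresh = 2240, 1089, 60
pclk_mhz = htotal * vtotal * refresh / 1e6
print(round(pclk_mhz, 1))  # roughly 146 MHz, over the driver's 135 MHz limit
```

So the mode genuinely needs ~146 MHz, which is why the driver threw it out until the check was disabled.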
One problem with the Leadtek WinFast A340 PRO T card is that it has a really crap heat sink and no fan, and it was overheating when playing True Combat: Elite or Enemy Territory. Another visit to MSY for a $3 80mm case fan to bolt onto the heat sink and the heat problems are mostly resolved: I can play at 1024x768 or 1280x800 without any heating kicking in. It still hangs at 1440x900 for those games, but then again they aren't playable on an FX5500 at that resolution anyway. Other OpenGL games play fine at full resolution, so it must be the Quake3-derived games that really work the card hard or something.
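If you'd rather confirm overheating than guess, the nvidia driver exposes the core temperature via nvidia-settings (nvidia-settings -q GPUCoreTemp). A quick shell sketch of pulling the number out of that output; the sample line below is hypothetical, standing in for what the tool prints on a real system:

```shell
# Hypothetical sample of a `nvidia-settings -q GPUCoreTemp` output line;
# on a real system you would pipe the command's output in instead.
sample="  Attribute 'GPUCoreTemp' (host:0[gpu:0]): 87."
# Grab the trailing number and strip the final dot.
temp=$(printf '%s\n' "$sample" | grep -o '[0-9]*\.$' | tr -d '.')
echo "GPU core temp: ${temp}C"
```

Watching that value climb while a Quake3-derived game runs would settle whether it's really a thermal hang.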