Sun, 24 Dec 2006
Hassles with DVI output on an nvidia FX5500 card at 1680x1050

Well, I got myself the Proview 22 inch LCD monitor from MSY (I love their JUST THE FACTS basic web design style as well).
It's a nice, basic, cheap and cheerful flatscreen monitor that looks good on the desktop. It has a little bit of "backlight bleeding" from the top and bottom, but it is only visible when the screen is nearly all black; none of it shows during my normal usage patterns, so I don't care. The model number is FP2226w.
The main problems occurred when trying to get the FX5500 to drive the monitor via DVI: the nvidia drivers refused to go higher than 1440x900, while over VGA I could easily reach the maximum mode, but it looked terrible. After some reading up and trial and error, the problem appears to be this line in Xorg.0.log:
135.0 MHz maximum pixel clock

which appears to be a driver-enforced limit. Adding the option "NoMaxPClkCheck" to the device section like so:
Section "Device"
    Identifier "nVidia Corporation NV34 [GeForce FX 5500]"
    Driver "nvidia"
    Option "ModeValidation" "NoMaxPClkCheck"
    Option "NvAGP" "3"
EndSection

stopped this arbitrary limit being enforced when enumerating video modes, and now I have the monitor running perfectly at its best resolution.
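To see why a 135 MHz cap blocks this mode, you can estimate the pixel clock from the mode's total (visible plus blanking) dimensions. The timing numbers below are the standard CVT and CVT reduced-blanking figures for 1680x1050 at 60 Hz, assumed for illustration rather than taken from my setup:

```python
# Approximate pixel clock = horizontal total * vertical total * refresh rate.
# The totals are standard CVT / CVT-RB timings for 1680x1050 (assumed values,
# not read from any particular monitor's EDID).

def pixel_clock_mhz(htotal, vtotal, refresh_hz):
    """Return the approximate pixel clock in MHz for the given mode timings."""
    return htotal * vtotal * refresh_hz / 1e6

# CVT 1680x1050@60: htotal 2240, vtotal 1089
cvt = pixel_clock_mhz(2240, 1089, 60)
# CVT reduced blanking: htotal 1840, vtotal 1080
cvt_rb = pixel_clock_mhz(1840, 1080, 60)

print(f"CVT:    {cvt:.1f} MHz")    # ~146.4 MHz, over a 135 MHz cap
print(f"CVT-RB: {cvt_rb:.1f} MHz") # ~119.2 MHz, comfortably under it
```

So with standard blanking the mode needs more than 135 MHz, while a reduced-blanking variant fits within the single-link budget; the NoMaxPClkCheck override simply skips the check entirely.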
One problem with the Leadtek WinFast A340 PRO T card is that it has a really crap heat sink and no fan, and it was overheating when playing True Combat: Elite or Enemy Territory. Another visit to MSY for a $3 80mm case fan to bolt onto the heat sink and the heat problems are mostly resolved: I can play at 1024x768 or 1280x800 without any heat problems kicking in. It still hangs at 1440x900 for those games, but then again the game isn't playable on an FX5500 at that resolution anyway. Other OpenGL games play fine at full resolution, so it must be the Quake 3 derived games that really work the card hard or something.
posted comments -
Is your Proview display running 1680x1050 at 60hz now on your FX5500? After you made the modification, did you have to create a custom resolution, or did the driver just offer that resolution next time you opened up the Display properties dialog based on your monitor's capabilities (assuming you're running Windows)?
The reason I ask is because I just bought an FX5500 and was planning to use it for my new Dell W207WFP widescreen, which has a native resolution of 1680x1050 at 60 Hz. According to the literature, the FX5500 is only supposed to support up to 1600x1200 on DVI.
Would you be able to send me your modified driver files, or else point out how I could do the modification you point out here?
Your blog contains the only posting I've been able to find that gives a workaround for this issue with the FX5500 (other than simply going VGA)!
Hoping you catch this post! Thanks.
Yes, my Proview is definitely running at 1680x1050 at 60Hz with my FX5500 card.
I guess I didn't make it clear in the post, but I am running Linux and the above tweak is for the Xorg X server config file. Fortunately the nvidia driver options for Linux have parameters for overriding some of the driver limits, as the card itself can definitely run at 1680x1050 perfectly, with good performance and stability. More likely the driver was made without the dimensions of a widescreen monitor in mind, because 1680x1050 is actually fewer pixels than 1600x1200; they are just arranged differently.
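The pixel counts bear this out; a quick check:

```python
# Compare the total pixels per frame of the two modes.
wide = 1680 * 1050  # widescreen mode the driver rejects over DVI
std = 1600 * 1200   # 4:3 mode the card officially supports

print(wide)        # 1764000
print(std)         # 1920000
print(wide < std)  # True: the widescreen mode pushes fewer pixels per frame
```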
Maybe there are some Windows driver tweaks for overriding the defaults as well, as it was the artificial "maximum pixel clock" limit in the driver that was the barrier for me.
Hope this helps...
Thanks much. You're right that it seems very logical that it should be able to run at 1680x1050 if it can run at 1600x1200. A lot of owners of this card have hit a blank wall on this, and you're the first person I've encountered who's found a workaround. There are a lot of posts on the nVidia and evga forums (evga being the vendor for my particular FX5500) complaining about this problem.
Do you have any idea where I might go to get a start on making a driver modification like this on Windows drivers? I haven't the faintest idea.
Thanks for the interaction.
No, I don't know how to make such a tweak to the Windows drivers for this card. Do any of the overclocking-type utilities for nvidia under Windows allow you to mess with this parameter? Maybe ask nvidia themselves whether they have a Windows equivalent of the "NoMaxPClkCheck" parameter in the Linux driver.
I just got a Samsung 206bw 20" monitor that does 1680x1050 over DVI. Just like you, I have an nVidia FX5500 and it refuses to go higher than 1440x900. I edited my xorg.conf file (I'm using Ubuntu Gutsy Gibbon 7.10) to add the option you mention above, but this doesn't seem to help. My log file says:
(**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
(==) NVIDIA(0): RGB weight 888
(==) NVIDIA(0): Default visual is TrueColor
(**) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(**) NVIDIA(0): Option "AddARGBGLXVisuals" "True"
(**) NVIDIA(0): Option "ModeValidation" "NoMaxPClkCheck"
(**) NVIDIA(0): Enabling RENDER acceleration
(II) NVIDIA(0): Support for GLX with the Damage and Composite X extensions is
(II) NVIDIA(0): enabled.
(II) NVIDIA(0): NVIDIA GPU GeForce FX 5500 at PCI:1:0:0 (GPU-0)
(--) NVIDIA(0): Memory: 262144 kBytes
(--) NVIDIA(0): VideoBIOS: 04.34.20.69.00
(II) NVIDIA(0): Detected AGP rate: 4X
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce FX 5500 at PCI:1:0:0:
(--) NVIDIA(0): Samsung SyncMaster (DFP-0)
(--) NVIDIA(0): Samsung SyncMaster (DFP-0): 135.0 MHz maximum pixel clock
(--) NVIDIA(0): Samsung SyncMaster (DFP-0): Internal Single Link TMDS
(II) NVIDIA(0): Mode Validation Overrides for Samsung SyncMaster (DFP-0):
(II) NVIDIA(0): NoMaxPClkCheck
(II) NVIDIA(0): Assigned Display Device: DFP-0
(WW) NVIDIA(0): No valid modes for "1680x1050@60"; removing.
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0): "1440x900@60"
(II) NVIDIA(0): "1600x1024@60"
(II) NVIDIA(0): "1280x800@60"
(II) NVIDIA(0): "1280x720@60"
(II) NVIDIA(0): "1280x768@60"
(II) NVIDIA(0): "800x600@60"
(**) NVIDIA(0): Virtual screen size configured to be 1920 x 1200
(--) NVIDIA(0): DPI set to (85, 84); computed from "UseEdidDpi" X config
(--) NVIDIA(0): option
Is this still working for you? What version of the nvidia drivers are you using (I have 1.0-9639)?
Got it working! See this thread in the nvnews forums (you'll have to add the http to the front yourself because the blog software won't let me post a URL):
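For readers hitting the same "No valid modes" symptom even with NoMaxPClkCheck set, a fix often reported for 1680x1050 over single-link DVI (I can't confirm it matches the thread's exact solution, since the link isn't preserved here) is to supply a reduced-blanking modeline explicitly. The timings below are the standard CVT reduced-blanking numbers for 1680x1050 at 60 Hz, and the identifiers are placeholders, so treat this as an illustrative sketch rather than a verified config:

    Section "Monitor"
        Identifier "Samsung SyncMaster"
        # CVT reduced-blanking timings for 1680x1050@60: ~119 MHz pixel
        # clock, under the 135 MHz single-link TMDS limit in the log above
        ModeLine "1680x1050R" 119.00 1680 1728 1760 1840 1050 1053 1059 1080 +HSync -VSync
    EndSection

    Section "Screen"
        Identifier "Default Screen"
        Monitor "Samsung SyncMaster"
        SubSection "Display"
            Modes "1680x1050R" "1440x900"
        EndSubSection
    EndSection

The nvidia driver's ModeValidation option also accepts other tokens (for example "ExactModeTimingsDVI") that affect how DVI modes are matched; check the driver README for the version you are running before relying on any of them.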