A) Note that NvAGP is the default. You only need to set this if "1" (the nvidia AGP handler) isn't working properly.
B) I believe 24-bit is accelerated, and that only 32-bit uses the frame buffer.

IgnoreEDID is only for when your screen reports wrong data. Most newer monitors don't need it.

Anders

On Sunday 08 July 2001 17:35, Curtis Rey wrote:
Sorry, two different issues. A) Set the AGP option as noted. B) Run X in 16-bit color, because 24/32-bit color requires a framebuffer (this is independent of the AGP setting). I wanted you to try both of these variables to help improve performance. The essential one is to not run in anything higher than 16 bit; the AGP setting is for further optimization. Have a look at my settings:
Section "Screen"
  DefaultDepth 16
  SubSection "Display"
Section "Device"
  BoardName   "GeForce 2 MX"
  BusID       "1:0:0"
  Driver      "nvidia"
  Identifier  "Device[0]"
  Screen      0
  VendorName  "Nvidia"
  Videoram    32768
  Option      "IgnoreEDID" "1"
  Option      "NvAGP" "1"           # 0=off, 1=NvAGP (nvidia's builtin AGP driver), 2=agpgart, 3=try agpgart then NvAGP
  Option      "fifo_aggresive" "1"  # might also try fifo_moderate or fifo_conservative
EndSection
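For completeness, a minimal 16-bpp "Screen" section might look like the sketch below. The identifier, monitor name, and mode list are my own placeholder assumptions (the "Screen" snippet quoted above is truncated before its closing lines), so substitute the names and modes from your own XF86Config:

Section "Screen"
  Identifier   "Screen[0]"        # hypothetical name; match your ServerLayout
  Device       "Device[0]"
  Monitor      "Monitor[0]"       # hypothetical monitor identifier
  DefaultDepth 16
  SubSection "Display"
    Depth 16
    Modes "1024x768" "800x600"    # example modes; use ones your monitor supports
  EndSubSection
EndSection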
On Sunday 08 July 2001 10:09, you wrote:
Curtis Rey wrote:
Paul. Try:
Option "NvAgp" "1"
to use nvidia's AGP driver. And run in 16 bit: 24/32 bit uses a framebuffer and degrades performance.
I'm confused. Are you saying that installing this option will degrade performance? Or are you saying that "NvAgp 1" uses 16 bit buffers while other NvAGP values use 24 or 32 bit buffers and degrade performance?
Is NvAgp documented anywhere that I can read about it?
Paul