Re: [SLE] Nvidia / AV Out Not Center
On Wednesday 24 May 2006 23:58, Shawn Holland wrote:
Here is my new xorg.conf. <snip> I got the modeline I'm running from:
http://www.tkk.fi/Misc/Electronics/faq/vga2rgb/calc.html
I selected 640x480 NTSC TV Interlaced
The "monitor" section I'm not sure its right. I just left it the default settings of the Monitor it picked up when I loaded SaX2 -m 0=nvidia
I just adjusted the refresh rates.
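For reference, a Monitor section aimed at the TV (rather than the NEC's defaults) would look roughly like the sketch below. The sync ranges and modeline numbers are only illustrative NTSC-class values, not the calculator's actual output, so treat them as hypothetical placeholders:

  Section "Monitor"
    Identifier   "Monitor[0]"
    # Illustrative NTSC-class ranges only; substitute the values the
    # calculator produced (NTSC is roughly a 15.7 kHz line rate and a
    # ~59.94 Hz field rate).
    HorizSync    15.6 - 15.9
    VertRefresh  59 - 61
    # Hypothetical 640x480 interlaced modeline of the form the calculator emits
    # (clock  hdisp hss hse htot  vdisp vss vse vtot  flags):
    Modeline "640x480"  12.27  640 664 760 784  480 484 490 525  Interlace
  EndSection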
Here is my Xorg.0.log file output. <snip> What happens is it switches to Composite and you can see a big black box where KDE should be loading, but nothing ever loads in there.
I feel I'm so close but for all I know I could be way off :)
Hi Shawn,

I've had a look and my impression is your module is built and loading correctly :-) Here is what X found during its probes:

  (--) PCI:*(1:0:0) nVidia Corporation NV15DDR [GeForce2 Ti] rev 164, Mem @ 0xfd000000/24, 0xc0000000/27, BIOS @ 0xfeaf0000/16
  ...
  (--) Chipset NVIDIA GPU found
  ...
  (--) NVIDIA(0): Linear framebuffer at 0xC0000000
  (--) NVIDIA(0): MMIO registers at 0xFD000000
  ...
  (--) NVIDIA(0): VideoBIOS: 03.15.01.06.06
  (--) NVIDIA(0): Interlaced video modes are supported on this GPU
  ...
  (--) NVIDIA(0): VideoRAM: 65536 kBytes
  ...
  (--) NVIDIA(0): Detected TV Encoder: Philips 7108
  (--) NVIDIA(0): Display device TV-0: maximum pixel clock at 8 bpp: 350 MHz
  (--) NVIDIA(0): Display device TV-0: maximum pixel clock at 16 bpp: 350 MHz
  (--) NVIDIA(0): Display device TV-0: maximum pixel clock at 32 bpp: 300 MHz

There are *no* (EE) errors, *no* (!!) notices, *no* (NI) not implemented and *no* (??) unknowns. ;-) This is a very good sign.

(WW) warnings are another matter, but they don't really look very serious:

* There's the usual collection of 'missing' font paths/fonts (they're just not installed on your system). AFAIK, this shouldn't 'break' your installation.

* (WW) Open APM failed (/dev/apm_bios) (No such file or directory)
  Note: I don't know if this is a 'show stopper' or not... it looks like 'Advanced Power Management' to me, but I could be wrong... You probably want to check it out. ;-)

* (WW) NVIDIA(0): Not using mode "512x384" (not a valid TV mode)
  Note: This one *definitely* needs investigating. You also have 77 tried/invalid display mode entries... "(vrefresh out of range)" ... "(hsync out of range)" ... "(bad mode clock/interlace/doublescan)" and "(no mode of this name)"

and, finally, this highly targeted and informative warning:

* (WW) NVIDIA(0): WAIT (0, 1, 0x2000, 0x00000494, 0x00000494, 0)

:-)

I 'see' three possible approaches to this:

* configure the system to work on the primary display, first, without having the TV connected and plan on 'adding' the TV after the card is configured.

* configure the system to work using the TV as the primary display, first, without having your usual primary display (monitor) connected to the system (i.e. "unplug the monitor") then 'adding' your monitor after the card is configured.

* proceed as you are, attempting to configure the system with both the TV and your primary display connected. I don't recommend this.*

(*) I'm a big fan of reducing problems down to the least common denominator... the least complexity... until I can establish a solid foundation (no 'unknowns' at play) that I can confidently build upon. It's your system, your time & effort and, eventually, your how-to :-) so obviously the decision here is yours.

The only other items I saw that raised a flag were these (==) default settings (note: "TrueColor" usually implies 24-bit, vs. a 'DefaultDepth' set to 16-bit here):

  (==) NVIDIA(0): RGB weight 565
  (==) NVIDIA(0): Default visual is TrueColor
  (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)

compared to these settings from xorg.conf:

  Section "Screen"
    DefaultDepth 16
    SubSection "Display"
      Depth 16
      Modes "640x480"
    EndSubSection
    Device     "Device[0]"
    Identifier "Screen[0]"
    Monitor    "Monitor[0]"
  EndSection

I wouldn't start hacking xorg.conf again. At this point, with the card correctly detected and initialized and with no errors from loading the module, I'd see if SaX2 (with only one display connected... probably your monitor and not the TV) can sort out the necessary settings on its own.

Looking at 'man SaX2' I see these possibilities:

  sax2 -r            (reinit the hw db/detection and try again)
  sax2 -a            (attempt autoconfigure)
  sax2 -m 0=nvidia   (try SaX2's GUI again, specifying the module)

I'm sure you and others will have some good ideas about how to proceed, as well.

regards,

Carl
On Thursday 25 May 2006 11:19 am, Carl Hartung wrote:
I've had a look and my impression is your module is built and loading correctly :-) Here is what X found during its probes:
  (--) PCI:*(1:0:0) nVidia Corporation NV15DDR [GeForce2 Ti] rev 164, Mem @ 0xfd000000/24, 0xc0000000/27, BIOS @ 0xfeaf0000/16
  ...
  (--) Chipset NVIDIA GPU found
  ...
  (--) NVIDIA(0): Linear framebuffer at 0xC0000000
  (--) NVIDIA(0): MMIO registers at 0xFD000000
  ...
  (--) NVIDIA(0): VideoBIOS: 03.15.01.06.06
  (--) NVIDIA(0): Interlaced video modes are supported on this GPU
  ...
  (--) NVIDIA(0): VideoRAM: 65536 kBytes
  ...
  (--) NVIDIA(0): Detected TV Encoder: Philips 7108
  (--) NVIDIA(0): Display device TV-0: maximum pixel clock at 8 bpp: 350 MHz
  (--) NVIDIA(0): Display device TV-0: maximum pixel clock at 16 bpp: 350 MHz
  (--) NVIDIA(0): Display device TV-0: maximum pixel clock at 32 bpp: 300 MHz
There are *no* (EE) errors, *no* (!!) notices, *no* (NI) not implemented and *no* (??) unknowns. ;-) This is a very good sign.
(WW) warnings are another matter, but they don't really look very serious:
* There's the usual collection of 'missing' font paths/fonts (they're just not installed on your system.) AFAIK, this shouldn't 'break' your installation.
* (WW) Open APM failed (/dev/apm_bios) (No such file or directory) Note: I don't know if this is a 'show stopper' or not... it looks like 'Advanced Power Management' to me, but I could be wrong... You probably want to check it out. ;-)
I believe you are correct. Simple solution:

  Section "ServerFlags"
    Option "NoPM" "true"
  EndSection

This disables APM in the X server. Most newer hardware uses ACPI instead of APM.
* (WW) NVIDIA(0): Not using mode "512x384" (not a valid TV mode) Note: This one *definitely* needs investigating. You also have 77 tried/invalid display mode entries... "(vrefresh out of range)" ... "(hsync out of range)" ... "(bad mode clock/interlace/doublescan)" and "(no mode of this name)"
This caught my eye as well. I didn't understand why it was trying so many modes when xorg.conf told it which modes to use, and there was only one. Is this just standard probing by Xorg to determine which modes are available on the connected display device?
and, finally, this highly targeted and informative warning:
* (WW) NVIDIA(0): WAIT (0, 1, 0x2000, 0x00000494, 0x00000494, 0)
:-)
I wonder if this "WAIT" is what's causing the missing display!
I 'see' three possible approaches to this:
* configure the system to work on the primary display, first, without having the TV connected and plan on 'adding' the TV after the card is configured.
I can get it to work on a monitor by itself no problem.
* configure the system to work using the TV as the primary display, first, without having your usual primary display (monitor) connected to the system (i.e. "unplug the monitor") then 'adding' your monitor after the card is configured.
I can certainly try this. However, SaX2 defaults to using the VGA out. I'm not sure how to tell it to go to the composite out.
* proceed as you are, attempting to configure the system with both the TV and your primary display connected. I don't recommend this.*

(*) I'm a big fan of reducing problems down to the least common denominator... the least complexity... until I can establish a solid foundation (no 'unknowns' at play) that I can confidently build upon. It's your system, your time & effort and, eventually, your how-to :-) so obviously the decision here is yours.
This was the easiest solution. And I also used ssh from my desktop to do most of the xorg.conf changes, just because it's a lot easier to copy/paste!
The only other items I saw that raised a flag were these (==) default settings (note: "TrueColor" usually implies 24-bit, vs. a 'DefaultDepth' set to 16-bit here):
  (==) NVIDIA(0): RGB weight 565
  (==) NVIDIA(0): Default visual is TrueColor
  (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
compared to these settings from xorg.conf:
Section "Screen" DefaultDepth 16 SubSection "Display" Depth 16 Modes "640x480" EndSubSection Device "Device[0]" Identifier "Screen[0]" Monitor "Monitor[0]" EndSection
Where it's using Monitor[0], I left the monitor vendor set up as the NEC and not the TV, which used to detect as VESA. I don't know if that's causing a problem.
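As a side note on that depth flag: the "RGB weight 565" line indicates the server is actually running the TrueColor visual at 16 bpp, consistent with 'DefaultDepth 16'. If 24-bit colour were wanted on the monitor instead, the Screen section would look roughly like this sketch (hypothetical, not taken from the posted config):

  Section "Screen"
    Identifier   "Screen[0]"
    Device       "Device[0]"
    Monitor      "Monitor[0]"
    DefaultDepth 24
    SubSection "Display"
      Depth 24
      Modes "640x480"
    EndSubSection
  EndSection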
I wouldn't start hacking xorg.conf again. At this point, with the card correctly detected and initialized and with no errors from loading the module, I'd see if SaX2 (with only one display connected... probably your monitor and not the TV) can sort out the necessary settings on its own. Looking at 'man SaX2' I see these possibilities:
  sax2 -r            (reinit the hw db/detection and try again)
  sax2 -a            (attempt autoconfigure)
  sax2 -m 0=nvidia   (try SaX2's GUI again, specifying the module)
I'm sure you and others will have some good ideas about how to proceed, as well.
regards,
Carl
-- Regards, Shawn Holland
On Thursday 25 May 2006 11:34, Shawn Holland wrote:
Section "ServerFlags" Option "NoPM" "true" EndSection
**This disables APM in the XServer. Most newer hardware uses acpi instead of APM.**
That was easy! ;-)
* (WW) NVIDIA(0): Not using mode "512x384" (not a valid TV mode) Note: This one *definitely* needs investigating. You also have 77 tried/invalid display mode entries... "(vrefresh out of range)" ... "(hsync out of range)" ... "(bad mode clock/interlace/doublescan)" and "(no mode of this name)"
This caught my eye as well. I didn't understand why it was trying so many modes when xorg.conf told it which modes to use, and there was only one. Is this just standard probing by Xorg to determine which modes are available on the connected display device?
It wouldn't surprise me if the default behavior was to try an array of modes if probing reveals a configuration that isn't compatible with the hardware. That's my take on it, anyway.
and, finally, this highly targeted and informative warning:
* (WW) NVIDIA(0): WAIT (0, 1, 0x2000, 0x00000494, 0x00000494, 0)
:-)
I wonder if this "WAIT" is what's causing the missing display!
This also wouldn't surprise me, assuming it can't identify a valid display mode.
* configure the system to work on the primary display, first, without having the TV connected and plan on 'adding' the TV after the card is configured.
I can get it to work on a monitor by itself no problem.
That's definitely a good sign. What I'm thinking is it might be helpful to pare xorg.conf down and refine it to just the entries needed to support your primary display. Theoretically, it should then be easier to 'add' support for the second display which, in this case, is the TV.
* configure the system to work using the TV as the primary display, first, without having your usual primary display (monitor) connected to the system (i.e. "unplug the monitor") then 'adding' your monitor after the card is configured.
I can certainly try this. However, SaX2 defaults to using the VGA out. I'm not sure how to tell it to go to the composite out.
I definitely saw instructions on this... maybe not with SaX2, but possibly with the nvidia-settings (name?) utility, or maybe it was a 'how-to' I read on the web? In any case, if you pursue this route, whether via a utility or by hand-editing xorg.conf, the end result would provide the parameters needed to 'speak' correctly to the TV. I imagine it would use the same settings to drive that output in the dual (not simultaneous) display arrangement.

Side question: Did I interpret what I read about that card correctly? I got the impression it handles the 'splitting' between S-video and composite internally... meaning it doesn't matter if you connect through S-video or composite, the signals are available at both outputs simultaneously?
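For what it's worth, the nvidia driver does expose xorg.conf options for this. A rough sketch of the kind of entries involved is below; the identifier mirrors the posted config, but the option values chosen here are assumptions to adapt, not a tested configuration:

  Section "Device"
    Identifier  "Device[0]"
    Driver      "nvidia"
    # Treat the TV as the connected display even if a monitor is also plugged in:
    Option      "ConnectedMonitor" "TV"
    # Pick the physical TV output (assumed value; "SVIDEO" is the other choice):
    Option      "TVOutFormat"      "COMPOSITE"
  EndSection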
* proceed as you are, attempting to configure the system with both the TV and your primary display connected. I don't recommend this.*

(*) I'm a big fan of reducing problems down to the least common denominator... the least complexity... until I can establish a solid foundation (no 'unknowns' at play) that I can confidently build upon. It's your system, your time & effort and, eventually, your how-to :-) so obviously the decision here is yours.
This was the easiest solution. And I also used ssh from my desktop to do most of the xorg.conf changes, just because it's a lot easier to copy/paste!
What I'd really like to see (I even Googled a bit for it) is a copy of a working xorg.conf from a similarly configured system. Without the hardware or past experience you've got, I'm still trying to wrap my head around the number of required 'screen' and 'monitor' and 'mode' entries. ;-)
Where it's using Monitor[0], I left the monitor vendor set up as the NEC and not the TV, which used to detect as VESA. I don't know if that's causing a problem.
Yes, that's what I was looking at. But this ties into my comment, above, about coming to grips with what entries, and how many, are required to support these displays. Now I'm going for some Tylenol... ;-) Carl
Addendum:

Hi Shawn,

I'm gradually dredging up tidbits. ;-) Here are two promising links:

http://forums.suselinuxsupport.de/lofiversion/index.php/t35720.html and http://wiki.suselinuxsupport.de/wikka.php?wakka=ConfigFilesxorgconftvmonitor

regards,
Carl
There are some specific options for the nvidia driver that tell it to ignore the EDID list of modes it checks by default to see what the monitor is capable of. This had given me problems in the past with widescreen detection.

There are also options to ignore the detection of the display type. If you want to force the driver to use what you specify, you will need these options to disable the automatic stuff that overrides the manual settings in xorg.conf.

Option "IgnoreDisplayDevices" "string"

  This option tells the NVIDIA kernel module to completely ignore the indicated classes of display devices when checking what display devices are connected. You may specify a comma-separated list containing any of "CRT", "DFP", and "TV". For example:

    Option "IgnoreDisplayDevices" "DFP, TV"

  will cause the NVIDIA driver to not attempt to detect if any digital flat panels or TVs are connected. This option is not normally necessary; however, some video BIOSes contain incorrect information about what display devices may be connected, or what i2c port should be used for detection. These errors can cause long delays in starting X. If you are experiencing such delays, you may be able to avoid this by telling the NVIDIA driver to ignore display devices which you know are not connected. NOTE: anything attached to a 15 pin VGA connector is regarded by the driver as a CRT. "DFP" should only be used to refer to digital flat panels connected via a DVI port.

Option "UseEdidDpi" "string"

  By default, the NVIDIA X driver computes the DPI of an X screen based on the physical size of the display device, as reported in the EDID. If multiple display devices are used by the X screen, then the NVIDIA X screen will choose which display device to use. This option can be used to specify which display device to use. The string argument can be a display device name, such as:

    Option "UseEdidDpi" "DFP-0"

  or the argument can be "FALSE" to disable use of EDID-based DPI calculations:

    Option "UseEdidDpi" "FALSE"

  See Appendix Y for details. Default: string is NULL (the driver computes the DPI from the EDID of a display device and selects the display device).

Option "ModeValidation" "string"

  This option provides fine-grained control over each stage of the mode validation pipeline, disabling individual mode validation checks. This option should only very rarely be used. The option string is a semicolon-separated list of comma-separated lists of mode validation arguments. Each list of mode validation arguments can optionally be prepended with a display device name:

    "<dpy-0>: <tok>, <tok>; <dpy-1>: <tok>, <tok>, <tok>; ..."

  Possible arguments:

  o "AllowNon60HzDFPModes": some lower quality TMDS encoders are only rated to drive DFPs at 60Hz; the driver will determine when only 60Hz DFP modes are allowed. This argument disables this stage of the mode validation pipeline.

  o "NoMaxPClkCheck": each mode has a pixel clock; this pixel clock is validated against the maximum pixel clock of the hardware (for a DFP, this is the maximum pixel clock of the TMDS encoder; for a CRT, this is the maximum pixel clock of the DAC). This argument disables the maximum pixel clock checking stage of the mode validation pipeline.

  o "NoEdidMaxPClkCheck": a display device's EDID can specify the maximum pixel clock that the display device supports; a mode's pixel clock is validated against this pixel clock maximum. This argument disables this stage of the mode validation pipeline.
o "AllowInterlacedModes": interlaced modes are not supported on all NVIDIA GPUs; the driver will discard interlaced modes on GPUs where interlaced modes are not supported; this argument disables this stage of the mode validation pipeline. o "NoMaxSizeCheck": each NVIDIA GPU has a maximum resolution that it can drive; this argument disables this stage of the mode validation pipeline. o "NoHorizSyncCheck": a mode's horizontal sync is validated against the range of valid horizontal sync values; this argument disables this stage of the mode validation pipeline. o "NoVertRefreshCheck": a mode's vertical refresh rate is validated against the range of valid vertical refresh rate values; this argument disables this stage of the mode validation pipeline. o "NoEdidDFPMaxSizeCheck": when validating for a DFP, a mode's size is validated against the largest resolution found in the EDID; this argument disables this stage of the mode validation pipeline. o "NoVesaModes": when constructing the mode pool for a display device, the X driver uses a built-in list of VESA modes as one of the mode sources; this argument disables use of these built-in VESA modes. o "NoEdidModes": when constructing the mode pool for a display device, the X driver uses any modes listed in the display device's EDID as one of the mode sources; this argument disables use of EDID-specified modes. o "NoXServerModes": when constructing the mode pool for a display device, the X driver uses modes provided by the core XFree86/Xorg X server as one of the mode sources; this argument disables use of these modes. o "NoPredefinedModes": when constructing the mode pool for a display device, the X driver uses additional modes predefined by the NVIDIA X driver; this argument disables use of these modes. o "NoUserModes": additional modes can be added to the mode pool dynamically, using the NV-CONTROL X extension; this argument prohibits user-specified modes via the NV-CONTROL X extension. Examples: Option "ModeValidation" "NoMaxPClkCheck" disables the maximum pixel clock check when validating modes on all display devices. Option "ModeValidation" "CRT-0: NoEdidModes, NoMaxPClkCheck; DFP-0: NoVesaModes" do not use EDID modes and do not perform the maximum pixel clock check on CRT-0, and do not use VESA modes on DFP-0. See the README.txt file for more information..... B-) On Thursday 25 May 2006 10:57 am, Carl Hartung wrote:
Addendum:
Hi Shawn,
I'm gradually dredging up tidbits. ;-) Here are two promising links:
http://forums.suselinuxsupport.de/lofiversion/index.php/t35720.html and http://wiki.suselinuxsupport.de/wikka.php?wakka=ConfigFilesxorgconftvmonitor
regards,
Carl
On Thursday 25 May 2006 2:11 pm, Brad Bourn wrote:
There are some specific options for the nvidia driver that tell it to ignore the EDID list of modes it checks by default to see what the monitor is capable of.
This had given me problems in the past with widescreen detection.
There are also options to ignore the detection of the display type. If you want to force the driver to use what you specify, you will need these options to disable the automatic stuff that overrides the manual settings in xorg.conf.
Option "IgnoreDisplayDevices" "string"
I have done this in my config already to ignore the CRT.
See the README.txt file for more information.....
This is what I used to set up my xorg.conf options. I will review it more extensively and try some of the other options you listed. -- Regards, Shawn Holland
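For the archive, here is a rough sketch of how those README options might be combined in a Device section for a TV-only setup. It is illustrative only; the specific token choices are assumptions rather than the actual working config:

  Section "Device"
    Identifier  "Device[0]"
    Driver      "nvidia"
    # Stop the driver from probing the monitor outputs at all:
    Option      "IgnoreDisplayDevices" "CRT, DFP"
    # Relax mode validation for the TV if the custom modeline keeps being
    # rejected (use sparingly, per the README):
    Option      "ModeValidation"       "TV-0: NoVesaModes, NoXServerModes"
  EndSection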
On Thursday 25 May 2006 13:23, Shawn Holland wrote:
I will review it more extensively and try some of the other options you listed.
Hi Shawn,

I am/was hoping Brad could help decipher some of the settings that you're wrestling with. In the interim, I'd like to confirm the makes and models of your displays. I see the monitor listed as an 'NEC MultiSync XV17' but I don't see the TV make and model. Can you supply that info?

regards,
Carl
On Thursday 25 May 2006 4:47 pm, Carl Hartung wrote:
On Thursday 25 May 2006 13:23, Shawn Holland wrote:
I will review it more extensively and try some of the other options you listed.
Hi Shawn,
I am/was hoping Brad could help decipher some of the settings that you're wrestling with. In the interim, I'd like to confirm the makes and models of your displays. I see the monitor listed as an 'NEC MultiSync XV17' but I don't see the TV make and model. Can you supply that info?
regards,
Carl
Well, looking at the Xorg.0.log file, it detects the TV as:

  (--) NVIDIA(0): Detected TV Encoder: Philips 7108

The TV is a Samsung TXK2566. In the meantime I've brought back the MX 4000 and picked up an FX5200, as the S-Video didn't work well on the MX 4000. It was not a cable problem, as the cable worked perfectly in the S-Video out on the GF2 Ti (just not in X :) ) but, funny enough, it was completely black and white ??

I tried running startx -- -verbose 5 but there doesn't seem to be any difference in the log output. I was hoping to get more information from it.

Hey, if I end up giving up again I'll FedEx you the card so you can enjoy it as much as I did :) If you get it working you can keep it!

-- Regards, Shawn Holland
On Thursday 25 May 2006 5:15 pm, Shawn Holland wrote:
Well looking at the Xorg.0.log file it detects the TV as:
(--) NVIDIA(0): Detected TV Encoder: Philips 7108
My bad, that's probably the composite encoder on the GF2 Ti :)

-- Regards, Shawn Holland
On Thursday 25 May 2006 16:19, Shawn Holland wrote:
My bad, that's probably the composite encoder on the GF2 Ti :)
:-) That's OK, I'd figured that out anyway, Shawn. There's a problem at Samsung's website, though. Every one of their televisions is listed on the User Guide download page /except/ the TXK2566. Hopefully, you've got the manual there and can post the video-in specs.
Hey, if I end up giving up again I'll FedEx you the card so you can enjoy it as much as I did :) If you get it working you can keep it!
It's a deal! But I'd really rather we got this puzzle solved for you. I understand you don't have infinite time to spend on it, but there /is/ an element of principle involved. Hopefully it won't come to that, but if it does, you'll help me set it up, won't you? ;-) regards, Carl
Ok I got some good news and some bad news :)

I took the exact same xorg.conf and just changed it to load to S-Video, I installed the FX5200, upgraded the driver to the newest version, and X started on S-Video no problem. Nice and centered. So I know now it's not a problem with the xorg.conf. I'm thinking it has to do with either the driver or the video card itself :(

What's bad is that the FX5200's S-Video out looks the same as the MX4000's S-Video. It looks like it's picking up a local TV channel that you can get with just any TV (no cable hooked up) through radio signals... or however they transmit it. But my GF2 Ti's S-Video looks nice and clean (just in black and white).

So now I have a choice. Try to continue working on the GF2 Ti when there may not be a way to get it working (or one that I may never figure out), with the possibility of a clean image. Or try and figure out how to clean up the image on the FX5200?

Any thoughts?

-- Regards, Shawn Holland
Sandara Technologies Ltd.
Ph. 1-902-405-3344
Fx. 1-902-405-3345
On Friday 26 May 2006 11:39, Shawn Holland wrote:
Ok I got some good news and some bad news :)
:-) this is classic, isn't it?
I took the exact same xorg.conf and just changed it to load to S-Video, I installed the FX5200, upgraded the driver to the newest version, and X started on S-Video no problem. Nice and centered.
So I know now it's not a problem with the xorg.conf. I'm thinking it has to do with either the driver or the video card itself :(
This might be a stupid question, but didn't the new driver installation modify your original xorg.conf? Can you please e-mail me copies (direct) of the 'before new driver' version and the 'after new driver' version?
What's bad is that the FX5200's S-Video out looks the same as the MX4000's S-Video. It looks like it's picking up a local TV channel that you can get with just any TV (no cable hooked up) through radio signals... or however they transmit it.
Another stupid question: Do you see the same interference with *nothing* connected to the TV? In this case it could be a 'tuning in' problem at the TV end of things.

One possibility: Most consumer electronics like DVD players, VCRs, etc., have a switch... sometimes on the back of the box and sometimes in firmware... to switch between broadcasting to the attached TV on 'Channel 3' and 'Channel 4'. Is it possible the card is transmitting on 'Channel 4' while your TV is set to receive on 'Channel 3'?

Note: Depending on the quality of the receiving circuitry in the TV, a very strong 'Channel 3' output from the card *could* show up as black and white if the TV is 'listening' on 'Channel 4'. (Remember analog TV sets with real dials? You could 'dial in' a station and variably catch/drop the subcarriers with the color data... a 'nudge' down and the picture turned to black and white... a 'nudge' up and the colors returned. That's the effect I'm speculating might be involved here.)
But my GF2 Ti's S-Video looks nice and clean (just in black and white).
See previous comment.
So now I have a choice. Try to continue working on the GF2 Ti when there may not be a way to get it working (or one that I may never figure out), with the possibility of a clean image. Or try and figure out how to clean up the image on the FX5200?
I think there is a strong possibility you will fix both if you troubleshoot the 'interference' problem. regards, Carl
Shawn Holland wrote:
Ok I got some good news and some bad news :)
I took the exact same xorg.conf and just changed it to load to S-Video, I installed the FX5200, upgraded the driver to the newest version, and X started on S-Video no problem. Nice and centered.
So I know now it's not a problem with the xorg.conf. I'm thinking it has to do with either the driver or the video card itself :(
What's bad is that the FX5200's S-Video out looks the same as the MX4000's S-Video. It looks like it's picking up a local TV channel that you can get with just any TV (no cable hooked up) through radio signals... or however they transmit it.
But my GF2 Ti's S-Video looks nice and clean (just in black and white).
Black and white? "Stupid Question(TM)": What is the norm of your TV? NTSC? PAL-N? PAL-something? Is that configured in your xorg.conf?
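For reference, the nvidia driver has an option for exactly this, and a black and white picture is a classic symptom of an NTSC/PAL mismatch. A minimal sketch, assuming a North American NTSC-M set (check the TV's manual for the right value):

  Section "Device"
    Identifier  "Device[0]"
    Driver      "nvidia"
    # Force the TV encoder's broadcast standard instead of letting it guess
    # ("PAL-N", "PAL-B", etc. for PAL sets):
    Option      "TVStandard" "NTSC-M"
  EndSection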
So now I have a choice. Try to continue working on the GF2 Ti when there may not be a way to get it working (or one that I may never figure out), with the possibility of a clean image. Or try and figure out how to clean up the image on the FX5200?
Any thoughts?
You still have the FedEx idea? :)
Shawn Holland wrote:
Ok I got some good news and some bad news :)
I took the exact same xorg.conf and just changed it to load to S-Video, I installed the FX5200, upgraded the driver to the newest version, and X started on S-Video no problem. Nice and centered.
So I know now it's not a problem with the xorg.conf. I'm thinking it has to do with either the driver or the video card itself :(
What's bad is that the FX5200's S-Video out looks the same as the MX4000's S-Video. It looks like it's picking up a local TV channel that you can get with just any TV (no cable hooked up) through radio signals... or however they transmit it.
But my GF2 Ti's S-Video looks nice and clean (just in black and white).
So now I have a choice. Try to continue working on the GF2 Ti when there may not be a way to get it working (or one that I may never figure out), with the possibility of a clean image. Or try and figure out how to clean up the image on the FX5200?
Any thoughts?
I was just trolling through the list of software available in v10.1 looking for apps which handle TV (I have a TV card installed) and by chance came across this entry:

  nvtv    Allows you to enable and fine tune TV output of nVidia cards.....

May this solve your problems? Who knows?

Cheers.

-- All answers questioned here.
On Saturday 27 May 2006 06:28, Basil Chupin wrote:
I was just trolling through the list of software available in v10.1 looking for apps which handle TV (I have a TV card installed) and by chance came across this entry:
nvtv Allows you to enable and fine tune TV output of nVidia cards.....
May this solve your problems? Who knows?
Hi Basil, I hope Shawn catches this post. Thanks! Carl
On Saturday 27 May 2006 08:15, Carl Hartung wrote:
I hope Shawn catches this post.
Addendum: 0.4.7a is alpha software, with no changes/additions since May 1, 2004 :-(

Summary: http://sourceforge.net/projects/nv-tv-out/
Files: http://sourceforge.net/project/showfiles.php?group_id=33758

It might still be worth experimenting with...

Carl
On Saturday 27 May 2006 09:14, Shawn Holland wrote:
I don't experiment with guys :)
lol! (blush) I've really gotta watch my 'shorthand!' Good morning! Any news? FWIW, I got the impression from reading the nvtv docs that it's definitely possible to 'fine tune' TV-out on those cards and the M$ utilities support it, but that maybe the Linux drivers weren't (2004) sophisticated enough? Is that a fair assessment? regards, Carl
On Saturday 27 May 2006 09:34, Carl Hartung wrote:
FWIW, I got the impression from reading the nvtv docs that it's definitely possible to 'fine tune' TV-out on those cards and the M$ utilities support it, but that maybe the Linux drivers weren't (2004) sophisticated enough? Is that a fair assessment?
Hi Shawn,

The suspense is growing... ;-) Did I miss or accidentally delete your response? Any luck yet? (I hope so)

regards,
Carl
On Saturday 27 May 2006 7:28 am, Basil Chupin wrote:
nvtv Allows you to enable and fine tune TV output of nVidia cards.....
May this solve your problems? Who knows?
Yes, I used NVTV early on in setting this up. The only problem was that it talks directly to the card, so it can have some unwanted results. Plus you don't get GL support unless you install the driver. And my problem is that when I finally got the driver installed, it would not load X properly. So without X I can't use this tool to tune my settings.

-- Regards, Shawn Holland
Sandara Technologies Ltd.
Ph. 1-902-405-3344
Fx. 1-902-405-3345
On Thursday 25 May 2006 12:55 pm, Carl Hartung wrote:
On Thursday 25 May 2006 13:11, Brad Bourn wrote:
See the README.txt file for more information.....
Brad,
Have you followed this entire thread or are you just 'tuning in'?
I've followed it off and on, looking for anything that I might be able to help with. I didn't see anything that rang any bells until the summary of all the warnings in the startup log. The ones about all the modes rang a bell about the EDID override that I had to do to get my widescreen modeline to work and set the proper DPI. Hence the suggestion.

Why? Was it bad information? Redundant?

Automation is a good thing overall, but usually has some overlap of not letting custom settings shut off the automation at first. The latest drivers from nVidia seem to do the EDID correctly now for the widescreen.

Trying to help with the little 2 cents I had. B-)
Hi Brad, On Thursday 25 May 2006 15:49, Brad Bourn wrote:
Why? Was it bad information? Redundant?
Neither bad nor redundant. Not at all! I was just perplexed that you hadn't joined in about 30 posts ago (literally) when Shawn and I were discussing that very README.txt :-) And, honestly, from all your previous SLE posts, my impression was you'd probably be able to help a great deal with interpreting which of those settings apply to Shawn's setup.
Automation is a good thing overall, but usually has some overlap of not letting custom settings shut off the automation at first. The latest drivers from nVidia seem to do the EDID correctly now for the widescreen.
This is precisely the kind of help I think Shawn needs right now. We both saw that (WW) warning but *you* knew how to silence it. Now, all that's left to be done is refine the xorg.conf to support the two displays... not even simultaneously.
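One way to handle the 'not even simultaneously' part without constantly re-editing xorg.conf is to keep two ServerLayout sections and pick one when starting X. A rough sketch, assuming separate Screen sections already exist for the monitor and the TV (the identifiers here are hypothetical):

  Section "ServerLayout"
    Identifier  "Monitor-Layout"
    Screen      "Screen[0]"        # the NEC monitor screen
  EndSection

  Section "ServerLayout"
    Identifier  "TV-Layout"
    Screen      "Screen[1]"        # the TV screen
  EndSection

Then select a layout at startup, for example:

  startx -- -layout TV-Layout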
Trying to help with the little 2cents I had.
Did you remember to turn your pockets inside out? ;-) regards, Carl
On Thursday 25 May 2006 2:08 pm, Carl Hartung wrote:
And, honestly, from all your previous SLE posts, my impression was you'd probably be able to help a great deal with interpreting which of those settings apply to Shawn's setup.
Automation is a good thing overall, but usually has some overlap of not letting custom settings shut off the automation at first. The latest drivers from nVidia seem to do the EDID correctly now for the widescreen.
This is precisely the kind of help I think Shawn needs right now. We both saw that (WW) warning but *you* knew how to silence it. Now, all that's left to be done is refine the xorg.conf to support the two displays... not even simultaneously.
I haven't played with the TV-out of my nvidia on my laptop (yet).... so the composite stuff and the other things you were discussing up to the point I jumped in are still unfamiliar territory. I'm happy to help any way that I can, as long as I think it is actually helpful. I'd be happy to try some tests on my laptop if it may help, time allowing of course. I'd like to be able to use the TV out on my laptop too. hehehe

And in looking through this README.txt, there seem to be a lot more options than the last time I checked into anything. There was that one other setting that had to do with forcing the driver to properly detect that it was on a laptop with an LCD. That was a few drivers ago though, and lately everything has been working right out of the box. I was really impressed when I got a new ASUS SLE MB for an amd_64 server, and the nVidia driver went on and just worked. Those guys (nVidia) get my vote for sure!

Let's hope Shawn finds more useful messages to further narrow down the problem. B-)
participants (5)
- Alvaro Kuolas
- Basil Chupin
- Brad Bourn
- Carl Hartung
- Shawn Holland