Re: [radeonhd] randr PreferredResolution in xorg.conf not working
  • From: Matthias Hopf <mhopf@xxxxxxx>
  • Date: Wed, 28 Nov 2007 17:06:26 +0100
  • Message-id: <20071128160626.GA27261@xxxxxxx>
On Nov 27, 07 17:54:50 +0100, Sebastian Brocks wrote:
> Matthias Hopf wrote:
> > Given your symptoms I'd like to ask you to test the latest git. Hot plug
> > detection works for the RandR case now as well.
>
> With latest git, everything seems to work fine in gdm with the monitor
> with the PreferredResolution option plugged into DVI-I_2/digital.
> But once GNOME starts up, only the other screen gets anything displayed;
> the one with PreferredResolution stays black with no signal.
> Also, the screen that does work switches to a black screen for a moment
> every few seconds, which makes it basically unusable.

We literally tested this for hours here, and finally found out that the
following combination creates a bad display that doesn't like to sync:

- Philips monitor (Brilliance 200W) reporting 2(!) native modes at 59.9 Hz,
  differing only slightly in timing (146 vs. 147 MHz) and in sync polarity
- Using the preferred mode, with +HSYNC +VSYNC as reported
- Attached to LVTMA (TMDSB)
- LVTMA_PLL_CP_GAIN set to 0x1d (value from AtomBIOS)

Now change any *one* of those:

- Other monitors work just fine
- Attaching to TMDSA works just fine
- Setting sync polarity to -HSYNC works *almost* fine (a few hiccups remain)
- Selecting the other native mode at 59.9 Hz works just fine (the one
  with the *higher* dot clock; see the sketch right after this list)
- Setting LVTMA_PLL_CP_GAIN to 0x10 *or* 0x20 works fine
  (verification: setting it to 0x00, 0x01, or 0x3f blanks immediately;
  0x30 and up increases the number and severity of the hiccups)
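
To illustrate the native-mode point above: a possible driver-side workaround
would be to prefer the higher-clock variant whenever a monitor reports two
otherwise identical native modes. This is only a sketch; the helper name and
the selection criteria are made up here and are not part of radeonhd:

#include <stdlib.h>
#include "xf86.h"   /* DisplayModeRec: HDisplay, VDisplay, Clock (kHz), next */

/* Hypothetical helper: of two modes with the same resolution and nearly
 * the same dot clock, return the one with the higher clock.  The 2 MHz
 * threshold is arbitrary and only serves this illustration. */
static DisplayModePtr
pickHigherClockVariant(DisplayModePtr a, DisplayModePtr b)
{
    if (a->HDisplay != b->HDisplay || a->VDisplay != b->VDisplay)
        return a;                       /* different resolutions: not variants */
    if (abs(a->Clock - b->Clock) > 2000)
        return a;                       /* clocks too far apart: distinct modes */
    return (b->Clock > a->Clock) ? b : a;
}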

Now this seems to depend very much on the monitor (and of course on the
chip), and at the moment I'm a bit unsure how to work this out. Changing
the sync polarity seemed to work at first, but it isn't a complete fix.

The value in LVTMA_PLL_CP_GAIN is magic and thus very card-specific, and
we don't know the gory details of how it is selected. Gee, we don't even
know what this PLL is doing; I guess it is used for bit clock
generation.
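
For experimenting, one can hard-wire an override instead of the AtomBIOS
value when LVTMA is set up. The sketch below is only that: the register
offset, the field position, and the accessor are placeholders, since the
real layout of this register isn't documented here, and this is not how
radeonhd actually programs it:

#include <stdint.h>

/* Placeholder values for illustration only: the actual MMIO offset of the
 * LVTMA PLL register and the position of the CP gain field are not known;
 * the 6-bit width merely matches the 0x00..0x3f range tried above. */
#define LVTMA_PLL_REG_OFFSET  0x0000    /* placeholder offset   */
#define CP_GAIN_SHIFT         0         /* placeholder position */
#define CP_GAIN_MASK          0x3f      /* 6-bit field          */

/* Read-modify-write the CP gain field in a memory-mapped register bank. */
static void
setLvtmaCpGain(volatile uint32_t *mmio, uint32_t gain)
{
    volatile uint32_t *reg = mmio + (LVTMA_PLL_REG_OFFSET >> 2);
    uint32_t val = *reg;

    val &= ~((uint32_t)CP_GAIN_MASK << CP_GAIN_SHIFT);
    val |= (gain & CP_GAIN_MASK) << CP_GAIN_SHIFT;
    *reg = val;                         /* e.g. 0x10 or 0x20 instead of 0x1d */
}

Dropped into the LVTMA setup path, something like this lets one sweep the
whole 0x00-0x3f range and note where the hiccups start and stop.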

Unfortunately, we don't have an oscilloscope here that could keep up with
this data rate and still produce reasonably sharp edges at these clock
rates to verify the quality of the TMDS signal.

> beerockxs@awesome:~$ xrandr -q
> Screen 0: minimum 320 x 200, current 1280 x 1024, maximum 2960 x 1680
> DVI-I_1/analog disconnected
> DVI-I_1/digital connected 1280x1024+0+0 376mm x 301mm
>    1280x1024   75.9  75.0  71.9  59.9*  60.0

Your DVI-I_1/digital did not report a preferred mode; this has been
fixed in git recently. A newer driver should report the preferred mode
correctly.
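
For context, RandR 1.2 drivers generally announce the EDID-preferred mode by
setting the M_T_PREFERRED type bit on it when the probed mode list is built;
xrandr then shows that mode with a '+' (the '*' above marks the current one).
The snippet below is only schematic and not the actual patch; how the
preferred detailed timing is identified is left as a placeholder:

#include "xf86.h"   /* DisplayModeRec, M_T_PREFERRED */

/* Schematic: walk a NULL-terminated probed mode list and flag the mode
 * that the EDID marked as preferred.  The isEdidPreferred callback is a
 * placeholder; a real driver derives this while parsing the EDID. */
static void
markPreferredMode(DisplayModePtr modes, Bool (*isEdidPreferred)(DisplayModePtr))
{
    DisplayModePtr m;

    for (m = modes; m; m = m->next)
        if (isEdidPreferred(m))
            m->type |= M_T_PREFERRED;   /* RandR reports this one as preferred */
}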

CU

Matthias

--
Matthias Hopf <mhopf@xxxxxxx>      __        __   __
Maxfeldstr. 5 / 90409 Nuernberg   (_   | |  (_   |__          mat@xxxxxxxxx
Phone +49-911-74053-715           __)  |_|  __)  |__  R & D   www.mshopf.de
--
To unsubscribe, e-mail: radeonhd+unsubscribe@xxxxxxxxxxxx
For additional commands, e-mail: radeonhd+help@xxxxxxxxxxxx
