On Jan 07, 10 14:45:42 -0500, Andrew Shor wrote:
> > > most LCD panels today are using either 6 or 8 bits per color, but a
> > > migration to 10-bit and higher LCD panels has begun. Presently,
> > > there are several generally available PC displays capable of 10 and
> > > 16 bits per color, and more such high-end displays are expected to be
> > What you're referring to is called "Deep Color" - and AFAIR this has
> > been added to the HDMI 1.3b spec. As DVI is physical-layer compatible
> > with HDMI, it should theoretically be possible to transmit it over
> > DVI as well. AFAIK there is some EDID information about whether the
> > display supports Deep Color - that would have to be tested.
> How difficult would it be to modify the radeonhd driver to try to
> enable this? I don't know for sure that the light engine expects the
> same formats that AMD provides when using enhanced color depth over
> dual link, although it should.
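Right - for reference, the Deep Color capabilities are advertised in
the HDMI Vendor-Specific Data Block of the CEA EDID extension block.
An untested sketch of the lookup (offsets per CEA-861/HDMI 1.3, error
handling kept minimal):

/* Scan a 128-byte CEA-861 EDID extension for the HDMI VSDB and
 * report the Deep Color capability bits from its 6th payload byte. */
#include <stdint.h>
#include <stdio.h>

static void check_deep_color(const uint8_t *ext)
{
    uint8_t dbc_end = ext[2];  /* first detailed timing descriptor */
    uint8_t i = 4;             /* data block collection starts here */

    while (i < dbc_end) {
        uint8_t tag = ext[i] >> 5;
        uint8_t len = ext[i] & 0x1f;

        /* vendor-specific block (tag 3) with the HDMI OUI 00-0C-03 */
        if (tag == 3 && len >= 6 &&
            ext[i+1] == 0x03 && ext[i+2] == 0x0c && ext[i+3] == 0x00) {
            uint8_t caps = ext[i+6];
            printf("DC_30bit: %s\n", (caps & 0x10) ? "yes" : "no");
            printf("DC_36bit: %s\n", (caps & 0x20) ? "yes" : "no");
            printf("DC_48bit: %s\n", (caps & 0x40) ? "yes" : "no");
            return;
        }
        i += len + 1;
    }
    printf("no HDMI VSDB - plain DVI sink, Deep Color unknown\n");
}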
As the format is standardized, it shouldn't be a problem. Lately the
Xserver (specifically libpixman) has been extended to cope with
10-10-10-2 bit visuals, so that should work as well - before that
change it would have been close to impossible.
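The new formats can be exercised directly through the pixman API as a
quick smoke test - a minimal sketch, with PIXMAN_x2r10g10b10 being the
10-10-10-2 layout with the two top bits unused:

/* Create a 64x64 image with 10 bits per color channel and fill one
 * pixel; build with `pkg-config --cflags --libs pixman-1`. */
#include <stdint.h>
#include <pixman.h>

int main(void)
{
    static uint32_t bits[64 * 64];  /* still 32bpp storage */
    pixman_image_t *img =
        pixman_image_create_bits(PIXMAN_x2r10g10b10, 64, 64,
                                 bits, 64 * 4);

    /* packing: bits 29-20 red, 19-10 green, 9-0 blue */
    bits[0] = (uint32_t)0x3ff << 20;   /* full-intensity red */

    pixman_image_unref(img);
    return 0;
}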
I think 10 bit per channel visuals haven't received a lot of testing -
I'd be surprised if only a few applications crashed :-]
Still, it might work. 16 bits per channel are out of the question for
now, because AFAIK the server doesn't support color depths above 32
bits, and 3 x 16 = 48 bits per pixel simply doesn't fit, while
10-10-10-2 packs exactly into 32.
The changes in radeonhd should be extensive, but rather easy to do I
guess, unless you're dealing with acceleration. You need to change the
general setup, display setup, CRTC programming, color table handling,
and acceleration - see the sketch below.
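To give an idea of the flavor (the names below are invented for
illustration, this is not actual radeonhd code): the mode setup would
have to start selecting a 10-bits-per-component framebuffer format
whenever the server runs at depth 30, roughly like

enum rhd_fb_format {
    RHD_FB_8BPC,    /* classic 8-8-8-8 */
    RHD_FB_10BPC    /* 2-10-10-10 deep color */
};

/* Hypothetical helper: map the X depth to a CRTC pixel format. */
static enum rhd_fb_format
rhd_format_for_depth(int depth)
{
    switch (depth) {
    case 30:
        return RHD_FB_10BPC;   /* 3 x 10 bit components */
    default:
        return RHD_FB_8BPC;    /* depth 24 and friends stay 8bpc */
    }
}

and the color table handling would similarly need gamma LUT entries
wide enough for 10 bit components.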
We (Egbert and I) eventually wanted to add this as well, but never got
around to it. Especially in the light of DisplayPort, which supports
more than 8 bits per channel from day one (but so far we haven't even
managed to get DisplayPort running in radeonhd, due to lack of time).
So any patches are very welcome!
Thanks
Matthias
--
Matthias Hopf