On Tue, Nov 24, 2009 at 03:06:28PM +0100, Matthias Hopf wrote:
On Nov 24, 09 00:03:55 +0100, Luc Verhaegen wrote:
So do get off your high horse and stop thrashing things we had to do because we got a lot more thrashing in the other direction already.
Luc, calm down. The request is perfectly valid, though not achievable with current RandR implementation and logic (dunno whether it would be possible at all the way RandR is designed ATM).
Now that we have all had a few months to calm down, here are some supplemental notes for this old thread.

1. My mission in my postings was to help out and to educate, as I had already established a workaround for my personal machine. I am sorry this turned into a flamewar, which is why I backed down from the debate until now.

2. While it may not have come across, I have been doing video adapter programming on other operating systems for a few decades, but I have not yet worked with the innards of X or its driver model, hence my confusion regarding the distribution of tasks between RandR and the driver.
What we need is
- Ability to reconfigure DPI, etc. on-the-fly (should be working)
This would only be needed if the resolution etc. is to change upon seeing a monitor being plugged in. A simpler, more traditional approach is to just stick with whatever resolution was chosen at driver startup, even if it is not optimal for the monitor actually attached. For instance, if the driver started at 800x600 and 96 dpi, it would just continue reporting and using those values even when a 120 dpi monitor capable of 1600x1200 pixels is connected.
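To illustrate what I mean (a hypothetical sketch, not actual radeonhd or X server code; all names are made up): the geometry picked at startup is simply frozen, and the physical size reported to clients stays derived from it.

/* Sketch: keep reporting the geometry chosen at driver startup
 * instead of re-deriving it when a new monitor is hot-plugged. */
#include <stdio.h>

struct startup_geometry {
    int width_px, height_px;   /* e.g. 800x600 picked at init time */
    int dpi;                   /* e.g. 96, from config or a fallback */
};

/* frozen at startup; a hot-plugged 120 dpi 1600x1200 panel does not touch it */
static const struct startup_geometry geom = { 800, 600, 96 };

/* physical size reported to clients stays consistent with the startup DPI */
static int mm_from_px(int px, int dpi)
{
    return px * 254 / (dpi * 10);   /* 25.4 mm per inch, integer math */
}

int main(void)
{
    printf("reporting %dx%d px, %dx%d mm\n",
           geom.width_px, geom.height_px,
           mm_from_px(geom.width_px, geom.dpi),
           mm_from_px(geom.height_px, geom.dpi));
    return 0;
}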
- Ability to probe outputs flicker-free (at least partly working for radeonhd thanks to Egbert's latest changes)
This can be avoided by generating "blind" output even when there is no monitor receiving it. This is the traditional solution for analog / primitive channels such as VGA, CGA/EGA and TV ports. It may not be good for DVI/HDMI/DisplayPort outputs, but those are more likely to facilitate flicker-free detection.
- Resizable framebuffer (or reasonably large preallocated fb - the former already working with radeon/KMS, the latter with radeonhd, modulo bugs)
Again, this is needed only if the resolution is recomputed dynamically.
- User interfaces that deal nicely with DPI and screen space changes (the latter often - not always - working, the former not)
For comparison, at least one widely used operating system has officially supported this since 1995 via a simple "user preferences/screen resolution changed" event, prompting applications and toolkits to recheck any cached data (or just continue if they don't care). This event was originally intended only for the case of an end user manually changing the configuration through a GUI, but it was simple and generic enough to work the same way in the presence of automatic changes. Despite this simplicity and maturity, there are still (as of 2010) business-critical applications on that OS which crash and burn when they receive the message (I get flak for this due to my own driver work...).
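On the X side, the closest analogue would be RandR's screen-change notification. A minimal listener sketch using libXrandr (my own illustration, not taken from any toolkit; assumes a running X server, compile with -lX11 -lXrandr):

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int rr_event_base, rr_error_base;

    if (!dpy || !XRRQueryExtension(dpy, &rr_event_base, &rr_error_base))
        return 1;

    /* ask to be told whenever the screen configuration changes */
    XRRSelectInput(dpy, DefaultRootWindow(dpy), RRScreenChangeNotifyMask);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == rr_event_base + RRScreenChangeNotify) {
            XRRScreenChangeNotifyEvent *sce = (XRRScreenChangeNotifyEvent *)&ev;
            /* refresh Xlib's cached geometry, then carry on */
            XRRUpdateConfiguration(&ev);
            printf("screen now %dx%d pixels, %dx%d mm\n",
                   sce->width, sce->height, sce->mwidth, sce->mheight);
        }
    }
}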
- Some code in RandR or a yet-to-be-written user space daemon or in the display managers *and* desktops that probes outputs every second or so, and reacts to the changes
Moving probing too far up the layers would make life hard for people not running a fancy desktop or hundreds of desktop-helper daemons. X is supposed to be usable with a single non-WM app or a primitive WM such as twm. I see that another poster already figured out how to keep this at the driver/RandR level.
The last point is the main issue - there is no code. Additionally, some chips (notably intel) don't allow for flicker-free probing, so you would have to rely on DDC probing. RandR logic was to always switch off a CRTC before doing the probing, I think that has already changed (but I could be mistaken).
Due to this I don't think we will see this soon, as a non-generic solution probably won't be accepted, especially if it hits chips with a large user base, as with intel.
This is why all my posts assumed startup-probing only, with some classic least-common-denominator fallback values in the absence of user overrides, just as in old X drivers for old chipsets like the ET4000 or S3 Trio. There would basically be multiple sources for the output resolution, with decreasing priority:

1. User-forced configuration with "Enable Yes" (Example: user says force 1200x1024, hardware detection is ignored, 1200x1024 wins).

2. User non-forced configuration, if compatible with actual data from detection (Example: user says 1200x1024, hardware says the monitor can do up to 1600x1200, 1200x1024 wins).

3. Actual data from detection (Example: user says 1200x1024 or is silent, hardware says the monitor can do up to 1024x768, 1024x768 wins).

4. User non-forced configuration (Example: user says 1200x1024, hardware failed to detect what is plugged in or has detected no monitor yet, 1200x1024 wins).

5. Cross-CRTC chipset limitation (Example: one of the above rules configured the HDMI output, there is no config and no detection for the VGA output, and the CRTC chip can only do certain resolutions on the VGA port when the HDMI port uses that resolution, so it does that).

6. Historic minimum hardware specs (Example: user says nothing, hardware failed to detect what is plugged in or nothing is plugged in yet, 640x480 @ 60Hz wins).

Thanks for your patience.

--
This message is hastily written, please ignore any unpleasant wordings, do not consider it a binding commitment, even if its phrasing may indicate so. Its contents may be deliberately or accidentally untrue. Trademarks and other things belong to their owners, if any.

--
To unsubscribe, e-mail: radeonhd+unsubscribe@opensuse.org
For additional commands, e-mail: radeonhd+help@opensuse.org