http://bugzilla.opensuse.org/show_bug.cgi?id=1089932
http://bugzilla.opensuse.org/show_bug.cgi?id=1089932#c54
Michal Srb
> That's pretty broken IMO. If the size is meaningless anyway, it should be omitted.
It is part of the core X protocol, so it cannot be removed. Sending values corresponding to 96 DPI is a reasonable fallback for legacy applications that still use it.
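For reference, the value in question is derived from the screen size in millimetres that the core protocol reports at connection time. A minimal sketch of what a legacy client computes (build with -lX11):

    #include <X11/Xlib.h>
    #include <cstdio>

    int main() {
        Display *dpy = XOpenDisplay(nullptr);
        if (!dpy) return 1;
        int s = DefaultScreen(dpy);
        // Core protocol: screen size in pixels and in millimetres.
        // With the fixed-96-DPI behavior, the server fabricates the
        // millimetre values so that this always comes out as ~96.
        double dpi = 25.4 * DisplayWidth(dpy, s) / DisplayWidthMM(dpy, s);
        std::printf("core protocol DPI: %.1f\n", dpi);
        XCloseDisplay(dpy);
        return 0;
    }

A toolkit deriving font sizes from this value computes roughly pixels = points * dpi / 72, so a 10 pt font at 96 DPI ends up around 13 px.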
> Qt uses the virtual DPI to calculate the font size, so that it can be overridden with -dpi. That is intentional and was introduced specifically to allow the override.
That is not the wisest choice. A better choice would be to take the DPI of an output (the primary output for a window that can be moved around, or a specific one for a window that is fixed to an output). If the size should be configurable, that value could be multiplied by some factor configurable in Qt. I realize that it may be hard to push such a change into Qt, but it is equally hard to advocate the change you want in the X server.
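A rough sketch of what taking the DPI of an output means in practice, using the primary output (untested; assumes a primary output is configured; build with -lX11 -lXrandr):

    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>
    #include <cstdio>

    int main() {
        Display *dpy = XOpenDisplay(nullptr);
        if (!dpy) return 1;
        Window root = DefaultRootWindow(dpy);
        XRRScreenResources *res = XRRGetScreenResourcesCurrent(dpy, root);
        RROutput primary = XRRGetOutputPrimary(dpy, root);
        if (!res || !primary) return 1;  // assumes a primary output exists
        XRROutputInfo *out = XRRGetOutputInfo(dpy, res, primary);
        if (out->crtc && out->mm_width) {
            // Physical size comes from the output, pixel size from the
            // CRTC that is currently driving it.
            XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
            std::printf("%s: %.1f DPI\n", out->name,
                        25.4 * crtc->width / out->mm_width);
            XRRFreeCrtcInfo(crtc);
        }
        XRRFreeOutputInfo(out);
        XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }

The configurable factor mentioned above would then simply multiply the computed value.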
> The best option is to have the size be the actual physical size of the whole screen (so including the monitor edges) and the virtual size. That way the calculated DPI is roughly the average.
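(For illustration with assumed sizes: a 24-inch 1920x1080 panel is about 531 mm wide, i.e. ~92 DPI, while a 24-inch 3840x2160 panel comes to ~184 DPI. Side by side with, say, 20 mm of bezel between them, the combined screen would be 5760 px over 1082 mm: 25.4 * 5760 / 1082 ≈ 135 DPI, close to the average of the two.)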
There is no way of knowing the size of the monitor edge, but some kind of heuristic could be applied based on the sizes of the monitor screens and their relative placement. Even then, at best we would get some average DPI that would look wrong on every monitor.

The X server used to simply replicate the DPI of the "first" monitor. That was changed in 2009 to the fixed 96 DPI. Someone opened a bug opposing it: https://bugs.freedesktop.org/show_bug.cgi?id=23705 The X developers declared that the behavior is correct and not a bug, but comments have kept coming until recently. The latest conclusion is that applications should not use the screen's DPI value, but the outputs' DPI values. A patch was even proposed to change xdpyinfo to no longer display that value: https://patchwork.freedesktop.org/patch/156651/ That again sparked a long discussion. Quoting one relevant comment from it:
    The root of the issue is that in the case of multiple monitors with potentially inconsistent DPI values, the core protocol value is ambiguous at best. It also has the downside that its value is only communicated at connection time, so it doesn't dynamically change even when the circumstances change (e.g. resolution change, move to a different output with a different DPI, etc). Clients need to be aware of the possibilities that different outputs may have different DPI values, and that the values can change, and that requires RANDR support and listening to the appropriate change notification masks.

    ...

    Now one could argue that in the case of single output X11 could automatically set the DPI to the one of the only connected output (something I actually agree with), but that's (a) a separate issue and (b) not without its downsides (e.g. should it automatically change whenever the output changes? what should be done when a new output gets connected? what should be done when an output gets disconnected and we're left with one output again? etc).
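For completeness, the "appropriate change notification masks" mentioned in the quoted comment are the XRandR event masks; a minimal sketch of a client subscribing to them:

    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main() {
        Display *dpy = XOpenDisplay(nullptr);
        if (!dpy) return 1;
        int ev_base, err_base;
        if (!XRRQueryExtension(dpy, &ev_base, &err_base)) return 1;
        // Ask for notification about resolution changes and about
        // outputs/CRTCs appearing, disappearing or being reconfigured.
        XRRSelectInput(dpy, DefaultRootWindow(dpy),
                       RRScreenChangeNotifyMask | RRCrtcChangeNotifyMask |
                       RROutputChangeNotifyMask);
        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == ev_base + RRScreenChangeNotify) {
                XRRUpdateConfiguration(&ev);  // refresh Xlib's cached geometry
                // re-query output DPI values here
            } else if (ev.type == ev_base + RRNotify) {
                // an output or CRTC changed; re-query output DPI values here
            }
        }
    }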
> If we expect HiDPI to work properly, we can't just let X send fake values to applications by default.
The real values can be retrieved from RANDR. Especially in the sddm case, that should be the preferred source of information: as I understand it, sddm creates a separate login screen for every output, so it should scale the fonts on each output based on that output's DPI. That should give better-looking results than using some average value of all of them.
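In sketch form, that per-output lookup amounts to iterating over the connected outputs (whether sddm can map each greeter window to an output this way is an assumption on my side):

    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>
    #include <cstdio>

    int main() {
        Display *dpy = XOpenDisplay(nullptr);
        if (!dpy) return 1;
        XRRScreenResources *res =
            XRRGetScreenResourcesCurrent(dpy, DefaultRootWindow(dpy));
        if (!res) return 1;
        for (int i = 0; i < res->noutput; ++i) {
            XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
            if (out->connection == RR_Connected && out->crtc && out->mm_width) {
                XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
                // One DPI value per output, instead of one average
                // for the whole virtual screen.
                std::printf("%s: %.1f DPI\n", out->name,
                            25.4 * crtc->width / out->mm_width);
                XRRFreeCrtcInfo(crtc);
            }
            XRRFreeOutputInfo(out);
        }
        XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }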