On 14.07.2018 11:00, Daniel Bauer wrote:
On 14.07.2018 05:45, David C. Rankin wrote:
On 07/13/2018 02:34 PM, Carlos E. R. wrote:
On 2018-07-13 19:28, David C. Rankin wrote:
The only other thought I had was to make sure you do not have other xf86 drivers installed. It would be odd that there was a conflict, but if you have multiple drivers installed and nothing black-listed, I can see issues there. It also sounds like you have multiple cards in your laptop -- I don't know if you can select/disable one in the BIOS, but that would be something else to investigate.
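Something like this would show what is actually installed (a sketch using the usual openSUSE package names; check the output before removing anything):

  rpm -qa | grep -i xf86-video        # list installed X.org video driver packages
  zypper rm xf86-video-nouveau        # example only: drop a driver you don't need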
He has Optimus, so dual video hardware: one Intel, one Nvidia. On Windows this works (they say) perfectly: on battery it uses Intel, and when wall power is connected it switches automatically to Nvidia, or it switches when you want high performance.
On Linux this combination works one year and not the next, with varying levels of "working".
Ah!
So this is a case where Linux has a good Intel driver and a good Nvidia driver, but only Windows has a good Optimus driver that makes use of both depending on laptop power.
Why couldn't he just configure systemd somewhere to always report "plugged-in" and exclusively use Nvidia, or the other way around and only use Intel? Is there a kernel parameter, systemd setting, or sysfs entry that can control this?
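For instance, forcing Intel-only operation might be as simple as blacklisting the Nvidia kernel modules (a rough sketch, assuming the stock module names of nouveau and the proprietary driver; not tested on this machine):

  # /etc/modprobe.d/99-force-intel.conf
  blacklist nouveau          # open-source Nvidia driver
  blacklist nvidia           # proprietary driver and its helpers
  blacklist nvidia_drm
  blacklist nvidia_modeset

  # rebuild the initrd afterwards so the blacklist takes effect at boot
  mkinitrd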
This is how it works on Windows, but not on Linux. Here the Nvidia card doesn't work at all.
All your logs attached to the mentioned bug report show that nVidia *does* work (i.e. the X server starts and uses the nVidia driver), but SDDM crashes.
It /did/ work perfectly on 13.x!
13.x did not even have SDDM, did it?
I could use suse-prime and switch between intel and nvidia. So to my eyes it's not that Linux can't (because it could; I worked a lot with it on this laptop), but that it doesn't want to. I don't know why.
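For reference, switching was just the two commands provided by the suse-prime package, run as root and followed by logging out and back in (written from memory, so double-check the exact names):

  prime-select intel     # switch to the Intel GPU
  prime-select nvidia    # switch to the nVidia GPU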
My bug report https://bugzilla.suse.com/show_bug.cgi?id=1088287 is still "new"...
Quoting the comment on this bug:

Both the Xorg.0.log and Xorg.0.log.old show the situation when nvidia is the primary GPU and intel is secondary. It seems that intel gets correctly configured as output source, because it sets up the eDP1 output. But shortly after, the X server is cleanly terminated. My guess is that the display manager fails. You could try whether switching to some other display manager helps (KDM, GDM, SDDM, ...).

That reply was added on the same day you submitted the bug report. Did you try a different display manager in the three months that have passed since then?
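On openSUSE the display manager can be changed with a one-line edit (assuming the usual sysconfig mechanism; substitute kdm, lightdm, xdm, ... as desired):

  # /etc/sysconfig/displaymanager
  DISPLAYMANAGER="gdm"

  # then restart it, or simply reboot
  systemctl restart display-manager.service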