On Wed, 2 Sep 2015, Carlos E. R. wrote:
I will say again that the update process in Kubuntu is easier and faster, both from the command line (apt) and from the GUI (KDE). The GUI update installer is not very streamlined here.
You mean apper? Did you try YaST? That's the real openSUSE updater, the flagship.
YaST is very slow to start; I don't have a fast system. I use it for managing repositories. I have used zypper extensively and almost exclusively. I hadn't seen Apper yet; I think it is very elegant, but I would not soon use a graphical tool. It may be a better tool for updating, though.

No, I meant zypper. It seems to always refresh its repositories (or try to) whenever you start it. Not sure. That makes it slow.

Zypper has two important actions (verbs), update and install, but they are not really disjoint; they overlap. Apt has two important verbs: update and upgrade. Update refreshes the repository lists; upgrade does what zypper update does. And then there is install, so you really have three: update, upgrade and install. Those are three totally different things. It works really well.

The zypper manual (man page) is more like a compendium than a quick reference guide. It is structured like a book, but when I read a man page I look for commands. It is so verbose that you scan for pages to find something, and the actions are all categorized, so everything is hard to find. I generally don't read books to find tidbits of what I need to know ;-). It's a bit hard to discover everything.

In Debian/Ubuntu the "apt" command does almost everything you need. Debian 7 didn't have it; it was introduced in Debian 8. It's better than aptitude.

  apt list           -> zypper packages
  apt list "mysql*"  -> zypper search "mysql"
  apt search "mysql" -> finds much more than just package names

Now, I am comparing a slow four-year-old laptop with a 1.8" 5400 rpm drive against a VPS running on one core with an SSD, but on my own system, from experience, all apt operations run about twice as fast as all zypper operations. Maybe zypper can do much more, but it is much harder to use.

  zypper wp -> apt-file search

Apparently wp does not work very well; for example, I cannot use it to find header files on my system or where they come from. zypper wp "inet/in.h" returns nothing. apt-file search "inet/in.h" returns:

  dietlibc-dev: /usr/include/diet/netinet/in.h
  frama-c-base: /usr/share/frama-c/libc/netinet/in.h
  freebsd-glue: /usr/include/freebsd/netinet/in.h
  libc6-dev: /usr/include/netinet/in.h
  libklibc-dev: /usr/lib/klibc/include/netinet/in.h
  musl-dev: /usr/include/x86_64-linux-musl/netinet/in.h
  postgresql-server-dev-9.4: /usr/include/postgresql/9.4/server/port/win32/netinet/in.h

on that system. But glibc-devel clearly has it:

  $ rpm -ql glibc-devel | grep "in\.h"
  /usr/include/bits/in.h
  /usr/include/bits/initspin.h
  /usr/include/netinet/in.h

Well, I don't want to brag too much or complain too much; I should be happy I have a well-running system atm ;-).
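P.S. To be concrete about the verbs, the rough correspondence I have in mind is this (a sketch from memory, so verify the exact names against your apt and zypper versions):

  apt update     ->  zypper refresh     (re-read the repository metadata)
  apt upgrade    ->  zypper update      (bring installed packages up to date)
  apt install X  ->  zypper install X   (install package X)

And my guess is that wp came up empty above because what-provides matches declared capabilities (what a package "Provides:"), not file names; newer zypper versions are said to have a "zypper search --file-list" option that is closer to apt-file, but I haven't verified that on this machine.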
But "zip" already exists, it is an archiver.
I know. Call it "zap" and make it fast ;D. "Ze archive program". "zap" is also a command Debian doesn't know (if you type a command it knows about but which is not installed, it will instantly tell you what package it belongs to; rather helpful :) ). "cnf", on the other hand, takes about 5 seconds here to do the same task. "zap", unfortunately, is already a command in SUSE :P. Let's stop talking about Debian though ;-). Maybe it's offensive.
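(For anyone who wants to compare: assuming the command-not-found helper is installed, you can ask openSUSE the same question explicitly, e.g.

  $ cnf zap

and it prints which package(s) provide that command; the exact output format depends on the version, and on my machine it takes those ~5 seconds.)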
You are absolutely right! :-)
It still feels that way, when the KDE daemon doesn't crash on startup, taking my wifi with it ;-). (NetworkManager is so buggy that it will stop doing anything without giving any indication of an error. I mean the KDE applet.) A bit of stability is nice for a change; I haven't really had it for a year. Kubuntu is rather a volatile thing. Because it is new, and small, and sets high goals for itself, and doesn't have in-house all the expertise or the number of people it depends on, and because it doesn't really like Ubuntu, it gives a very unstable environment. It is still not clear if Kubuntu will continue past October. If you are completely and wholly dependent on Ubuntu, but you don't like its master, that is rather a problem.

Normally in life, stability produces peace. It also produces self-confidence. People who have solid foundations, in life or in themselves (which is the same thing), are also emotionally stable. Your emotional stability is a direct result of your life-situation stability. Many people live lives with foundations that are not their own. My only real interest in open source is to form a basis or foundation that is impervious to disruption by outside forces. Ownership of code is one thing that is important for that.

Linux is not wholly stable, in the sense that hostile takeovers of many kinds happen continuously. I consider systemd to be one of those things, or something partly similar to it. The open source model (or at least the licensing culture and everything that derives from it) dictates that those who create a piece of software or a solution and manage to push it automatically get the most votes for that software to be included, because the process works by visible results. There is no one who can say "hey, we think this is wrong, let's devise something else", because manpower generally can't be organised like that. So if someone creates something that does the job, no matter how badly and no matter how hard it is to use, it gets included, because perfect design and flawless operation are not standards that happen to be important or vital to inclusion. Anything that works but is deeply flawed can still be included in any distribution.

I don't know if I'm allowed to elaborate here or whether it would be helpful. I don't have much experience participating in open source projects, but the experience I do have indicates that the one who can push a solution fastest is the one who gets the votes, because there is something tangible, something visible. Anyone who has a better solution in mind, but one which is not fully formed (for example because it required the cooperation of said author), gets ignored.

So what happens is: Person A comes with an idea for Project X. Author D, who manages the project, is interested in that kind of feature. Person B, who has written something that might become a real solution, offers his code to the group and asks whether he should continue developing it. Person A starts discussing the solution of person B, but feedback is required from authors D and E. Author D knows the system so well that he is capable of throwing a solution together in no time, one that might not live up to the ideals of persons A or B, but which he can concoct and throw together almost instantly. Within days, or weeks, he pushes his solution to the group. Meanwhile he, insincerely, tells person B to continue developing HIS solution because "it might be preferable". But person B bails because he doesn't see it happening.
Person A keeps debating the thing, but author D tries to shut him down. D immediately, or within a very short time, presents his flawed solution for review. Author E chimes in and they start discussing improvements. Many of those improvements are ideas stolen from A and B, but only D really gets credit for them. The solution gets committed to a branch, and now all efforts and attention are directed at developing that branch, soon (?) to be included in the main thing. Alternatives have become almost completely unnecessary. I don't know how it goes on from there because I'm in the middle of it :P. You can guess that I am person A here.

Why is this problematic? Because actual programming work (visible code) is preferred over ideas (architectural design), and the one who is fastest (thus the one who has designed the least) gets included, while budding alternatives are discarded. The end result of that is flawed code. When a system favours including flawed code over correct or elegant code, and then merely seeks to improve iteratively on that flawed code (fixing bugs), you get a kind of runaway of direction. You could call that a form of "hostile takeover". Power dynamics in the open source group result in certain code being pushed. People in positions of power achieve success, but at the cost of their own project and at the cost of those who would create something better. They do so in order to retain their power position. It is vital for them to keep being the ones who decide the code and who get credit for it.

Basically, I feel the only way to retain control of your code is not to open-source it in that way. We see that contributing to an open source project does not necessarily help to improve it. Release early and release often favours bad code over good code.

I follow the TrueCrypt model myself. I have never in my life seen a better piece of software than TrueCrypt. That software, at least on Windows, covers pretty much every eventuality, documents every step of the way, safeguards everything, and leaves absolutely nothing to chance. It is just almost flawless, you could say. Maybe it is. So much dedication has gone into it. And the authors always retained control, while not even closing the source.

Anyway. It seems as though openSUSE has a high degree of control over its ecosystem, at the very least.