Re: [opensuse-web] Re: [opensuse-wiki] openSUSE.org Security Alert
Hello,

On Monday, 7 November 2011, Thomas Schmidt wrote:
On 07.11.2011 16:18, Matthew Ehle wrote:
I know you have suggested this before. In all honesty, it doesn't really matter whether I use subversion or not, especially with the way that we have to upgrade these wikis. Getting the MW core code is the easy part. I just download, extract, and move a couple of files over.
That's still more work when compared to "svn up". And even more work if you have to modify one of the mediawiki files - you have to (remember to) patch it again at every update. (I'm quite sure index.php is modified for iChain, and MultiBoilerplate contains a patch from me to support different templates per namespace) I know that using svn doesn't change much on _this_ update - but it will save you time on the _next_ updates. Counter-question: what's the advantage of using the tarball? ;-)
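Christian's point about having to re-apply local patches after every tarball update can be seen with a tiny local simulation. Everything below is throwaway illustration; only the iChain require line comes from the thread, and the filenames are made up:

```shell
set -e
work=$(mktemp -d); cd "$work"

# Simulated "old" index.php carrying the local iChain modification
printf '%s\n' '<?php' "require( \$IP . '/extensions/iChainLoginFix.php' );" > index.php
cp index.php index.php.patched

# Save the local change as a patch before upgrading
printf '%s\n' '<?php' > index.php.upstream        # pristine upstream file
diff -u index.php.upstream index.php.patched > ichain.patch || true

# "Upgrade" via tarball: the fresh upstream file overwrites the patched one
cp index.php.upstream index.php

# Without svn, you must remember to re-apply the patch by hand:
patch -s index.php ichain.patch
grep -c 'iChainLoginFix' index.php                # the local line is back
```

With an SVN checkout, "svn up" merges the upstream change into the locally modified file instead, so the manual diff/patch bookkeeping disappears.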
The vast majority of the time is spent in downloading and installing the extensions. I use subversion for as many of them as I can, but that is suitable for maybe half of the extensions that we use.
Yes, I know there are several extensions that consist of only a single file (which is not available via SVN). OTOH, for example MultiBoilerplate is available via SVN, and we are using a modified version (with a patch from me). Just updating it with "svn up" or "svn switch" would save you lots of time here.
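For extensions that do live in SVN, an update amounts to one command per extension. A minimal sketch, assuming the 2011-era MediaWiki SVN layout (branches/RELx_y/extensions/Name); the helper function name is made up:

```shell
# Root of the MediaWiki SVN repository (layout as of 2011)
SVNROOT=http://svn.wikimedia.org/svnroot/mediawiki

# branch_url is a hypothetical helper: build the checkout URL for an
# extension on a given release branch
branch_url() {
    echo "$SVNROOT/branches/$1/extensions/$2"
}

# Stay on the current branch, keeping local patches merged in:
#   svn up extensions/MultiBoilerplate
# Or move the working copy to a new release branch in one step:
#   svn switch "$(branch_url REL1_18 MultiBoilerplate)" extensions/MultiBoilerplate
branch_url REL1_18 MultiBoilerplate
```

"svn switch" rewrites the working copy to the new branch while preserving uncommitted local modifications, which is exactly what keeps the MultiBoilerplate patch alive across major upgrades.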
What would be most helpful is to re-evaluate the extensions that we are running and see if we can get rid of a couple. That would go a long way for making the upgrades easier.
As you have seen in the second half of my mail, I'm even requesting a new extension (ReplaceText) to make maintenance tasks like the CSS cleanup easier. Also the WikiEditor would be a nice feature ;-) (and will be shipped with 1.18, if I got the beta release notes right). Removing extensions is a very hard task - I had a quick look at them and think all of them still make sense. In other words: nothing to remove. The only exceptions might be SelectCategory and MultiUpload, because they are already disabled ;-)
the problem we face with hosting the wiki code in git is that our production system doesn't have git installed (only svn), and it would be quite complicated to change that because of datacenter policies etc. I think a possible solution could be GitHub's subversion client support [1]. I already pushed our current code state there [2], so we can evaluate if this is an option for our deployments. If Matthew confirms that this works, we can go ahead and set up the repo at https://github.com/openSUSE/wiki .
That sounds like a good idea - and brings up an interesting question: If we push SVN checkouts to git, they will include .svn directories. What will happen if you check them out using svn? ;-) (Please test this and tell me the result.)
@Christian: Could you help us make your requested changes in the github repo once it's in its final place? You are probably the one of us with the most knowledge of and experience with MediaWiki,
Thanks for the compliment ;-) I should add that the wiki I'm maintaining is quite small compared to the openSUSE wiki: only one language, fewer visitors (54,000 total views for the main page, versus 1.6 million views for the en.opensuse.org main page), fewer than 90 registered users etc. For example, I never had to think about memcached. Oh, and I even have some more extensions [1] installed ;-)
so it would be nice if you could help us :-)
I have the "usual" problem - days with only 24 hours ;-) I'll help whenever possible, but it really depends on what other things I have on my schedule.

BTW: skins/bento/css_local/style.css seems to contain a bug ;-) Line 471 is

    thumb tright {

I'd guess there are some dots missing (".thumb .tright"), but that's only a wild guess because I don't know where this style should be used.

Regards, Christian Boltz

[1] see http://hortipendium.de/Spezial:Version - just in case I shocked you: most of the additional extensions don't make sense for the openSUSE wiki, for various reasons ;-)
--
If I strap a SuSE CD to a pig and kick it, KDE & Co. run quite fast even without RAM. [Robin S. Socha in de.comp.os.unix.linux.newusers]
--
To unsubscribe, e-mail: opensuse-wiki+unsubscribe@opensuse.org
To contact the owner, e-mail: opensuse-wiki+owner@opensuse.org
On 2011-11-08 00:37:09 +0100, Christian Boltz wrote:
Hello,
Am Montag, 7. November 2011 schrieb Thomas Schmidt:
On 07.11.2011 16:18, Matthew Ehle wrote:
I know you have suggested this before. In all honesty, it doesn't really matter whether I use subversion or not, especially with the way that we have to upgrade these wikis. Getting the MW core code is the easy part. I just download, extract, and move a couple of files over.
That's still more work when compared to "svn up". And even more work if you have to modify one of the mediawiki files - you have to (remember to) patch it again at every update. (I'm quite sure index.php is modified for iChain, and MultiBoilerplate contains a patch from me to support different templates per namespace)
I know that using svn doesn't change much on _this_ update - but it will save you time on the _next_ updates.
Counter-question: what's the advantage of using the tarball? ;-)
that we can track all local changes, like theme, different auth provider and so on.

darix
Hello Marcus,

I'm afraid you didn't completely understand my proposal - and that is quite understandable, because my method might sound crazy until you completely understand it ;-) I'll explain below...

On Thursday, 10 November 2011, Marcus Rueckert wrote:
On 2011-11-08 00:37:09 +0100, Christian Boltz wrote:
I know that using svn doesn't change much on _this_ update - but it will save you time on the _next_ updates.
Counter-question: what's the advantage of using the tarball? ;-)
that we can track all local changes, like theme, different auth provider and so on.
Short answer: That works even better if you use SVN checkouts and store everything (including local changes, theme etc.) in another version control system, for example git.

Long answer: Let's use a real-world example: the openSUSE wiki uses a modified index.php which at least[1] has this additional line in it:

    require( $IP . '/extensions/iChainLoginFix.php' );

Now let's update to the latest version of MediaWiki, which is another real-world example ;-)

If you use the tarball, you have to
- download and extract the tarball
- hopefully remember that you added the iChainLoginFix to index.php, and make this change in the latest index.php again
- if you are unsure whether you made other modifications to the upstream code, download and extract the tarball of the old version and diff against it.

Now compare that with using an SVN checkout:
- initially run
    svn co http://svn.wikimedia.org/svnroot/mediawiki/branches/REL1_17/phase3
- for later updates, just run "svn up" or "svn switch" to update to a new major version
- the iChainLoginFix will still be in index.php after svn up or svn switch, except if there is a merge conflict
- finding out about local changes is easy - just run "svn diff"

In other words: using an SVN checkout saves time and handles modified files much better. This is not a theory - it's experience from a wiki I maintain, and also from the blog software I use.

The same works for extensions, at least those that are available via SVN:
- installing a new extension:
    cd extensions
    svn checkout $whatever_extension_you_need
- and later:
    cd extensions/$extension
    svn up (or svn switch)

You can also add the bento skin etc. as usual. SVN will flag them with "?" as unknown file/directory in "svn status", but that doesn't hurt.

And yes, we now have a wild mix of svn checkouts (MediaWiki core + some extensions) and additional files like the bento theme.
But that's still better than having a wild mix of tarballs (MediaWiki core + some extensions) and additional files like the bento theme. ;-)

What we have until now will work perfectly if you are working directly in the DocumentRoot on a single server. For the openSUSE wiki, it becomes even more interesting[tm] and confusing, because we have more than one server and need a clean deployment method ;-)

We could just rsync the DocumentRoot content to all servers. This would work, but lacks version control for the whole picture. Therefore I proudly present the Matryoshka version control system ;-))

Commit everything (including all .svn directories) to another version control system, for example git [2]. You can then
a) on the server: deploy from git
b) on a development system: run "svn up" in your git checkout and commit the latest MediaWiki/$extension/bento theme/... to git
c) on the server: deploy from git again

When using tarballs, step b) would be "extract the tarball in your git checkout and commit [...] to git".

I hope this mail explains what I mean and wasn't too confusing. If something is not clear, just ask ;-)

Regards, Christian Boltz

[1] I didn't do a diff against the upstream version, and the iChainLoginFix is obviously not upstream code ;-)
[2] Even CVS would work ;-) The only thing that doesn't work is SVN, because you can't store an SVN checkout in another SVN. Well, you could do it with tricks like vendor branches, but you probably don't want to, because using another version control system is much easier.

PS: non-random sig today ;-)
--
"What do you suggest, Prof. Dr. cvs. Boltz? :-)" [Ratti in fontlinge-devel]
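The crucial assumption behind the "Matryoshka" scheme is that git happily versions the nested .svn metadata directories. That is easy to check locally; everything below is a throwaway simulation (no real MediaWiki checkout involved, all names made up):

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q .

# Fake the shape of an SVN checkout: working files plus .svn metadata
mkdir -p phase3/.svn
echo 'fake svn metadata' > phase3/.svn/entries
echo '<?php /* index.php incl. local iChain change */' > phase3/index.php

# Commit everything, .svn directories included
git add -A
git -c user.email=wiki@example.org -c user.name=wiki commit -qm 'import svn checkout'

# git has no built-in ignore rule for .svn, so the metadata is versioned:
git ls-files phase3/.svn
```

On a server you then just clone or pull from git; on a development box, "svn up" inside the git checkout fetches the new upstream code, and a single git commit records the whole new state, local patches and all.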
Hi Christian,

Yes, that will work just fine on a normal wiki installation, but remember that I have to deal with upgrading 20+ wikis. I don't upgrade by copying over the old directory, and I wouldn't want to upgrade by doing an 'svn up' in the current wiki directory.

As for the extensions, I can use svn, but about half of the extensions don't use tags. That means I would have to check out from trunk instead of the last release. I have been using subversion for the extensions with tags, but I download and untar the releases for the ones without tags. That is the best way to make sure I'm installing the latest release and not a 'beta' from the next release.

If it helps, here is the path that I take to upgrade the wikis. Thomas explained how I can make that work well with git, so I may do that. However, as you will see, using subversion to check out the new MW code isn't really that beneficial to me.

1. Download and untar the new MW core
2. Remove the images directory
3. Update index.php (may be made obsolete by the upcoming change to Access Manager)
4. Copy the Bento theme and LocalSettings.php to the new installation
5. Svn export the latest tag, or download the latest release, of each extension into the extensions directory
6. For each wiki (done by a script):
   1. Remove symlinks to the old installation
   2. Symlink to the new installation
   3. Run the update script for the core
   4. Run the update script for SemanticMW

This takes me maybe an hour to do on stage, about 3/4 of it doing extensions. If people are OK with me using trunk for all of the extensions, I can make that much shorter. However, we have enough trouble with the stable releases of some of the extensions, so I don't know about that. Prod takes about 15-20 minutes because I can essentially cut out steps 1-5.

I don't want to svn up in the same directory because it doesn't allow me to upgrade the wikis individually, testing in between. That ability has been very helpful in the past.

-Matt
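The per-wiki part of Matthew's procedure is essentially a symlink swap followed by the maintenance scripts. A local sketch with made-up directory names; the update.php calls are shown as comments because they need a real MediaWiki tree:

```shell
set -e
root=$(mktemp -d); cd "$root"

# Two installations side by side: the live one and the freshly prepared one
mkdir mediawiki-1.16.5 mediawiki-1.17.0

# Current state: the wiki's document root points at the old installation
ln -s mediawiki-1.16.5 wiki-en

# Per-wiki upgrade: remove the old symlink, point at the new tree
rm wiki-en
ln -s mediawiki-1.17.0 wiki-en

# Then, per wiki:
#   php wiki-en/maintenance/update.php     # core schema update
#   (plus the SemanticMW update script)

readlink wiki-en
```

Because each wiki is switched individually, an upgrade can be tested wiki by wiki, and rolled back simply by pointing the symlink at the old directory again.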
participants (3)
-
Christian Boltz
-
Marcus Rueckert
-
Matthew Ehle