Hello Everyone,
 
I was looking at some site stats, and right now Google is reporting that the wiki pages take an average of 2.5 to 3.5 seconds to load, which puts us in the slower half of all sites.  I am hoping to cut this down to under 2 seconds, which would put us in the fastest 25% of sites or better.  Here are a few options to do that:
 
Localization Caching - There is no real reason not to do this, and it should provide a small speed increase.
 
Enable Gzip - Compressing responses would cut the bytes on the wire and help out a little, and there shouldn't be much risk to it.
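If it helps, here is a quick way to confirm compression is actually being served once it is turned on.  Just a sketch in Python; the URL is only an example, so substitute whichever wiki page you want to test.

    import urllib.request

    # Ask for gzip and report what the server actually sends back.
    url = "https://en.opensuse.org/Main_Page"   # example page, swap in any wiki URL
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        print("Content-Encoding:", resp.headers.get("Content-Encoding", "none"))
        print("Bytes on the wire:", len(resp.read()))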
 
File Caching - This would provide a major boost in speed.  This alone could take nearly a second off page loading time.  However, this does have a drawback.  Taken from the manual page: "The file cache tends to cache aggressively; there is no set expiry date for the cached pages and pages are cached unconditionally even if they contain variables, extensions and other changeable output. Some extensions disable file cache for pages with dynamic content."  How badly would this affect us?  It would be easy enough to cron up a refresh of this cache if that would help.
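On the cron idea, something along these lines could be run periodically to make MediaWiki regenerate its cached copies of a handful of important pages.  This is only a sketch: the base URL and page titles are placeholders, and it assumes the wiki accepts an anonymous POST with action=purge (behaviour can vary between MediaWiki versions).

    import urllib.parse
    import urllib.request

    WIKI_INDEX = "https://en.opensuse.org/index.php"   # placeholder index.php URL
    PAGES = ["Main_Page", "Portal:Distribution"]       # placeholder page titles

    for title in PAGES:
        # action=purge asks MediaWiki to throw away and rebuild its cached copy.
        url = WIKI_INDEX + "?" + urllib.parse.urlencode({"title": title, "action": "purge"})
        req = urllib.request.Request(url, data=b"")    # empty body forces a POST
        with urllib.request.urlopen(req) as resp:
            print(title, resp.status)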
 
Combine external files - Perhaps some of the skin designers might want to look at this.  Right now, each page requires up to 13 CSS files and 9 JavaScript files, a good portion of which are on static.opensuse.org.  Each of those is a separate HTTP request, so combining some of them would make a noticeable difference.
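For whoever picks this up, the simplest form of it is plain concatenation before deployment.  A minimal sketch, with made-up file names standing in for whatever the skin really loads:

    from pathlib import Path

    # Join several stylesheets into one file so the browser makes a single request.
    css_files = ["reset.css", "layout.css", "typography.css"]   # example inputs
    combined = "\n".join(Path(name).read_text() for name in css_files)
    Path("combined.css").write_text(combined)
    print("Wrote combined.css from", len(css_files), "files")

The same idea works for the JavaScript files, and a minifier could be layered on top later.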
 
Caching Proxy (Squid) - An alternative to file caching and a viable long-term option.  It would require some big changes, though, and is probably not justifiable right now.
 
I can work on the first two, but I would like to hear some input on the last three.  In particular, I want to see if the file caching option is viable, since it has the largest potential right now.