Mailinglist Archive: opensuse-buildservice (178 mails)

Re: [opensuse-buildservice] Storage problems ... cleanup comming ...
  • From: Cristian Morales Vega <reddwarf@xxxxxxxxxxxx>
  • Date: Tue, 4 Sep 2012 09:26:15 +0100
  • Message-id: <CAOWQn3TrWtQZ_Gy1ry3ZaOZ=qa9Bi+f2kBB5Eke2bsfh0ZG3vQ@mail.gmail.com>
2012/9/4 Adrian Schröter <adrian@xxxxxxx>:

Hi,

our binary backend server is more than 90% full at the moment.

While we could expand the storage system there in general, we would
still have the problem that most of the data also needs to be synced
to at least some mirrors. So we need to look for a way to avoid
too much bloat there.

Attached is a list of the largest projects. Please consider
whether they are really necessary or could be reduced.

Please do not misunderstand me: it is of course okay to use the resources
when there is a reason. I am not asking for projects to be removed just
because they are large. However, you may want to check whether you really
need all the target repositories.

You may also want to check the values calculated from the download statistics.
They are accessible via

osc api /build/_dispatchprios

When the adjust number is below 1, it means the downloads are below
average.
The repository is a STRONG removal candidate when it is below 0.5, IMHO.
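
For example, to list the repositories below that threshold (a minimal sketch,
assuming the XML returned by /build/_dispatchprios describes each repository
as a <prio> element with project, repository and adjust attributes, and that
xmllint is available; adapt the filter if the real format differs):

# show only entries whose adjust value is below 0.5
osc api /build/_dispatchprios \
  | xmllint --xpath '//prio[@adjust < 0.5]' -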

thank you for your co-operation :)

280G games

Any way to get more detailed stats?
I would be tempted to disable debug information for it, but that would
require a full rebuild, and since it is the games repo it is quite
possible that most of the size comes from actual game data (graphics and
sound). In fact, two or three packages could be taking up most of the
space.
A way to make all this game data, probably in a noarch subpackage,
"build once and be available for multiple distros" would also help. (I
think "rpmlib(PayloadIsLzma)" was the latest rpm feature added, so a
package built in Factory should work on 11.4.)
--
To unsubscribe, e-mail: opensuse-buildservice+unsubscribe@xxxxxxxxxxxx
To contact the owner, e-mail: opensuse-buildservice+owner@xxxxxxxxxxxx
