On 8/9/24 11:08, Lew Wolfgang wrote:
>>> That is work. If you need it, do it.
>>
>> Do tell? What ongoing work is needed to maintain an archive? How much
>> work is involved to make a snapshot and archive a repository? It
>> doesn't seem like it should involve a lot of work for someone who
>> groks SuSE's repositories. Your suggestion "If you need it, do it" is
>> crass and impracticable for us simple users. I imagine it would be a
>> steep learning curve to understand the whole repository architecture,
>> design, software, and testing procedures, let alone get qualified to
>> make changes to the repository structures! What would take an expert
>> a few minutes would take me days or weeks.
>
> This was a topic a few months ago.  I've got to maintain a local image
> of many of the openSUSE repos for servicing an insular network.  I've
> got a fairly easy way to do it using Red Hat's yum program.  I'll
> collect my notes and processes and post them here in a bit.

Here's how I downloaded openSUSE repositories.  I'm sure this isn't
the only way to do this, but I had a restriction in that I couldn't use
rsync to download deltas.  Once the repos were on a local machine
I sneaker-netted the deltas into an isolated network where they
were distributed on an insular subnet by NFS.

First, install the dnf utilities:
 
    zypper in dnf-utils

Then, create the root directory for your local repos:

   mkdir /data/repos  (for example)

Now, register the remote repo with dnf, using the URI that "zypper lr -u" reports (using backports as an example):

   yum-config-manager --add-repo=http://download.opensuse.org/update/leap/15.5/backports/

cd to your local repo root:

   cd /data/repos

Now list your configured repos.  The repo ID will differ from the URL you added: the slashes are replaced with underscores.

   dnf repolist
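If you want to predict the ID without running dnf, the pattern (inferred from the IDs on my system, not from any documentation I know of) seems to be: drop the URL scheme and turn the slashes into underscores.  Something like:

```shell
# Inferred mapping from repo URL to dnf repo ID: strip the scheme,
# replace "/" with "_".  Treat this as an educated guess, not a spec.
url='http://download.opensuse.org/update/leap/15.5/backports/'
repoid=$(printf '%s' "$url" | sed -e 's|^[a-z]*://||' -e 's|/|_|g')
echo "$repoid"
```

That prints download.opensuse.org_update_leap_15.5_backports_, which matches what dnf repolist shows here.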

Finally, the remote repository can be downloaded to the local machine.  Plug in
the repo ID that the dnf repolist command reported:

   reposync --repoid=download.opensuse.org_update_leap_15.5_backports_  --download-metadata --download-path=./

The first time reposync is run it will download the entire indicated repository.
Subsequent runs will download only the changes from the last time it was run.
This allows one to have a complete up-to-date local repo that can be kept around
permanently if desired.  Repeat for the rest of the repositories you want to keep.
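For what it's worth, once the synced tree is exported over NFS, the client side is just a dir: repo.  The mount point and alias below are made up; substitute whatever your isolated net actually uses:

```shell
# Hypothetical client-side setup -- /mnt/repos and the alias
# "local-backports" are placeholders for your own layout.
zypper ar --refresh dir:///mnt/repos/download.opensuse.org_update_leap_15.5_backports_ local-backports
zypper ref local-backports
```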

The series of reposyncs could be put into a shell script and run regularly
if the remote repos are still being maintained.
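Here's a rough sketch of what that script might look like.  The repo ID list is just my example; fill in the IDs from your own dnf repolist.  The guard at the bottom means it only echoes the commands on a box where reposync isn't installed, so you can dry-run it first:

```shell
#!/bin/sh
# Sketch of a sync-all script.  The repo IDs are examples from my
# setup; replace them with whatever "dnf repolist" shows on yours.
REPO_ROOT=/data/repos
REPO_IDS="download.opensuse.org_update_leap_15.5_backports_"

for id in $REPO_IDS; do
    set -- reposync --repoid="$id" --download-metadata --download-path="$REPO_ROOT"
    echo "running: $*"
    # Only actually download if reposync is installed, so the
    # script can be dry-run tested on a machine without it.
    command -v reposync >/dev/null 2>&1 && "$@"
done
```

Drop that into cron and the local tree stays current for as long as the remote repos are maintained.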

Note that rsync is probably better at doing this, but my system is behind a
firewall that doesn't allow outgoing rsync connections.  Go figure.   They
allow outgoing ports 80 and 443, but not port 873.

Regards,
Lew