On 5/5/24 01:05, Rodney Baker wrote:
On Sunday, 5 May 2024 04:44:09 ACST Lew Wolfgang wrote:
Hi Folks,

Is there a way to maintain local copies of the openSUSE repositories?
Maintenance includes keeping the local repos up-to-date by downloading
only the deltas between the remote and local repos.

I know how to do this using the Red Hat "reposync" program, part of the
yum-utils/dnf-plugins-core packages, but it just doesn't feel right having to
depend on Red Hat for what would appear to be core functionality.

Note:  I can't use rsync due to policy constraints.  But web ports 80
and 443 are okay.  I know, go figure...
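
For reference, the reposync run I have in mind looks roughly like this
(repo ID and paths are just examples, not our real layout):

    # mirror one repo over HTTPS into a local tree; repeated runs only
    # fetch packages that aren't already present, so they transfer deltas
    dnf reposync --repoid=repo-oss \
        --download-path=/srv/mirror \
        --download-metadata --delete

The --delete flag prunes local packages that have dropped out of the remote
repo, which keeps the mirror from growing without bound.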
 Do you need to maintain full copies of the repos locally, or would proxying/
caching requests to the repos for installs/updates be good enough? 

I've used apt-cacher-ng at home and in our corporate network for providing 
updates for machines that don't have outbound internet access. It caches files 
locally the first time they're requested and then serves them from the cache 
for subsequent requests. 

It works for zypper and yum as well as apt, but the mechanism is slightly 
different. For zypper and yum it's necessary to modify the repo URL to point 
to the apt-cacher-ng service on port 3142 (there are two documented ways to do 
that; a rough sketch of one follows below), whereas with apt it's a one-liner 
in a config file. The port can be changed if needed (e.g. to 8080 or any other 
arbitrary port). 

Apt-cacher-ng is available for openSUSE (including TW) but it's necessary to 
add the server:proxy repo to get it. 
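
For zypper, one of the documented approaches is simply to prefix the existing 
repo URL with the cache host, something like this (hostname and repo file are 
just examples):

    # /etc/zypp/repos.d/repo-oss.repo -- route requests through apt-cacher-ng
    baseurl=http://acng.example.com:3142/download.opensuse.org/distribution/leap/15.5/repo/oss/

apt-cacher-ng sees the original host in the path and fetches from it on a 
cache miss.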

Thanks for the suggestion, Rodney.  But in my case it really makes sense to
have a completely local copy of the repositories.  My off-network machines are
fully air-gapped: the only way to import data is to carry physical media
down the hallway and plug it in after checking for malware. 

Having a local copy also allows easy creation of deltas using rsync.  The repos
total 1.3 TB as of this morning, and since corporate policy demands that import
media be used only once, it would become expensive to sacrifice a disk drive
every week.  But the changes from week to week will fit onto a 32-GB SD card,
which can reasonably be used once and thrown away.
I've got hundreds of them in my desk drawer. 
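
Concretely, staging the weekly delta is something like this (paths are just
examples):

    # write to the SD card only the files that differ from last week's snapshot
    rsync -av --compare-dest=/srv/mirror-lastweek/ /srv/mirror/ /media/sdcard/delta/

    # on the air-gapped side, overlay the delta onto the standing copy
    rsync -av /media/sdcard/delta/ /srv/mirror/

The one wrinkle is deletions: --compare-dest only carries new and changed
files, so packages dropped from the repo have to be pruned separately.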

A local copy also allows malware scanning.  ClamAV works fine, and while
it might not be the best choice for detection, it's enough to check the box.
I updated my local repo last night and ran clamscan on it; the results are
included below.   ClamAV unpacks RPMs, so it really is a valid exercise.
This was run on an oldish desktop; I'm sure it would be faster on a modern
box.  The repo selection is the standard enabled suite for Leap 15.5, including
Packman and Nvidia, except for the openh264 repo, which seems to be having
a mirror problem right now.
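
For the record, the scan itself was just a recursive clamscan over the mirror
tree, along these lines (path is an example):

    # refresh signatures, then scan recursively, reporting only infected files
    freshclam
    clamscan -r --infected --log=/tmp/repo-scan.log /srv/mirror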

Here's the ClamAV report:

Known viruses: 8692269
Engine version: 0.103.11
Scanned directories: 165
Scanned files: 326923
Infected files: 0
Data scanned: 726747.85 MB
Data read: 1336155.75 MB (ratio 0.54:1)
Time: 28876.391 sec (481 m 16 s)
Start Date: 2024:05:04 20:40:55
End Date:   2024:05:05 04:42:12

Regards,
Lew