We're in the process of deploying about 200 SuSE workstations out to our remote offices scattered across the US. I need a way to easily make changes to all of the workstations at once. I've been thinking of a hack that I would write (in Perl) that would go something like this:

1) Each workstation would execute a cron job daily that would download a script from our central server.
2) That script would be executed by another cron job a few minutes later. This script will contain any changes that I need to make. If there aren't any updates for the day, then the script will be blank.

Will this work? There has to be something better than this out there. What do you guys do in this situation? For example, we've deployed about 25 workstations so far, but now I need to change this Perl script on each one of the machines. What is the easiest way to push this script out besides SCP'ing it to each one individually?

Thanks, Chris
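[For concreteness, here is a minimal sketch of the fetch-and-run scheme described above, in Perl. The server name, URL, and paths are hypothetical placeholders, not anything from the thread:]

#!/usr/bin/perl -w
# fetch_updates.pl -- minimal sketch of the fetch-and-run scheme described
# above. The server name, URL, and paths are hypothetical placeholders.
#
# Suggested crontab entries (fetch at 2:00, run what was fetched at 2:10):
#   0  2 * * *   /usr/local/bin/fetch_updates.pl
#   10 2 * * *   [ -f /var/spool/updates/today.sh ] && /bin/sh /var/spool/updates/today.sh
use strict;
use LWP::Simple;

my $url  = 'http://updates.example.com/today.sh';   # hypothetical server
my $dest = '/var/spool/updates/today.sh';
my $tmp  = "$dest.tmp";

# Download to a temp file first so a failed transfer never clobbers
# (or half-writes) the script that cron is about to execute.
my $status = getstore($url, $tmp);
if (is_success($status)) {
    rename $tmp, $dest or warn "rename failed: $!\n";
} else {
    warn "download failed, HTTP status $status\n";
    unlink $tmp;
}

[An empty today.sh simply runs as a no-op on days with no changes, which matches the "blank script" idea above.]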
One thing I've heard of people doing in the past is using rsync. You have a master workstation that you make the changes to, and you keep a copy of the entire workstation filesystem on some central server. Each time you make a change, you upload a copy. You have all the workstations you deploy scheduled via cron to rsync against this server daily, or whenever you like. If you don't use the master workstation for anything else, you could just have them rsync directly to it.

We manage PC labs of several hundred machines with a commercial product called Rembo Toolkit, http://www.rembo.com/products_toolkit.htm. Depending on your requirements and budget, it could be an option for you.

Jason Joines
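[If the rsync route looks interesting, the per-workstation cron job can be as small as a wrapper around one rsync call. A sketch, pulling a single subtree rather than the whole filesystem; the server name and rsync module name are hypothetical:]

#!/usr/bin/perl -w
# sync_from_master.pl -- sketch of the nightly rsync pull described above,
# limited here to /usr/local rather than the entire filesystem. The server
# name and the rsync module name "usrlocal" are hypothetical.
use strict;

# -a       archive mode: recurse, keep permissions/owners/times/symlinks
# --delete remove local files that no longer exist in the master copy
my @cmd = ('rsync', '-a', '--delete',
           'rsync://central.example.com/usrlocal/', '/usr/local/');

system(@cmd) == 0
    or warn 'rsync exited with status ' . ($? >> 8) . "\n";

[The same pull works over ssh instead of an rsync daemon, e.g. rsync -a -e ssh central:/usr/local/ /usr/local/, if running a daemon on the server isn't attractive.]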
Hi Jason,

That rsync idea sounds like a good idea if you were on a high-speed LAN. Our remote offices are connected via either frame relay or DSL (VPN); rsync connections would completely saturate the lines, not to mention the RAM on the central server.

This Rembo Toolkit seems like it's just used for cloning and deploying machines? We're using Symantec Ghost Corporate Edition 8.0 right now and it's working great for us for deploying Windows and Linux machines.

Thanks, Chris
On Tuesday 16 March 2004 11:50 am, Chris Purcell wrote:
That rsync idea sounds like a good idea if you were on a high-speed LAN. Our remote offices are connected via either frame relay or DSL (VPN); rsync connections would completely saturate the lines, not to mention the RAM on the central server.
Eh? Rsync can be told to only change things that are new/changed. So if sending out new/changed stuff would saturate the lines, then whatever you use to do the same is going to saturate the lines.

Am I missing something?

--
Bruce S. Marshall    bmarsh@bmarsh.com    Bellaire, MI    03/16/04 12:01
"When the well's dry, we know the worth of water."
Our frame relay connections are usually only 256K or 384K lines. You don't think that 10 simultaneous rsync connections would be too much for this? Yes, we would only be changing new files, but rsync would still have to scan the entire disk of the machine each day, and that takes time, especially over a fractional T1.

For example, right now I need to change a Perl script that is on each workstation. The script is a small file, under 50K. It would be much, much faster to simply copy this file to each workstation than to have each machine do an rsync scan of an entire disk just so that it can copy a 50K Perl file over. We're talking 10 minutes for the scan, as opposed to 10 seconds to simply copy the file over.

Chris
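[For the immediate problem -- one 50K file, 25 machines -- a dumb parallel push is hard to beat. A sketch; hostnames and paths are hypothetical, and it assumes ssh keys are already set up so scp doesn't prompt for passwords:]

#!/usr/bin/perl -w
# push_file.pl -- sketch: copy one small file to every workstation with scp,
# a few machines at a time. Hostnames and paths are hypothetical; assumes
# ssh keys are in place for unattended logins.
use strict;

my $file  = '/root/update.pl';
my $dest  = '/usr/local/bin/update.pl';
my @hosts = map { chomp; $_ } <DATA>;      # one hostname per line below

my $batch = 5;                             # scp's to run in parallel
while (my @chunk = splice(@hosts, 0, $batch)) {
    my @pids;
    for my $host (@chunk) {
        my $pid = fork();
        die "fork failed: $!\n" unless defined $pid;
        if ($pid == 0) {                   # child: run one scp
            exec 'scp', '-q', $file, "root\@$host:$dest";
            die "exec failed: $!\n";
        }
        push @pids, $pid;
    }
    waitpid($_, 0) for @pids;              # let the batch finish first
}

__DATA__
ws01.example.com
ws02.example.com
ws03.example.com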
On Tuesday 16 March 2004 12:07 pm, Chris Purcell wrote:
Our frame relay connections are usually only 256K or 384K lines. You don't think that 10 simultaneous rsync connections would be too much for this? Yes, we would only be changing new files, but rsync would still have to scan the entire disk of the machine each day, and that takes time, especially over a fractional T1.
No, I don't think it would be too much... and if so, it wouldn't be rsync's fault. If it's too much, why not rsync to a special directory and then have the remote machine do its updates from there?
For example, right now I need to change a Perl script that is on each workstation. The script is a small file, under 50K. It would be much, much faster to simply copy this file to each workstation than to have each machine do an rsync scan of an entire disk just so that it can copy a 50K Perl file over. We're talking 10 minutes for the scan, as opposed to 10 seconds to simply copy the file over.
Chris
I think you're looking for a 'magical solution'. Rsync should do it if it is done right.

--
Bruce S. Marshall    bmarsh@bmarsh.com    Bellaire, MI    03/16/04 12:21
"I've learned- that you can keep puking long after you think you're finished."
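[Bruce's special-directory suggestion sidesteps the whole-disk scan entirely: only a tiny tree is scanned over the WAN, and everything else happens locally. A sketch, with hypothetical server name and paths:]

#!/usr/bin/perl -w
# staged_update.pl -- sketch of the staging-directory idea above: rsync a
# small /updates tree (seconds to scan, even on a 256K line), then apply
# any scripts that haven't been run yet. Server name and paths hypothetical.
use strict;

my $stage = '/var/spool/stage';
my $done  = '/var/spool/stage-done';   # run markers live outside the synced
mkdir $done unless -d $done;           # tree, out of --delete's reach

system('rsync', '-a', '--delete',
       'rsync://central.example.com/updates/', "$stage/") == 0
    or die 'rsync exited with status ' . ($? >> 8) . "\n";

# Apply anything new, oldest first, and remember what has been run.
for my $script (sort glob("$stage/*.sh")) {
    my ($name) = $script =~ m{([^/]+)$};
    next if -e "$done/$name";
    if (system('/bin/sh', $script) == 0) {
        open my $fh, '>', "$done/$name" or warn "can't mark $name: $!\n";
        close $fh if $fh;
    } else {
        warn "$script exited nonzero, will retry next run\n";
    }
}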
On Wednesday 17 March 2004 04:07, Chris Purcell wrote:
Our frame relay connections are usually only 256K or 384K lines. You don't think that 10 simultaneous rsync connections would be too much for this?
If an office has 10 workstations, then make it hierarchical: one connects to the remote model machine and gets itself up to date, then the others copy from it. I don't know if Tridge ever made the changes to make the file list itself rsyncable; there was talk of it years ago.

michaelj

--
Michael James                    michael.james@csiro.au
System Administrator             voice: 02 6246 5040
CSIRO Bioinformatics Facility    fax:   02 6246 5166
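[The hierarchy Michael describes costs almost nothing to set up: the same cron script runs everywhere and picks its upstream by hostname. A sketch; the relay and server names are hypothetical, and the relay's cron job should fire before the others':]

#!/usr/bin/perl -w
# two_tier_sync.pl -- sketch of the hierarchy above: one "relay" machine in
# each office pulls across the WAN, everyone else copies from the relay over
# the office LAN, so the slow line is crossed once per office. Names are
# hypothetical; schedule the relay's cron job earlier than the rest.
use strict;
use Sys::Hostname;

my $relay = 'ws01';                                # this office's relay
my $me    = (split /\./, hostname())[0];

my $upstream = ($me eq $relay)
    ? 'rsync://central.example.com/updates/'       # across the WAN
    : "rsync://$relay/updates/";                   # across the LAN

system('rsync', '-a', '--delete', $upstream, '/var/spool/stage/') == 0
    or warn "rsync from $upstream failed\n";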
Chris Purcell wrote:
Hi Jason,

That rsync idea sounds like a good idea if you were on a high-speed LAN. Our remote offices are connected via either frame relay or DSL (VPN); rsync connections would completely saturate the lines, not to mention the RAM on the central server.

This Rembo Toolkit seems like it's just used for cloning and deploying machines? We're using Symantec Ghost Corporate Edition 8.0 right now and it's working great for us for deploying Windows and Linux machines.

Thanks, Chris
Actually, I think rsync is ideal for low-bandwidth situations. It only replicates changes: you can change a line in a file on the master and only that change gets sent to the slave; the whole file doesn't get copied. I'd definitely use it over DSL or frame relay.

Rembo actually does much more. It can also be used to just synchronize the changes. You can also use it to edit text files, modify the registry, push patches, collect inventory information and put it into a database (MySQL, etc.), and much more. We've had much better luck with it than with Ghost, both from a performance and a capability point of view.

Jason
On Tuesday 16 March 2004 11:50 am, Chris Purcell wrote:
That rsync idea sounds like a good idea if you were on a high-speed LAN. Our remote offices are connected via either frame relay or DSL (VPN).
I use rsync for remote backup over dialup lines; it still works great even at low bandwidth.

Greg Engel
If bandwidth isn't an issue, how well is a model client going to hold up with multiple rsync connections scanning it simultaneously? rsync is very memory-intensive, even for one connection. I think it would crash with multiple rsync connections attached to it. We have about 25 clients right now, but that could go as high as 200 in the near future.

Someone else recommended cfengine to me. Has anybody ever used it before? http://www.cfengine.com/

Chris
At 01:43 PM 3/16/2004, Chris Purcell wrote:
If bandwidth isn't an issue, how well is a model client going to hold up with multiple rsync connections scanning it simultaneously? rsync is very memory-intensive, even for one connection. I think it would crash with multiple rsync connections attached to it. We have about 25 clients right now, but that could go as high as 200 in the near future.
One thing you might consider is to have one machine from each location rsync to the master machine, then have the other machines at that location rsync from the one that has already rsynced from the master. In other words, make one machine at each location into a master for that location. That would cut down the machines trying to rsync to the central server at the same time to just a few. Hope that makes sense.

Jack
Another thing that I was thinking might work would be to deploy a central apt server and have each client connect to it to check for updates. I would create RPM packages for any updates that need to be made and then add them to the central apt server. This might get a little messy, though, with all of these RPM packages being added to the RPM database on each workstation.

What do you think about this idea?

Thanks, Chris
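[The client side of the apt idea is pleasantly small. A sketch of a nightly cron job, assuming apt4rpm (the apt-get port for RPM-based systems, which runs on SuSE) is installed and already pointed at the central repository in its sources list:]

#!/usr/bin/perl -w
# apt_update.pl -- sketch of the apt idea above: each workstation pulls
# package updates from the central repository nightly. Assumes apt4rpm is
# installed and its sources list points at the central server.
use strict;

# Refresh the package lists from the central repository...
system('apt-get', '-q', 'update') == 0
    or die "apt-get update failed\n";

# ...then install anything newer, without prompting.
system('apt-get', '-q', '-y', 'upgrade') == 0
    or die "apt-get upgrade failed\n";

[One nice side effect: every change is then a package, so rpm -qa on any workstation shows exactly which updates it has applied.]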
Sorry if this gets sent twice, but my first mailing didn't seem to work...

Have you at least looked into alice? I believe it is written by SuSE, and I know it is written to help people with exactly this problem. It not only helps with the "distribution", but also helps with tracking versions, multiple workstation types, etc. IMHO you need to (at least) take a hard look at it.

Jerry
participants (7)

- Bruce Marshall
- Chris Purcell
- Greg Engel
- Jack Malone
- Jason Joines
- Jerome R. Westrick
- Michael James