> On Fri, 16 Mar 2001, Alan Harris wrote:
>
> Multiple servers? We advise that 30 clients per server is a sensible
> upper limit, mainly on the basis of avoiding a single point of failure.
> With the money you save by using low-spec, thin-client hardware you can
> easily afford a couple of servers, and the problem is solved.

>> I already run multiple servers - some things run better on high-powered
>> equipment, some run better on low-powered equipment. I don't trust thin
>> clients - maybe it's a personal thing, but I still want to work when
>> the servers (plural) are down,
> How often do all your servers go down at once? :-)
> We have never had a server failure on our office system (thin-client,
> Linux, two servers) - they happily run 24x7 and only get switched off
> for hardware upgrades or transportation to BETT, Salisbury, etc. Oh,
> and sometimes they get a reboot when we need to test the installation
> procedure.
>> or when the hub/switch/etc. is down - and so will teachers. It's a lot
>> more work to use fat clients than thin (but that's mainly the fault of
>> the current widely used OS), and the end result is, I think, better.
> It will cost you a LOT more to get similar levels of both performance
> and reliability out of a fat-client system, even using Linux.
You could make a decent workstation out of a 500MHz machine, though maybe
with 128M rather than 32M of RAM. Use reiserfs and it's most likely tougher
than the average Win9x machine (as well as about 200 quid cheaper -
considering we have 28 computers per room, that's over five and a half
thousand pounds). The apps can quite happily live on a server; even if you
want the binaries on the local HDD, putting

  mount <server>:/apps /mnt; cp -a /mnt/apps /apps; umount /mnt

(in practice you'd want conditional copying) is hardly that tricky.

-- 
Mark Evans
St. Peter's CofE High School
Phone: +44 1392 204764 X109
Fax:   +44 1392 204763
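
P.S. The "conditional copying" could be sketched like this - a minimal
version using GNU cp's -u flag to skip files that are already up to date
locally; the paths, the sync_apps name, and the server name are all just
illustrative:

```shell
#!/bin/sh
# Sketch of conditional app copying for a fat client: copy from the
# mounted server export only files that are missing locally or newer
# on the server, instead of recopying the whole tree on every boot.

sync_apps() {
    src=$1   # e.g. /mnt/apps, after "mount <server>:/apps /mnt"
    dest=$2  # e.g. /apps on the workstation's local HDD

    mkdir -p "$dest"
    # -a preserves modes and timestamps; -u (GNU coreutils) skips any
    # file whose local copy is at least as new as the server's.
    cp -au "$src/." "$dest/"
}

# Typical boot-time use (server name is a placeholder):
#   mount <server>:/apps /mnt
#   sync_apps /mnt/apps /apps
#   umount /mnt
```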