Recently I read in one of the PC mags (sorry, I can't remember which - PC Pro, PC Plus or PCW - I'm moving house so I can't easily lay my hands on it right now) that by default Win2K and WinXP both disable low-level Ethernet packet decoding/processing in hardware on the network card; MS have set this to be handled in software by the PC's CPU instead. There are some advantages to this (I forget what), but on a network segment with PCs attached to a hub rather than a switch (as in a typical classroom) it means every Win2K PC has to do a lot of CPU-intensive background processing just to identify all the packets on the segment, almost all of which it is ultimately going to reject anyway. Maybe this processing overhead (especially when a whole class is logging in at the same time) is slowing the PC down to the point that the login times out and the machine falls back to local-machine-only mode. Incidentally, it is possible to disable this 'feature' in Win2K. Maybe someone else can dig out the reference for you; as far as I can recall it's also mentioned somewhere deep inside MS's knowledge base.
I still don't understand why this problem occurred, or why it was only present on the W2K clients. I suspect that for performance reasons we ought to have oplocks enabled on the home directory shares, but clearly reliability is more important than performance.
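For anyone wanting to try the same trade-off, turning oplocks off on the home shares is done in smb.conf. This is just a sketch - the comment and path settings are examples, not from our actual config:

```
[homes]
   comment = Home directories
   browseable = no
   writable = yes
   # Trade performance for reliability: stop clients from
   # caching file data locally on these shares.
   oplocks = no
   level2 oplocks = no
```

After changing this, existing client connections keep their old settings until they reconnect, so it's worth testing after the clients have logged off and back on.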
David Bowles Education Support