It's astounding just how badly that magical 100Mbps LAN degrades when you get a few Napster users on it, or when the server suffers the load of 2Gb of downloaded porn images (both problems we have had in the last year, and both that Squid has helped uncover)! Actually a colleague of mine set it up, but it is now great to be able to redirect commonly abused pages to a default HTML page warning users about internet usage and advising them that their activity has been logged. We started off with a basic list of 'bad' words that Squid would find in the URLs of requested pages, but it now scans a list of banned sites and uses a Perl script to redirect to the custom HTML page. You could of course use this powerful tool in a number of ways, but it will always be a game of chasing the new pages users will inevitably discover! Perhaps using it to permit only agreed search engines would be one method? Has anyone else out there got more experience of implementing banned-pages policies?
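For anyone curious how such a redirector hangs together, here is a minimal sketch of the idea, written in Python rather than the Perl we actually use; the banned-site list and warning-page URL below are hypothetical placeholders, not our real ones:

```python
import sys

# Hypothetical examples -- substitute your own banned list and warning page.
BANNED_SITES = {"badsite.example.com", "warez.example.net"}
WARNING_PAGE = "http://intranet.example/usage-warning.html"

def rewrite(url):
    """Return the warning page for banned hosts, otherwise the URL unchanged."""
    # Crude host extraction: "http://host/path" -> "host"
    host = url.split("/")[2].lower() if "://" in url else url.lower()
    return WARNING_PAGE if host in BANNED_SITES else url

def main(lines=None):
    # Squid's redirector protocol: one request per line on stdin,
    # "URL client/fqdn ident method"; we answer with one URL per line.
    for line in (lines if lines is not None else sys.stdin):
        parts = line.split()
        if parts:
            sys.stdout.write(rewrite(parts[0]) + "\n")
            sys.stdout.flush()  # Squid expects unbuffered replies
```

Squid hands requests to a script like this via the `redirect_program` directive in squid.conf (renamed `url_rewrite_program` in later releases); logging the hits from there is what lets you tell users their activity has been recorded.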
Now that Squid is up and running on our network, I am thinking ahead. We are about to upgrade the network to 100Mbit cabling/switches and Pentium III workstations. By March next year we should also have a broadband 2Mb connection to the Internet (according to our LEA!).
In these circumstances, is there any benefit in running a web proxy cache? I do not use it to restrict access to certain sites, etc.
Martin Dart
Senior Computing Officer
School of Computing & Mathematical Sciences
Oxford Brookes University
Gipsy Lane, Headington, OXFORD
Tel (01865) 483667  Mobile 0797 927 1628