Quoting "b@rry.co.za" :
One of my servers is picking up huge volumes of requests for http://www.instituto.com.br/attackDoS.php?ver=01&task=newzad&first=1
This is a worm infection that somehow eluded the various antivirus solutions these people are running.
It is hammering my Squid proxy, even though the current zone for the domain points www.instituto.com.br to 127.0.0.1.
I have created an explicit deny for the URL, but the cache still has to process and deny each request, so the CPU load is huge right now.
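For reference, an explicit deny like this in squid.conf usually looks something like the following (a sketch only; the acl name is illustrative, and the optional TCP_RESET trick requires a Squid version that supports it):

```
# Match the worm's request URL (case-insensitive) and deny it outright
acl worm_url url_regex -i ^http://www\.instituto\.com\.br/attackDoS\.php
http_access deny worm_url

# Optionally reset the connection instead of generating an error page,
# which is cheaper than rendering the default deny response:
# deny_info TCP_RESET worm_url
```

Even so, Squid still has to accept and parse every request before it can match the acl, which is why the CPU load stays high.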
The customer's sysadmin is away and I cannot get there until probably Tuesday (usually I only support the Linux servers). Is there a way to block this with a reduced load so that the rest of my system can operate normally?
P.S. I have no idea where this came from, as the users are not permitted to download executables, scripts, etc., and we have a commercial antivirus product that has successfully updated its pattern file every hour for the last 8 months. It also plugs into Squid, so denying MIME types via Squid reduces the load, but the antivirus still checks everything that goes through (including ActiveX and other scripts).
Is it just one machine or the whole internal network? If it's one machine, I'd block its IP at the iptables/SuSEfirewall level. That user can sit in the corner for getting his machine infected.

If it's the whole network of Windows boxes, there isn't much you can really do. Unless, of course, you control the local DNS? Make it so that those Windows boxes think the hostname resolves to 127.0.0.1; they shouldn't even go to the proxy, then.

Barring that, you may be better off pointing the URL on the Squid box to something other than 127.0.0.1. Assuming the Squid box isn't running a web server, it has to deal with all those failed connections. You're better off giving it a different machine that it can actually connect to on port 80, or even just turning up Apache on the Squid machine with a blank homepage. Squid will likely be happier serving up a blank page than dealing with failed-connection errors.
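Roughly, the options above would look like this (a sketch only: the client IP 192.168.1.50 and proxy port 3128 are illustrative placeholders, and the hosts-file override assumes your internal resolver honors /etc/hosts):

```
# (1) One infected machine: drop its traffic to the proxy port at the firewall
iptables -I INPUT -s 192.168.1.50 -p tcp --dport 3128 -j DROP

# (2) Whole network, but you control internal DNS: resolve the worm's
#     target locally so clients fail fast and never hit the proxy
echo "127.0.0.1 www.instituto.com.br" >> /etc/hosts

# (3) Otherwise: on the Squid box, point the hostname at something that
#     actually answers on port 80 (e.g. a local Apache with an empty page),
#     so Squid serves a cheap 200 instead of churning on refused connects
echo "192.168.1.10 www.instituto.com.br" >> /etc/hosts
```

The idea behind (3) is that a completed connection returning a tiny blank page is cheaper for Squid than repeatedly timing out or getting connection-refused against 127.0.0.1.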