On Wed, 09 Jul 2008 15:31:19 -0400, Brian K. White wrote:
This is why they're just not a problem here. They can not proliferate like that.
So of course it makes sense to let the infected files lie dormant until someone uses a system like Windows to access them.
Is it a car's responsibility to scan the trunk for bombs? Even if you could add that ability for a mere 100% increase in initial cost, weight, fuel consumption, occupied volume, and ongoing maintenance? Even a mere 10% overhead?
Not exactly the same thing, but if someone put an add-on in a car to allow it to be scanned constantly, would anyone object?
Is it your shirt's responsibility to scan every floppy you put in your pocket for viruses?
Of course not. My shirt doesn't run programs. My computer does.
How about thumb drives? Should we build a virus scanner into every USB thumb drive?
My USB drives don't run programs. My computer does. ;-)
Why don't SAN/NAS boxes do it?
Because they're *storage* devices, not *processing* devices.
I'm not denying that there are uses for virus scanners. I'm saying the average linux box has better things to do with its CPU and I/O resources: performing actual useful tasks. If you want to scan everything for virii, then put that into a dedicated appliance on the network between the users and the application host.
Actually, the idea is not to "scan everything", it's to "scan things that are accessed". Scanning everything would be like running a scheduled scan once a day. When you've got relatively static data, that works fine, at least it has for me.
But I'll partially backpedal. I guess the same argument could be made about routers, SPI firewalls, and VPN endpoints. There are hardware appliances for those, and yet there is also a use for doing them in software on the application host too in small sites. And probably SAN/NAS boxes that perform full-time scanning of all data will actually start appearing.
Quite possibly. But also keep in mind that people who implement a single system aren't going to want an overcomplicated second system to scan for viruses. They'll look at the Windows world and say "I don't need a separate machine to scan my Windows machines, so using Linux adds cost = another machine, another install, higher maintenance, etc".

I don't think *anyone* is advocating that the *kernel* do the scanning. But leveraging kernel hooks to implement scanning is how it's done on other OSes.

Hell, even NetWare has on-access scanning capabilities with third party add-ons, and NOBODY runs applications (not in the sense that one runs OpenOffice) on NetWare. End users have exactly ZERO access to run apps on the NetWare kernel. And on-access scanning there *typically* doesn't kill the kernel or overload it with work, even on systems 10 years ago. Even on systems with *thousands* of users accessing files.

So it seems to me that if an on-access scanning agent on Linux impacts performance to the degree that some say it does, then there's an architecture problem in the agent. I've seen effective on-access scanning implemented on systems running 80[34]86 processors.
It's just retarded to make the box waste time scanning things it knows cannot possibly contain a virus, or that cannot hurt anything even if they do.
So a well-architected solution wouldn't scan image files, for example; using the 'file' command it's easy to tell whether a file contains executable code or not. Or it could exclude /tmp directories. Again using the NetWare analogy, AV solutions there that do on-demand scanning typically allow the administrator to exclude files by file extension (since file typing isn't something implemented in that kernel or with the standard utilities, not that it couldn't be) or by directory. I remember seeing problems with some of them scanning the _NETWARE directory, deciding that database files there that stored user information were infected, and then "cleaning" them.
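To make the file-typing idea concrete, here is a minimal sketch (not anyone's actual product) of the sort of pre-check a scanner could run before handing a file to the AV engine. The magic-number list is illustrative and far from exhaustive; in practice you would shell out to `file` or link against libmagic rather than roll your own.

```python
# Illustrative pre-filter: decide from the first few bytes whether a
# file even *could* hold executable code. This mimics, very crudely,
# what the 'file' command keys on. The signature list is an assumption
# for the example, not a complete magic database.
EXECUTABLE_MAGICS = (
    b"\x7fELF",   # ELF binaries (Linux/Unix)
    b"MZ",        # DOS/Windows PE executables
    b"#!",        # interpreter scripts
)

def looks_executable(path):
    """Return True if the file starts with a known executable signature."""
    with open(path, "rb") as f:
        head = f.read(4)
    return any(head.startswith(magic) for magic in EXECUTABLE_MAGICS)
```

A JPEG or a plain-text log fails every check in a single 4-byte read, so the expensive signature scan never runs on it.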
It's a bad trade-off, doing work 100% of the time just to protect against a 1% chance of a problem. When ImageMagick reads a file, even one created on a Windows machine that may contain a virus, I do not need the kernel to scan it for virii. When ImageMagick writes a temp file, I do not need to scan it for virii either, at write or read time. When my multi-user database app writes a thousand temp files every minute and does thousands of surgical reads and writes of individual records every minute, I do not need every one of those transactions being scanned for a virus. I most especially do not need the even more numerous index maintenance ops being scanned. If I wanted to be super nice, just as an over-the-top service to my users, I might possibly install a module into apache and scan all outgoing static and dynamic content as it is served out.
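The temp files, index ops, and scratch directories described above are exactly what an exclusion policy exists to skip. A hedged sketch of such a policy follows; the directory and pattern lists are made-up examples, not defaults from any real scanner.

```python
import fnmatch
import os

# Hypothetical exclusion policy of the kind AV products typically let an
# administrator configure: whole directories (temp areas, database
# spools) and filename patterns. These lists are illustrative only.
EXCLUDED_DIRS = ("/tmp", "/var/tmp", "/var/lib/mydb")
EXCLUDED_PATTERNS = ("*.jpg", "*.png", "*.idx")

def should_scan(path):
    """Decide whether an on-access scanner should bother with this file."""
    path = os.path.abspath(path)
    if any(path.startswith(d.rstrip("/") + "/") for d in EXCLUDED_DIRS):
        return False
    name = os.path.basename(path)
    return not any(fnmatch.fnmatch(name, pat) for pat in EXCLUDED_PATTERNS)
```

With a check this cheap running before the engine, the thousand-temp-files-a-minute workload never touches the scanner at all, which is the point of the argument either way: the cost question is about policy, not about whether on-access hooks exist.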
And that's what exclusions are for. Why is it that some think on-access has to be an "all or nothing" proposition?

Jim
--
Jim Henderson
Please keep on-topic replies on the list so everyone benefits
--
To unsubscribe, e-mail: opensuse+unsubscribe@opensuse.org
For additional commands, e-mail: opensuse+help@opensuse.org