Adobe Acrobat alternative
Does anyone know of a KDE alternative to the Adobe Acrobat PDF viewer? It looks so heinous running on my KDE desktop, and for some reason it doesn't see my printer when I go to print PDF documents.
On Friday 25 February 2005 3:28 pm, Mike wrote:
Does anyone know of a KDE alternative to the Adobe Acrobat PDF viewer? It looks so heinous running on my KDE desktop, and for some reason it doesn't see my printer when I go to print PDF documents. xpdf and KWrite are two that read PDFs. -- Jerry Feldman <gaf@blu.org> Boston Linux and Unix user group http://www.blu.org PGP key id:C5061EA9 PGP Key fingerprint:053C 73EC 3AC1 5C44 3E14 9245 FB00 3ED5 C506 1EA9
On Friday 25 February 2005 3:28 pm, Mike wrote:
[...] xpdf and KWrite are two that read PDFs. And I forgot: KPdf, and the GNOME PDF Viewer. -- Jerry Feldman <gaf@blu.org>
On Fri, 25 Feb 2005 16:07:54 -0500, Jerry Feldman <gaf@blu.org> wrote:
[...]
Thanks all for the prompt responses... I tried ghostview, but that didn't seem to do a great job (IIRC it displayed the first page, but couldn't get it to move to the next...) I'll give the rest a try tonight.. Thanks again!
Mike, On Friday 25 February 2005 13:58, Mike wrote:
[...]
xpdf, KWrite are two that read PDFs. And I forgot, KPdf, and GNOME PDF Viewer.
Thanks all for the prompt responses... I tried ghostview, but that didn't seem to do a great job (IIRC it displayed the first page, but couldn't get it to move to the next...) I'll give the rest a try tonight..
My experience has been that the best "coverage" is obtained by the Adobe reader, not too surprisingly. (By that I mean it properly handles more PDF files than the other viewers do.) On the other hand, on one or two occasions when I tried to print a document it went into a tailspin, consuming ever more memory, pushing everything else out of memory and bringing my system to its knees (*). This has never happened with any of its native Linux counterparts. It has lots of other problems, but it's still the viewer I use by default. (*) The one weakness I've experienced more than any other on my SuSE Linux system is its vulnerability to a rogue process consuming so much memory that everything else gets swapped out and it becomes impossible even to kill the errant process. Randall Schulz
On Friday 25 February 2005 04:16 pm, Randall R Schulz wrote: [...]
(*) The one weakness I've experienced more than any other on my SuSE Linux system is its vulnerability to a rogue process consuming so much memory that everything else gets swapped out and it becomes impossible to even kill the errant process.
Clearly, you need more memory. :) Most modern systems will accept 2GB, if not 4 or more. You should have time to kill acroread before it fills up 2GB of physical memory. --Danny, noting that the kernel starts killing processes when it runs out of memory...
Danny, On Saturday 26 February 2005 08:33, Danny Sauer wrote:
[...]
Clearly, you need more memory. :) Most modern systems will accept 2GB, if not 4 or more. You should have time to kill acroread before it fills up 2GB of physical memory.
I have 1 GB. Brute force cannot be the right way to address this problem. Besides, a runaway program can easily consume all the RAM and start driving swap activity much more quickly than a human user could recognize the problem and attempt to stop it. In fact, by the time there is any indication of a problem, it's pretty much too late already. Furthermore, such processes often do not respond to the messages triggered by clicking the close box or typing ALT-F4, forcing one to run ps or activate the KDE System Guard process table bound to CTRL-ESC. And finally, the X11 process that mediates keyboard and mouse activity is affected too, making any corrective action whatsoever impossible. The upshot is that this is a genuine vulnerability that cannot be solved by throwing memory at the system.
--Danny, noting that the kernel starts killing processes when it runs out of memory...
That might be helpful if I had no swap space, in which case the swap (or paging) activity that makes the system unusable would never occur. The simple empirical fact is that a process exhibiting extreme and unbounded memory consumption has on several occasions left me with no alternative but to press the mainboard's reset switch. Randall Schulz
On Saturday 26 February 2005 10:33 am, Randall R Schulz wrote:
[...]
I have 1 GB. Brute force cannot be the right way to address this problem.
Maybe you have too much memory, then. The only machine I've ever had that problem with is a machine with 128MB physical and 512MB swap (and a particularly leaky server daemon, though I've yet to identify precisely which one - the machine's running SuSE 5.2 and really should just be updated, so I'm not investing time in fixing problems). :) Well, my 1.5GB machine hasn't had that problem, either. It must be you. :)
The upshot is that this is a genuine vulnerability that cannot be solved by throwing memory at the system.
Well, if you're gonna make this a serious response, how about by implementing per-process memory limits? man bash, search for ulimit - presuming you're using bash. The C shells have a similar built-in (named limit, IIRC). Set the memory limits, process limits, etc. It's up to the shell to enforce them, but I'll bet bash will notice the problem well before the typical user would. --Danny, who doesn't set limits, largely because of laziness
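[Editor's note] Danny's ulimit suggestion can be sketched as follows; the 512 MB cap is an illustrative number, not one taken from the thread:

```shell
# Cap virtual memory for this shell and everything it launches.
# Value is in KB; 524288 KB = 512 MB (illustrative).
ulimit -S -v 524288

# Confirm the soft limit took effect.
ulimit -S -v
```

A soft (-S) limit can be raised again later, up to the hard limit; a hard (-H) limit, once lowered, cannot be raised by an unprivileged process.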
Danny, On Saturday 26 February 2005 09:15, Danny Sauer wrote:
[...]
Well, if you're gonna make this a serious response, how about by implementing per-process memory limits?
I'm not the one who signs every message with a cute slogan. Of course I'm serious. And I have considered using limits and I know of the ulimit built-in for BASH. But that's really neither here nor there, because only rarely are these programs started via a command submitted to a shell. To be genuinely helpful, I need something with a wider scope than a limit set in a shell.
man bash, search for ulimit - presuming you're using bash. ...
--Danny, who doesn't set limits, largely because of laziness
Randall Schulz
Sat, 26 Feb 2005, by rschulz@sonic.net:
Danny,
On Saturday 26 February 2005 09:15, Danny Sauer wrote: [..]
Well, if you're gonna make this a serious response, how about by implementing per-process memory limits?
I'm not the one who signs every message with a cute slogan. Of course I'm serious.
And I have considered using limits and I know of the ulimit built-in for BASH. But that's really neither here nor there, because only rarely are these programs started via a command submitted to a shell. To be genuinely helpful, I need something with a wider scope than a limit set in a shell.
Then use pam_limits.so /usr/share/doc/packages/pam/modules/README.pam_limits Theo -- Theo v. Werkhoven Registered Linux user #99872 http://counter.li.org ICBM 52 13 26N, 4 29 47E. ICQ: 277217131 / Jabber: muadib@jabber.xs4all.nl / MSN: twe-msn@ferrets4me.xs4all.nl SUSE 9.2, Kernel 2.6.8. See headers for PGP/GPG info.
Theo, On Saturday 26 February 2005 13:55, Theo v. Werkhoven wrote:
[...]
And I have considered using limits and I know of the ulimit built-in for BASH. But that's really neither here nor there, because only rarely are these programs started via a command submitted to a shell. To be genuinely helpful, I need something with a wider scope than a limit set in a shell.
Then use pam_limits.so /usr/share/doc/packages/pam/modules/README.pam_limits
Theo
Thank you. That appears to be exactly what I need. Now all I have to do is come up with good numbers... Randall Schulz
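[Editor's note] As a starting point for those "good numbers": pam_limits reads /etc/security/limits.conf, and a fragment might look like the sketch below. The user name, group, and values are hypothetical; the address-space ("as") figures are in KB.

```
# /etc/security/limits.conf fragment, read by pam_limits.so at login.
# <domain>  <type>  <item>   <value>
rschulz     soft    as       786432    # soft address-space cap, ~768 MB
rschulz     hard    as       917504    # hard cap, ~896 MB
@users      soft    nproc    256       # per-group cap on process count
```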
On Saturday 26 February 2005 11:14 am, Randall R Schulz wrote:
[...]
I'm not the one who signs every message with a cute slogan. Of course I'm serious.
I'm glad you think they're cute. :)
And I have considered using limits and I know of the ulimit built-in for BASH. But that's really neither here nor there, because only rarely are these programs started via a command submitted to a shell. To be genuinely helpful, I need something with a wider scope than a limit set in a shell.
Really? How do you log in? On my system, the console is a shell. SSH logins are a shell. Common X sessions are all launched from shell scripts (startkde, gnome-session, .xinitrc, etc.). Ergo, all of the processes referred to are launched from within a shell at some level. Make that shell script set your limits, and you're set. Hint - most of those scripts don't use the --noprofile option to sh. Theo's suggestion of pam_limits is probably easier to implement (and gives the opportunity to play with PAM's real power, which is cool in and of itself), but sticking a ulimit line or two in /etc/profile will have a very similar effect. Sticking it in ~/.profile should work as well, since "most" of these sessions are run as the user. --Danny, who's sometimes serious even when one of these is present :)
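[Editor's note] Danny's /etc/profile route might look like this; the resources chosen and the numbers are illustrative:

```shell
# Fragment for /etc/profile (or ~/.profile). Login shells source it, and
# the X session scripts they launch inherit the limits. Values illustrative.
ulimit -S -f 2097152   # max file size in KB (~2 GB) - catches runaway writers too
ulimit -S -u 512       # max number of user processes
```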
Hello, Everything was working fine until I upgraded my DSL through Qwest. I run the DSL via Ethernet, and it should be working in SuSE as it did before, but it is not. What settings do you think I need to change? The funny thing is, when I run Limewire the connection is there, but nothing else works: Mozilla, SUSE update, Gaim. The connection is there and works great when I boot into Windows. Any ideas? Thanks, SuSE Noobie
On Sat, 26 Feb 2005 11:57:22 -0700 "William Westfall" <wgwestfall1@msn.com> wrote:
Hello,
Everything was working fine until I upgraded my DSL through Qwest. I run the DSL via Ethernet and it should be working in SuSE as it did before, but it is not. What settings do you think I need to change?
The funny thing is, when I run Limewire the connection is there, but nothing else works: Mozilla, SUSE update, Gaim. The connection is there and works great when I boot into Windows. Any ideas? Standard network stuff: First, make sure you can ping the gateway using its IP address. This ascertains that you can get out of the box. Next, ping a local address within your ISP if you know them, such as the DNS servers. If you can't, then check your routing table: netstat -nr.
Then make sure you have the correct DNS servers set up. You can do this via YaST or just edit /etc/resolv.conf. Many times the problem is simply that you don't have any DNS servers configured. -- Jerry Feldman <gaf@blu.org>
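[Editor's note] Jerry's checklist can be partly scripted. This sketch covers only the resolv.conf step; the check_dns function name is my own invention, not a SuSE tool:

```shell
# Does a resolv.conf actually name any DNS servers? A common failure after
# a modem swap is DHCP handing out bogus or empty nameserver entries.
check_dns() {
    conf="${1:-/etc/resolv.conf}"   # default path; pass another file to test
    if grep -q '^nameserver' "$conf" 2>/dev/null; then
        echo "DNS servers configured in $conf:"
        grep '^nameserver' "$conf"
    else
        echo "No nameserver entries in $conf - set them via YaST or edit the file"
    fi
}
```

This pattern also explains the Limewire symptom: raw-IP peer traffic works without DNS, while Mozilla, Gaim, and the update tools all need name resolution.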
Your thoughts got me thinking, and I found the answer in YaST by unchecking "update name servers and search list via DHCP". I entered the correct ones and it works great now. Funny that my new modem puts in bogus info when that is checked. Luckily I was able to access the modem's setup features and find the true name server addresses. Maybe it's a 64-bit issue? Thanks for the help. Made with Linux 2.6.8-24.11-default #1 x86_64 GNU/Linux (SuSE 9.2) William Westfall wgwestfall1@msn.com
On Sat, 26 Feb 2005 23:47:05 -0700 "WILLIAM WESTFALL" <wgwestfall1@msn.com> wrote:
Your thoughts got me thinking, and I found the answer in YaST by unchecking "update name servers and search list via DHCP". I entered the correct ones and it works great now. Funny that my new modem puts in bogus info when that is checked. Luckily I was able to access the modem's setup features and find the true name server addresses. Maybe it's a 64-bit issue? Could be, but I don't think so. I use static IPs on my home LAN because of backups, so I need to know what Comcast's name servers are, but I always keep a third one that is very stable. -- Jerry Feldman <gaf@blu.org>
participants (7)
- Danny Sauer
- Jerry Feldman
- Mike
- Randall R Schulz
- Theo v. Werkhoven
- William Westfall
- WILLIAM WESTFALL