On 25/03/2021 06.17, David C. Rankin wrote:
On 3/25/21 12:12 AM, David C. Rankin wrote:
On 3/23/21 8:52 PM, Carlos E. R. wrote:
What a lot of crap. How can they design pages that eat so many resources?
The browser programmers then have to load all this crap and make it appear responsive, so their page-load algorithms happily loop, realloc()'ing memory until all the crap has loaded. A good feature would be a per-page or per-site virtual memory limit; it might slow the load a bit, but it would cap pages at 10M.
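(As a rough illustration of that idea, not anything Firefox actually ships: on Linux the mechanism such a cap would lean on is setrlimit(RLIMIT_AS) applied to the content process before it starts. The 10M figure comes from the post above; the renderer path is a hypothetical placeholder. A minimal sketch:)

#include <stdio.h>
#include <stdlib.h>
#include <sys/resource.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t pid = fork();
    if (pid < 0) {
        perror("fork");
        return 1;
    }
    if (pid == 0) {
        /* Child: cap the virtual address space at 10 MiB, then run the
         * page-rendering worker. Any malloc()/realloc() beyond the cap
         * fails with ENOMEM instead of growing without bound. */
        struct rlimit cap = { .rlim_cur = 10UL * 1024 * 1024,
                              .rlim_max = 10UL * 1024 * 1024 };
        if (setrlimit(RLIMIT_AS, &cap) != 0) {
            perror("setrlimit");
            _exit(1);
        }
        /* Hypothetical worker binary, stands in for a browser content process. */
        execlp("/usr/bin/some-renderer", "some-renderer", (char *)NULL);
        perror("execlp");   /* only reached if exec fails */
        _exit(1);
    }

    int status;
    waitpid(pid, &status, 0);
    printf("worker exited with status %d\n", WEXITSTATUS(status));
    return 0;
}

(With multi-process browsers that run one content process per site, a per-process cap like this is roughly a per-site cap; the page just fails to load its extra cruft instead of eating the machine.)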
Thankfully, my worst offender is ~2.6M:
  PID USER      PR  NI    VIRT    RES    SHR S  %CPU  %MEM     TIME+ COMMAND
 3231 david     20   0 2626116 144420 100288 S 0.333 1.781   0:58.64 Web Content
Thank God for uBlock Origin and NoScript... If a site doesn't work without all the cruft, it's a site I don't visit.
I can't "not visit" sites; I need to find information wherever it is. Years ago (around 2000) I had to replace my computer with a bigger one because certain pages I needed would take minutes to load. Worse, I had to interact with them (job search pages) and wait for the results of my actions (up to half an hour). I could not add more RAM, so I had to buy a new, bigger and faster computer to cope. And this has happened more than once.

This computer has 32 gigs and I intend to buy another 32 - not because I need them now, but because I will need them in years to come, and by then the modules may no longer be available.

(If you want to see a tool that eats memory by the truckload, try to recover an XFS filesystem with lots of metadata. You may need terabytes. Swap can be useful just for that day.)

--
Cheers / Saludos,

Carlos E. R.
(from 15.2 x86_64 at Telcontar)