25 Mar 2021 05:17
On 3/25/21 12:12 AM, David C. Rankin wrote:
On 3/23/21 8:52 PM, Carlos E. R. wrote:
What a lot of crap. How can they design pages that eat so many resources?
The browser programmers then have to load all this crap and make the page appear responsive while their page-load algorithms happily loop, realloc()'ing memory until all the crap has loaded. A good feature would be a virtual-memory limit per page or per site; it might slow the load a bit, but it would cap pages at 10M.
Thankfully, my worst offender is ~2.6M:

  3231 david  20  0 2626116 144420 100288 S 0.333 1.781  0:58.64 Web Content

Thank God for uBlock Origin and NoScript.... If a site doesn't work without all the cruft, it's a site I don't visit.

--
David C. Rankin, J.D., P.E.