On 2021-03-25 00:12:55 David C. Rankin wrote:
|On 3/23/21 8:52 PM, Carlos E. R. wrote:
|> what a lot of crap, how can they design those pages that eat so many
|> resources.
|
|You are singing to the choir.
|
|All the webbies out there (kids with crayons) are not programmers and have
|no concept of stack, heap, or .bss, .data, .rodata, or .text, or of
|transmission protocols, packets, etc... They have GUIs where they drag
|one widget to a position on a page, click "gradient fill" because they
|think it looks cool, add a few additional <div> statements for forms with
|drop-shadows, add 3 or 4 third-party sites to load javascript animations
|for the widgets on the forms, then add google-analytics, qualtrics,
|etc., and then type the all-important content:
|
|    "Hello World!"
|
|and save to the web-server.
|
|Then they wonder why in the hell it takes 2 Gig of virtual memory to load
|their "Hello World" page -- but they are not programmers, so they assume
|it is normal and move on to adding more content.....
|
|The browser programmers then have to load all this crap and make it
|appear responsive while their page-load algorithms happily loop
|realloc()'ing memory until all the crap has loaded. A good feature would
|be a virtual memory limit per-page or per-site that may slow the load a
|bit but would limit pages to 10M.
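
That per-page cap is roughly what setrlimit(2) already gives you
per-process on Linux. A minimal sketch, assuming a Linux host with glibc:
it caps the process's virtual address space at the 10M figure from the
post, then grows a buffer the way a page-load loop might until the cap
makes realloc() refuse. The 10 MiB number is from the post; note the cap
also counts libc's own mappings, so on some systems the first malloc may
already fail and a larger figure would be needed.

/* Cap virtual address space, then grow a buffer until the cap bites. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/resource.h>

int main(void)
{
    struct rlimit cap = { .rlim_cur = 10 * 1024 * 1024,   /* 10 MiB soft */
                          .rlim_max = 10 * 1024 * 1024 }; /* 10 MiB hard */
    if (setrlimit(RLIMIT_AS, &cap) != 0) {   /* lowering needs no privilege */
        perror("setrlimit");
        return 1;
    }

    size_t size = 4096;
    char *buf = malloc(size);
    if (!buf) {             /* cap may already be below libc's own usage */
        perror("malloc");
        return 1;
    }

    for (;;) {                       /* grow until the cap says no */
        char *tmp = realloc(buf, size * 2);
        if (!tmp) {                  /* cap reached -- stop loading crap */
            fprintf(stderr, "realloc refused at %zu bytes\n", size * 2);
            break;
        }
        buf = tmp;
        size *= 2;
        memset(buf, 0, size);        /* touch the pages so the usage is real */
    }

    printf("page held to %zu bytes\n", size);
    free(buf);
    return 0;
}

Since multi-process browsers already put each site's renderer in its own
process, an OS-level per-process limit like this would approximate the
per-site cap the post asks for.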
Some of those webbies have been given mandates to create the many, many
enterprise business web pages that look so very flashy but are
difficult-to-impossible to navigate, and that suck up our bandwidth
resources to provide little beyond frustration for their customers.

Leslie
--
openSUSE Leap 15.2 x86_64