Apache problem or systemd?
I have a directory for a copy of my web site (browser pointing to 127.1.1.0) where I test material before uploading to a commercial Hosted DNS server. Haven't done any edits for a while but tried one today ..only to run into this:

-------------------------------------------------------
# systemctl status apache2.service
● apache2.service - The Apache Webserver
     Loaded: loaded (/usr/lib/systemd/system/apache2.service; disabled; ... preset: disabled)
Aug 17 22:14:23 localhost.localdomain systemd[1]: Starting The Apache Webserver...
Aug 17 22:14:23 localhost.localdomain start_apache2[6217]: AH00558: httpd-prefork: Could not reliably determine the server's fully qualified domain name, using localhost.localdomain. Set the 'ServerName' directi>
Aug 17 22:14:23 localhost.localdomain systemd[1]: Started The Apache Webserver.
-------------------------------------------------------

Why would I need a "fully qualified domain name" under 127.0.0.1?

-----------------------------------
Access forbidden!
You don't have permission to access the requested directory. There is either no index document or the directory is read-protected.
If you think this is a server error, please contact the webmaster.
Error 403
------------------------------------

Lousy communications AGAIN, what directory? I requested 127.0.0.1 in my browser. BTW the entire tree to the directory is read-write by userMe, my primary group, and readable by all.
Thu, 17 Aug 2023 22:46:32 -0400 bent fender <ksusup@trixtar.org> :
I have a directory for a copy of my web site (browser pointing to 127.1.1.0) where I test
red pencil: 127.0.0.1
material before uploading to a commercial Hosted DNS server. Haven't done any edits for a while but tried one today ..only to run into this:
From: bent fender <ksusup@trixtar.org>
Date: Thu, 17 Aug 2023 22:46:32 -0400

   I have a directory for a copy of my web site (browser pointing to 127.1.1.0) where I test material before uploading to a commercial Hosted DNS server. Haven't done any edits for a while but tried one today ..only to run into this:

   [systemctl status apache2.service output as above; the key line:]
   AH00558: httpd-prefork: Could not reliably determine the server's fully qualified domain name, using localhost.localdomain. Set the 'ServerName' directi>

   Why would I need a "fully qualified domain name" under 127.0.0.1?

Hmm. I don't think it cares -- it wants one no matter what, because I think it expects to compare this against "Host:" headers, and doesn't know when it first starts up that you aren't going to need that. Here's what I found in the Apache docs [1]:

   If no ServerName is specified, the server attempts to deduce the client visible hostname by first asking the operating system for the system hostname, and if that fails, performing a reverse lookup on an IP address present on the system.

What I can't figure is why it doesn't give me the same error on my internal server. I'm not specifying a ServerName, but I'm also not binding to 127.0.0.1 (the server is on another machine and doesn't have a public address), so rDNS shouldn't work either. So what is it using, and where is it coming from?

   Access forbidden! You don't have permission to access the requested directory. There is either no index document or the directory is read-protected. If you think this is a server error, please contact the webmaster. Error 403

   Lousy communications AGAIN, what directory? I requested 127.0.0.1 in my browser. BTW the entire tree to the directory is read-write by userMe, my primary group, and readable by all.

Have you upgraded to a new release since you last tried this? Could you be missing a "Require all granted" or a "DirectoryIndex"? Did you try pointing the browser at the index document explicitly (or something else you know is there)? What does the Apache log say you asked for?

-- Bob Rogers http://www.rgrjr.com/

[1] https://httpd.apache.org/docs/2.4/mod/core.html#servername
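For readers landing here with the same two symptoms, the fixes Bob is hinting at look roughly like the following. This is a sketch only: the vhost and both paths are placeholders, not taken from the thread.

```apache
# Silences AH00558 on a loopback-only test box: give Apache an
# explicit name so it stops guessing via hostname/rDNS.
ServerName localhost

# Hypothetical test-site vhost; both paths are placeholders.
<VirtualHost 127.0.0.1:80>
    DocumentRoot "/srv/www/testsite"
    <Directory "/srv/www/testsite">
        # Apache 2.4 access control: without an explicit grant the
        # server answers 403 no matter what filesystem permissions say.
        Require all granted
        # A directory request with no index document is also a 403
        # when autoindexing is off.
        DirectoryIndex index.html
        Options -Indexes
    </Directory>
</VirtualHost>
```

On openSUSE the ServerName line can live in any file httpd includes at startup (e.g. something under /etc/apache2/conf.d/ -- check your own Include lines rather than trusting that path).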
Thu, 17 Aug 2023 22:00:25 -0700 Bob Rogers <rogers@rgrjr.com> : ...
Have you upgraded to a new release since you last tried this? Could you be missing a "Require all granted" or a "DirectoryIndex"? Did you try pointing the browser at the index document explicitly (or something else you know is there)? What does the Apache log say you asked for?
-- Bob Rogers http://www.rgrjr.com/
[1] https://httpd.apache.org/docs/2.4/mod/core.html#servername
If I do it under Leap-15.5 it works, pun intended as I get a page that says "It works!". I did a search on my pages and nothing contains "It works!" so it must be a (very useful) system message. Maybe it should say "System check: local server works!"

One of my major bad habits is thinking that all of humanity is locked into what OS I'm actually booted into; fortunately in this case the above confirmed that it was Tumbleweed. Even though Leap said "It works!" there were dirty fubar footprints like

<Directory "wrongOLDpath/cgi-bin">
    AllowOverride None
    Options +ExecCGI -Includes
    <IfModule !mod_access_compat.c>
        Require all granted
    </IfModule>
    <IfModule mod_access_compat.c>
        Order allow,deny
        Allow from all
    </IfModule>
</Directory>

and in more places than just one. I cleaned up in Leap and copied configs into TW: all is well. I should be ashamed but I'm not. However sincere apologies for having bothered the list with it.

--
The greatest threat of Artificial Intelligence will not come from Satan's machination but from humans, when sooner than later Political-Correctness will seem like a joke compared to AI-Correctness!
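For the record, the mod_access_compat guard in blocks like the one above is only needed while 2.2-style directives may still be in play; on a pure Apache 2.4 host the same block reduces to this (the path is still a placeholder):

```apache
<Directory "/srv/www/cgi-bin">
    AllowOverride None
    Options +ExecCGI -Includes
    # Apache 2.4 replacement for the 2.2 pair
    # "Order allow,deny" + "Allow from all".
    Require all granted
</Directory>
```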
From: bent fender <ksusup@trixtar.org>
Date: Fri, 18 Aug 2023 08:42:29 -0400

   Thu, 17 Aug 2023 22:00:25 -0700 Bob Rogers <rogers@rgrjr.com> : ...
Have you upgraded to a new release since you last tried this? Could you be missing a "Require all granted" or a "DirectoryIndex"? Did you try pointing the browser at the index document explicitly (or something else you know is there)? What does the Apache log say you asked for?
-- Bob Rogers http://www.rgrjr.com/
[1] https://httpd.apache.org/docs/2.4/mod/core.html#servername
   If I do it under Leap-15.5 it works, pun intended as I get a page that says "It works!". I did a search on my pages and nothing contains "It works!" so it must be a (very useful) system message. Maybe it should say "System check: local server works!"

Too verbose. That's not the Unix Philosophy. ;-}

   One of my major bad habits is thinking that all of humanity is locked into what OS I'm actually booted into, fortunately in this case the above confirmed that it was Tumbleweed. Even though Leap said "It works!" there were dirty fubar footprints like . . . and in more places than just one. I cleaned up in Leap and copied configs into TW: all is well. I should be ashamed but I'm not. However sincere apologies for having bothered the list with it.

No worries. I always have to be careful to copy the old config into the new installation, and I still screw it up often enough.

-- Bob
On 8/18/23 07:42, bent fender wrote:
I should be ashamed but I'm not. However sincere apologies for having bothered the list with it.
About as much fun as downgrading php 8.2 to php-legacy 8.1 for nextcloud and learning all the config files failed to provide access to the fcgid-bin directory, leading to an authz-core denial for all php pages....

Every article (99% of them) points to the "Allow from all" to "Require all granted" Apache 2.2 -> 2.4 config updates... Nah, it was the fact nothing provided access to fcgid-bin (instead of the normal cgi-bin) which some brilliant soul decided on to allow 8.1 and 8.2 to co-exist...

While not difficult, Apache configs, php-fpm, fastcgi or fcgid are all very involved with much fine-grained control. Even with the server set to debug log levels, the logs don't do a great job pointing out exactly where the access problem originates.

Long story short -- I feel your pain.

-- David C. Rankin, J.D.,P.E.
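For anyone chasing the same authz-core denial: the missing piece David describes is an access grant on the fcgid wrapper directory itself. A sketch, assuming mod_fcgid and a conventional layout -- the /srv/www/fcgid-bin path and the ScriptAlias are illustrative, not quoted from his config:

```apache
# Map URL space to the php-legacy wrapper directory (path assumed).
ScriptAlias /fcgid-bin/ "/srv/www/fcgid-bin/"

<Directory "/srv/www/fcgid-bin">
    SetHandler fcgid-script
    Options +ExecCGI
    # The part most 2.2->2.4 migration articles skip: without this
    # grant on fcgid-bin (not cgi-bin), authz_core denies every
    # .php request.
    Require all granted
</Directory>
```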
Sun, 20 Aug 2023 16:41:02 -0500 "David C. Rankin" <drankinatty@suddenlinkmail.com> :
On 8/18/23 07:42, bent fender wrote:
I should be ashamed but I'm not. However sincere apologies for having bothered the list with it.
About as much fun as downgrading php 8.2 to php-legacy 8.1 for nextcloud and learning all the config files failed to provide access to the fcgid-bin directory, leading to an authz-core denial for all php pages....
Every article (99% of them) all point to the "Allow from all" to "Require all granted" Apache 2.2 -> 2.4 config updates... Nah, it was the fact nothing provided access to fcgid-bin (instead of the normal cgi-bin) which some brilliant soul decided on to allow 8.1 and 8.2 to co-exist...
While not difficult, Apache configs, php-fpm, fastcgi or fcgid are all very involved with much fine grained control. Even with the server set to debug log levels, the logs don't do a great job pointing out exactly where the access problem originates.
Long story short -- I feel your pain.
-- David C. Rankin, J.D.,P.E.
oh yeah, now I feel better :-) But my situation is far worse, being that of a clueless amateur who needs apache ONLY to test amateur pages locally before making an ass of himself on the real commercial DNS name server. It doesn't help that I still think that a single bare-bones html page should do: <html>.</html>

I would never have gotten involved were it not for some equally clueless and amateur perl scripting that required fancy web-server features and their usually just as cryptic configs that remind me of a sudoers file multiplied. ALL of this could have been avoided by the availability of a single user home folder named Apache where no restrictions exist and all imaginable web server features just work 'on-demand'. Come to think of it there was maybe just such a thing a long time ago in Suse.

-- AvLinux Debian GNU/Linux 11 (bullseye), Kernel=6.4.9-1-liquorix-amd64 on x86_64, DM=Unknown, DE=XFCE, ST=x11,grub2, GPT, BIOS-boot
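The unrestricted per-user web folder remembered above sounds like Apache's mod_userdir, which serves ~/public_html as http://127.0.0.1/~you/ and which SUSE used to ship preconfigured. A sketch, assuming the module is loaded; the directory name and the wide-open options are illustrative for a loopback-only test box, not advice for a public server:

```apache
# Requires mod_userdir (LoadModule userdir_module ...).
UserDir public_html

<Directory "/home/*/public_html">
    # Deliberately permissive -- local testing only.
    AllowOverride All
    Options Indexes FollowSymLinks Includes ExecCGI
    Require all granted
</Directory>
```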
On 8/20/23 17:29, bent fender wrote:
   ALL of this could have been avoided by the availability of a single user home folder named Apache where no restrictions exist and all imaginable web server features just work 'on-demand'. Come to think of it there was maybe just such a thing a long time ago in Suse.
Yep,

Some 20 years ago, I had the bright idea to move my office to Linux. Mandrake had just imploded as it went public and I ended up with a boxed set of SuSE 7.0 Pro (code name "Air"). This list, before it was fragmented, got me through bind zone configs, dhcpd with dyn updates to bind, pptp dial-in internet access (remember the days...), postfix, apache, UW-imap (now using dovecot), eGroupWare and Advantfax (d. mimms from Hylafax) -- though faxes are now basically a thing of the past.

Somehow we ended up with a complete open-source replacement for the MS backoffice/exchange setup -- that ran rings around the MS offerings.

Fast forward 20 years and after hosting several law-offices and an engineering firm on our home-spun servers, the only pitfalls have been web-apps that have made the short-sighted decision to containerize their offerings, foreclosing a backwards-compatible migration path for earlier versions. Though, since the backend was always MySQL or Postgres, it's just a transition issue for nextcloud instead of a data-loss issue.

The other low-lights that impacted some very good web-projects were the PHP 5.6->7+ jump (just a man-hours issue for non-staffed large projects). The apache 2.2->2.4 config changes were a minor speedbump that always seemed to hit at the least opportune moment...

New or not, Apache, PHP and now JS, CSS, Ajax with all the simple interfaces for MySQL (mariaDB) and Postgres provide a fantastic set of tools that let you solve many, many issues. Whether that be indexed clips for family home videos, or full point-of-sale or groupware commercial solutions.

I'm not a webbie, but know enough to at least be dangerous at this point, and were I younger would probably put more effort into that area of tech. Mastering server setup and config for your purpose is no different than learning any other aspect of a Linux box. It just requires "butt-in-chair time with reference open" (the actual implementation of RTFM), and it helps to have an immediate need or setup goal to accomplish, or just general curiosity of "How does it work?"

It doesn't matter whether you pick apache or nginx as the base; the learning curve and setup is about the same. I run nginx on the PIs and apache on real hardware.

Another option going forward is just paying $4-20/mo. for a hosted server and letting somebody else worry about the hardware. Though that won't fly for professional hosting where confidentiality requirements are imposed (attorney-client, physician-patient, fiduciary-client, etc.) and a certification to the licensing board requires "no data is disclosed or subject to the inspection or control of 3rd parties". Don't tell me my medical records or work-product are secure on somebody else's site -- they aren't. Just ask any of the Fortune 500 whose data has been splashed all over the dark web.

So learning the finer points of hosting your own -- is time well spent.

-- David C. Rankin, J.D.,P.E.
Wed, 23 Aug 2023 20:25:46 -0500 "David C. Rankin" <drankinatty@suddenlinkmail.com> :
   Another option going forward is just paying $4-20/mo. for a hosted server and letting somebody else worry about the hardware. Though that won't fly for professional hosting where confidentiality requirements are imposed (attorney-client, physician-patient, fiduciary-client, etc.) and a certification to the licensing board requires "no data is disclosed or subject to the inspection or control of 3rd parties". ...

   So learning the finer points of hosting your own -- is time well spent.
Thank you for the detailed heads-up, I didn't know about the confidentiality risks with hosted servers! Do you mean that they are risky even if you do your own site maintenance, and that because of that you prefer your own physical servers in your own lockable bunker? I can relate to that because security begins with paranoia, but I had no idea it was that bad seeing that you PAY for the service.

My internet life (if that could be called a life at all) is shrinking instead of expanding. I did not buy another laptop after mine crapped out, I will not buy a smart phone being not smart enough, and I just closed my youtube account, ticked off about their content-claim policy glorifying remixing chiselers even at the expense of original producers (reminding one of the mp3 saga). My hosted web site is chickenshit in comparison to any other and somewhere along the line I'm going to shut it down too :-)

-- Oh Lord of the Keyrings on high, have I got bad news for you: the word trust is nowhere to be found in my security dictionary.
On 8/23/23 21:34, bent fender wrote:
Do you mean that they are risky even if you do your own site maintenance and that because of that you prefer your own physical servers in your own lockable bunker?
Yes,

From the bar standpoint or medical-board standpoint, if you are sued for malpractice for having disclosed confidential information, it's not an excuse to say "I paid somebody else to keep the data safe." The problem with any off-site hosting service is -- it's not yours. You have no control over who they hire to make sure their cloud remains running, transition data when new hardware comes online, maintain their software with updates (pypi supply-chain compromise? or whatever workflow they happen to use), etc.

The bottom line from a professional-responsibility standpoint is that you have an affirmative duty to safeguard confidential information and remain liable for any breach or inadvertent disclosure. Given the daily reports of companies, with far more resources than I'll ever have, having their sites compromised -- it's always made more sense to keep the data on-site, so if there was ever any question, I knew exactly who had access (physical and online) to the data on my servers.

That way, if anything ever hit the fan, I'd a lot rather face the music for something I actually did or failed to do, rather than it being a screw-up of somebody I have no control over but remain legally responsible for. Knock on wood, but so far, the fan has remained clean...

-- David C. Rankin, J.D.,P.E.
From: "David C. Rankin" <drankinatty@suddenlinkmail.com>
Date: Thu, 24 Aug 2023 01:13:30 -0500

On 8/23/23 21:34, bent fender wrote:
Do you mean that they are risky even if you do your own site maintenance and that because of that you prefer your own physical servers in your own lockable bunker?
   Yes, From the bar standpoint or medical-board standpoint, if you are sued for malpractice for having disclosed confidential information, it's not an excuse to say "I paid somebody else to keep the data safe" . . .

In fact, when someone signs up for (e.g.) a VPS, they usually have to agree to terms of service with one of those umbrella "We're not responsible for anything" clauses. So either it's your fault for not securing your VPS properly, or it's your fault for hiring someone else that failed to keep your VPS secure.

That said, Google has far more security personnel than any company I've ever worked for. (At my last company, it was just me.) But try telling that to the legal department.

-- Bob
participants (3)
- bent fender
- Bob Rogers
- David C. Rankin