* Rich3800@aol.com
How about using a dynamic DNS service such as dyndns.org?
Tom Nelson indicates that he has a static ip address. What would he gain from dynamic DNS services? -- Patrick Shanahan Please avoid TOFU and trim >quotes< http://wahoo.no-ip.org Registered Linux User #207535 icq#173753138 @ http://counter.li.org Linux, a continuous *learning* experience
Hi, I just enabled the Apache web server on my computer, and it works perfectly. I followed up on the security issues: I enabled my firewall and also changed the port, etc. However, I want to make some files, photos, etc. from public_html in my home directory available on my web page, but when I try to open it from the web it is not available. How should I make this directory available read-only? Thanks Jose
On Sun, Jun 08, 2003 at 06:11:20PM -0700, Jose Sanchez wrote:
Hi,
I just enabled the Apache web server on my computer, and it works perfectly. I followed up on the security issues: I enabled my firewall and also changed the port, etc. However, I want to make some files, photos, etc. from public_html in my home directory available on my web page, but when I try to open it from the web it is not available. How should I make this directory available read-only?
Do you have HTTPD_SEC_PUBLIC_HTML=yes in /etc/sysconfig/apache? -- Robert C. Paulsen, Jr. robert@paulsenonline.net
* Jose Sanchez
I just enabled the Apache web server on my computer, and it works perfectly. I followed up on the security issues: I enabled my firewall and also changed the port, etc. However, I want to make some files, photos, etc. from public_html in my home directory available on my web page, but when I try to open it from the web it is not available. How should I make this directory available read-only?
Look at the Unofficial SuSE FAQ, available at http://susefaq.sourceforge.net. Answers to your questions are there, under "10. Web Servers: Apache Related Questions". In the future, please start your own thread instead of replying to a previous post and changing the subject. There has been much discussion of this, even in the last week. Thanks & good luck, -- Patrick Shanahan
Hi,
Look at the Unofficial SuSE FAQ, available at http://susefaq.sourceforge.net
Answers to your questions are there, under "10. Web Servers: Apache Related Questions"
Well, I just read this FAQ, but I can't find the answer to my problem, because I have everything enabled on my computer as it says, so I don't see what I am missing.
In the future, please start your own thread instead of replying to a previous post and changing the subject. There has been much discussion of this, even in the last week.
I just did that to get the mailing list's address; I didn't know it causes problems on this list. I really thought it creates a new email. Sorry about that.
Do you have
HTTPD_SEC_PUBLIC_HTML=yes
in
/etc/sysconfig/apache?
Yes, I have that also. I have no idea what I did wrong. Thanks Jose
On Sun, Jun 08, 2003 at 06:53:17PM -0700, Jose Sanchez wrote:
Do you have
HTTPD_SEC_PUBLIC_HTML=yes
in
/etc/sysconfig/apache?
Yes, I have that also. I have no idea what I did wrong.
Check the permissions on your home directory and on your public_html directory. Mine are both: drwxr-xr-x but I think the following will also work: drwx--x--x -- Robert C. Paulsen, Jr. robert@paulsenonline.net
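Robert's advice can be sanity-checked with a short sketch. This runs against a throwaway directory created with mktemp rather than the real /home/jose, so the paths are stand-ins; the point is the permission bits Apache's per-user lookup needs (search permission on the home directory, read plus search on public_html):

```shell
# Stand-ins for /home/jose and /home/jose/public_html; on a real box
# you would chmod the actual directories instead.
home=$(mktemp -d)
mkdir "$home/public_html"

chmod 711 "$home"              # drwx--x--x: others may traverse, not list
chmod 755 "$home/public_html"  # drwxr-xr-x: others may list and read

stat -c '%A' "$home"              # prints drwx--x--x
stat -c '%A' "$home/public_html"  # prints drwxr-xr-x
```

Either 755 or 711 works for the home directory itself; 711 is the tighter choice, since outsiders can traverse into public_html but cannot list whatever else lives in the home directory.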
By default (at least on a SuSE distribution) the httpd.conf file prevents the
use of symbolic links. Look for the following section in /etc/httpd/httpd.conf:
<Directory />
AuthUserFile /etc/httpd/passwd
AuthGroupFile /etc/httpd/group
Options -FollowSymLinks +Multiviews
AllowOverride None
</Directory>
Note the minus sign in front of "FollowSymLinks".
As another poster has said, it is not a good idea to change this (for security
reasons). It is best to rethink your configuration and come up with something
that doesn't require enabling "FollowSymLinks".
If you place your whole set of web pages in your personal public_html
directory, the default is to *allow* symlinks from there. Look for the following section in /etc/httpd/suse_public_html.conf:
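The section of suse_public_html.conf the poster refers to was not preserved in this archive. For reference, the stock Apache 1.3 per-user block that such files were derived from looked roughly like the following; this is reconstructed from the upstream Apache defaults, so treat every directive here as an assumption and check your own /etc/httpd/suse_public_html.conf:

```apache
<Directory /home/*/public_html>
    AllowOverride FileInfo AuthConfig Limit
    Options MultiViews Indexes SymLinksIfOwnerMatch IncludesNoExec
    <Limit GET POST OPTIONS PROPFIND>
        Order allow,deny
        Allow from all
    </Limit>
</Directory>
```

Note SymLinksIfOwnerMatch rather than a bare FollowSymLinks: symlinks under public_html are followed only when the link and its target have the same owner, which is how symlinks can be allowed there without opening the server-wide hole discussed above.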
Hi
Just a stupid idea: are your home directory, its public_html, and the files contained therein world-readable?
Check the permissions on your home directory and on your public_html directory. Mine are both: drwxr-xr-x
My public_html directory is like that, and I made it public by enabling the share permissions, so it is supposed to be public. I have no idea what is wrong. I just want to be able to download some photos, so I created a link on the web page to a zip file which is in /home/jose/public_html. I open the web page, try to open that link, and I can't. Jose
On Sun, Jun 08, 2003 at 07:22:14PM -0700 or thereabouts, Jose Sanchez wrote:
I have no idea, I just want to be able to download some photos, so I
What do your logs say? -- Gary BREAKFAST.COM Halted - Cereal port not responding.
On Monday 09 June 2003 04.22, Jose Sanchez wrote:
my public_html directory is like that
Your home directory should be too
I have no idea, I just want to be able to download some photos, so I created a link on the web page to that zip file which is in /home/jose/public_html. I open the web page, and I try to open that page, and I can't do it.
What does the link look like? It should be http://servername/~jose/filename.zip
* Jose Sanchez
Check the permissions on your home directory and on your public_html directory. Mine are both: drwxr-xr-x
My public_html directory is like that.
I have no idea, I just want to be able to download some photos, so I created a link on the web page to that zip file which is in /home/jose/public_html. I open the web page, and I try to open that page, and I can't do it.
How are you linking the page? I think it should be: http://your.web.address/~jose/index.html The address for your site (not your personal public page) would be: http://your.web.address/index.html Note: the 'index.html' may not be required but will be accepted. The way you have configured httpd may not allow links to a lower directory structure outside /home/jose/public_html/ or /srv/www/httpd/ -- Patrick Shanahan
Hi,
OK, let's begin again, because I am totally lost now. My directory for
the web pages is:
* Jose Sanchez
[06-08-03 19:23]: Check the permissions on your home directory and on your public_html
directory. mine are both: drwxr-xr-x
my public_html directory is like that
I have no idea, I just want to be able to download some photos, so I created a link on the web page to that zip file which is in /home/jose/public_html. I open the web page, and I try to open that page, and I can't do it.
How are you linking the page?? I think it should be: http://your.web.address/~jose/index.html
The address for your site (not your personal public page) would be: http://your.web.address/index.html
note: the 'index.html' may not be required but will be accepted.
The way you have configured httpd may not allow links to lower directory structure outside /home/jose/public_html/ or /srv/www/httpd/
On Sun, 08 Jun 2003 21:33:17 -0700
Jose Sanchez
Hi,
OK, let's begin again, because I am totally lost now. My directory for the web pages is:
In that directory I created a web page, index.html. Each time I run Mozilla on another computer and put in my IP address:port, that web page opens.
Now I want to link a file from public_html in my home directory (/home/jose/public_html), so I open Mozilla Composer and create a link to the file. So what should this link look like:
../../../home/jose/public_html/fotos.zip
This is really the wrong way to do it. Don't get into that habit, or you will really have a mess later.
Obviously, when I type localhost:11380 the web page appears, and when I click the link it works, because I am on the same computer. However, when I try from outside it returns:
The requested URL /home/jose/public_html/fotos.zip was not found on this server.
The URL from the outside should be: http://yourservername/~jose/fotos.zip If that doesn't work, post the error from the http logs. Maybe your servername isn't set up right in httpd.conf? It should work from this URL on your local machine: http://127.0.0.1/~jose/fotos.zip or http://localhost/~jose/fotos.zip -- use Perl; #powerful programmable prestidigitation
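The distinction zentara is drawing can be made concrete: mod_userdir rewrites only URL paths of the form /~user/..., mapping them onto that user's public_html, while a raw filesystem path in the URL is looked up under DocumentRoot instead, hence the "not found" error. A toy sketch of that mapping in shell (illustrative only; the real module is considerably more involved):

```shell
# Rough model of Apache's userdir URL-path translation.
userdir_map() {
    case "$1" in
        /~*/*)
            rest=${1#/~}                 # strip leading "/~"
            user=${rest%%/*}             # first path component = username
            file=${rest#*/}              # remainder = file under public_html
            printf '/home/%s/public_html/%s\n' "$user" "$file"
            ;;
        *)
            printf '%s\n' "(served from DocumentRoot)"
            ;;
    esac
}

userdir_map /~jose/fotos.zip                  # -> /home/jose/public_html/fotos.zip
userdir_map /home/jose/public_html/fotos.zip  # -> (served from DocumentRoot)
```

The second call shows exactly why Jose's link failed: the literal path /home/jose/public_html/fotos.zip is searched for under DocumentRoot, where no such file exists.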
OK, thanks, you were all right. I have to make the link as http://yourservername/~jose/fotos.zip and avoid using symbolic links as I did. Also, as the Apache FAQ says, it is better to have symbolic links deactivated for security reasons. Thanks a lot Jose zentara wrote:
On Sun, 08 Jun 2003 21:33:17 -0700 Jose Sanchez
wrote: Hi,
Ok, lets begin again, because I am totally lost now. My directory for the web pages is:
In that directory I created a web page, index.html. Each time I run Mozilla on another computer and put in my IP address:port, that web page opens.
Now I want to link a file from public_html in my home directory (/home/jose/public_html), so I open Mozilla Composer and create a link to the file. So what should this link look like:
../../../home/jose/public_html/fotos.zip
This is really the wrong way to do it. Don't get into that habit, or you will really have a mess later.
Obviously, when I type localhost:11380 the web page appears, and when I click the link it works, because I am on the same computer. However, when I try from outside it returns:
The requested URL /home/jose/public_html/fotos.zip was not found on this server.
The URL from the outside should be: http://yourservername/~jose/fotos.zip
If that doesn't work, post the error from the http logs.
Maybe your servername isn't setup right in httpd.conf? It should work from this URL on your local machine: http://127.0.0.1/~jose/fotos.zip or http://localhost/~jose/fotos.zip
Well, I did it in the following way: I created a symbolic link in /srv/www/htdocs, called "directory", pointing to /home/jose/public_html, and then I made the links in my web pages like /directory/filename, and it worked. I think there should be another way, but in the meanwhile it works. Jose PS: thanks all for the help
../~yourname/ links to your home public_html directory FX -- ______________________ Courtesy of SuSE Linux http://www.nibz.org
Ahhhh...I knew that title would make you look;-). Seriously though, is there a "standard" that most software developers in Linux follow with regard to where programs and files are installed on the disk? The reason I ask is because SuSE installs some of its various packages in places which are different from the software developer.

For example, SuSE installs Mozilla in "/opt/mozilla" whereas the non-RPM download from the Mozilla site defaults to "/usr/local/mozilla". The same holds true for OpenOffice. I upgraded to 1.0.3 and it installed in "/usr/local/OpenOffice.org 1.0.3". SuSE installed the earlier version in "/opt/OpenOffice 1.0.2". Evolution defaults to "/opt/gnome/share/evolution" (version 1.2.3 and below). Any beta software of Evolution from Ximian defaults to "/opt/gnome2/share/evolution".

These are just a few examples, and I'm sure there are many more. The reason I bring this up is because I have a difficult time upgrading various software programs (dependency issues and such) from the software maker and must wait for a SuSE RPM to come out before guaranteeing success. I've used other distros in the past (Red Hat, Mandrake, Lycoris, etc.) and never seemed to have anywhere near the problems I'm experiencing now when updating/upgrading. I prefer SuSE over all the others I've tried (been with it since 7.1), but this dependency thing is becoming a pain.

I guess to sum it all up: Does SuSE use its own "recipe" for installing software outside the realm (or conformity) of other distros, and is there an installation "standard" (even a loose one)? Many thanks for opinions and observations. Zach
I believe what you want to look at is LSB (Linux Standards Base) - Herman Zach Smith wrote:
Ahhhh...I knew that title would make you look;-). Seriously though, is there a "standard" that most software developers in Linux follow with regards to where programs and files are installed on the disk? The reason I ask is because SuSE installs some of its various packages in places which are different from the software developer.
For example, SuSE installs Mozilla in "/opt/mozilla" whereas the non-RPM download from the Mozilla site defaults to "/usr/local/mozilla". The same holds true for OpenOffice. I upgraded to 1.0.3 and it installed in "/usr/local/OpenOffice.org 1.0.3". SuSE installed the earlier version in "/opt/OpenOffice 1.0.2". Evolution defaults to "/opt/gnome/share/evolution" (version 1.2.3 and below). Any beta software of Evolution from Ximian defaults to "/opt/gnome2/share/evolution".
These are just a few examples and I'm sure there are many more. The reason I bring this up is because I have a difficult time upgrading various software programs (dependency issues and such) from the software maker and must wait for a SuSE RPM to come out before guaranteeing success. I've used other distros in the past (Red Hat, Mandrake, Lycoris, etc) and never seemed to have had near the problems I'm experiencing now when updating/upgrading. I prefer SuSE over all the others I've tried (been with it since 7.1) but this dependency thing is becoming a pain.
I guess to sum it all up: Does SuSE use its own "recipe" for installing software outside the realm (or conformity) of other distros and is there an installation "standard" (even a loose one)? Many thanks on opinions and observations.
Zach
* Zach Smith
Ahhhh...I knew that title would make you look;-). Seriously though, is
And *your* question has what to do with the thread "Enabling /home/*/public_html to be seen in my web page" to which you have responded? In the future, please start your own thread instead of replying to a previous post and changing the subject. There has been much discussion of this, even in the last week and today. Thank you for your consideration and understanding. -- Patrick Shanahan
On 09-Jun-03 Patrick Shanahan wrote:
And *your* question has what to do with the thread "Enabling /home/*/public_html to be seen in my web page" to which you have responded?
In the future, please start your own thread instead of replying to a previous post and changing the subject. There has been much discussion of this, even in the last week and today.
I think Patrick and others are being unduly harsh about Zach Smith's
alleged "hijacking" of a thread. When I received his message it had
simply the Subject:
[SLE] Does SuSE deviate from "standards"
However, from the headers in his message I see:
In-Reply-To: <3EE3EF56.30404@okstate.edu>
References:
On Mon, Jun 09, 2003 at 10:34:28AM +0100, Ted Harding wrote:
: What seems to have happened here is that Zach sent a new message to the
: list by "reply"ing to an existing one (perhaps to save typing/looking up
: the list address), then clearing everything he could see (including the
: subject) and creating a totally new message (or so it seemed), but not
: realising that hidden below the surface there might be headers that
: referenced other messages in a thread.
:
: Hence _some_ people whose mail reader can identify the thread from these
: headers (which presumes that they have already retained other messages
: from the thread) will get the impression that the thread has been changed.
:
: But in Zach's message and headers as received by me there is no mention
: of "Enabling /home/*/public_html to be seen in my web page". So presumably
: anyone who saw that in the "Subject:" line as received by them had it
: put in there by their own system.
:
: Right or wrong ... ?
I'm guessing that many of the people who took issue (and rightfully so) were using mutt or MH, which correctly track threads via the "References" header and the Message-IDs it contains. The text of the subject may have been changed, but for all intents and purposes the mail still "said" that it was part of a previous thread. For further clarification, I suggest checking out RFC 2822. --Jerry -- Open-Source software isn't a matter of life or death... ...It's much more important than that!
On Monday 09 June 2003 12.23, Jerry A! wrote:
I'm guessing that many of the people that had issue (and rightfully so) were using mutt or MH,
A lot of MUAs track threads. My mailers of choice, for example kmail and evolution, do it extremely well. My main problem is that I think complaints should be sent off-list; there is no need to waste bandwidth with Yet Another Thread About Netiquette. An off-list nudge to the original poster is more than enough. Secondly, a much bigger problem for me is when people break threads by using mailers that strip out the In-Reply-To and References headers.
which correctly track via "References" and the Message-Id's that the header contains. The text of the subject may have been changed, but for all intents and purposes, the mail still "said" that it was part of a previous thread.
For further clarification, I suggest checking out RFC 2822.
RFC 2822 is not yet accepted as a full Internet Standard; it is still on the standards track. The officially accepted standard for email is still RFC 822.
On Sunday 08 June 2003 20:15, Zach Smith wrote:
Ahhhh...I knew that title would make you look;-). Seriously though, is there a "standard" that most software developers in Linux follow with regards to where programs and files are installed on the disk? The reason I ask is because SuSE installs some of its various packages in places which are different from the software developer.
For example, SuSE installs Mozilla in "/opt/mozilla" whereas the non-RPM download from the Mozilla site defaults to "/usr/local/mozilla". The same holds true for OpenOffice. I upgraded to 1.0.3 and it installed in "/usr/local/OpenOffice.org 1.0.3". SuSE installed the earlier version in "/opt/OpenOffice 1.0.2". Evolution defaults to "/opt/gnome/share/evolution" (version 1.2.3 and below). Any beta software of Evolution from Ximian defaults to "/opt/gnome2/share/evolution".
These are just a few examples and I'm sure there are many more. The reason I bring this up is because I have a difficult time upgrading various software programs (dependency issues and such) from the software maker and must wait for a SuSE RPM to come out before guaranteeing success. I've used other distros in the past (Red Hat, Mandrake, Lycoris, etc) and never seemed to have had near the problems I'm experiencing now when updating/upgrading. I prefer SuSE over all the others I've tried (been with it since 7.1) but this dependency thing is becoming a pain.
I guess to sum it all up: Does SuSE use its own "recipe" for installing software outside the realm (or conformity) of other distros and is there an installation "standard" (even a loose one)? Many thanks on opinions and observations.
Zach
You really shouldn't post a message (hijack) under another topic. I am assuming that you did this by accident. ;) Anyway, the standard you are referring to is the LSB. It is an attempt to standardize the porting of software to Linux. This includes --prefix= switches and such. Personally, I think that SuSE has it right. Here's my take on it:
/bin and /usr/bin are for "seasoned" binaries........no betas!
/sbin and /usr/sbin are for well-tested administrative binaries
/usr/local/bin and /usr/local/sbin should be for alpha, beta, and user-compiled binaries
/opt should be for extraneous binaries...anything that is just "eye-candy" should go here!
However, I thought I had read in some documentation that SuSE "was" LSB compliant. I have never read the LSB completely...just snippets. Anyone know about this? Fact or fiction?!! ;) - -- Thomas Jones Linux-Howtos Network Administrator OpenGPG Key: 0x6A3DF6E9
2) /usr/local is from Unix history and was designated the place where users put their stuff. So when you compile your own app to be shared with all the other people on your system, you put it in the /usr/local tree. This way the admin could secure the rest of the system and still allow users the freedom to bring in their favorite whiz-bang gadget.

Originally there was no /opt, so the vendors put everything into /bin, /lib, and so on. Then people started realizing we need to separate 'optional' stuff from 'mandatory & necessary to run the OS' stuff. So they created an /opt tree, which was the place for the sysadmin to put 'unique to this box or function' software. In a server farm, this is very useful.

I can't speak for SuSE, but their decision to put non-mandatory stuff into /opt conforms to a philosophy that's been around for a long time. The idea of keeping versions separated by using a version identifier is also not new. That allows you to migrate users with an appropriate training period and a beta/superuser community, and allows quick reversal in a corporate environment; again, I applaud that.

No, I don't see this as outside the realm of 'standard'; also look at the Linux Standards Base. I do see a lot of the other distros not bothering with the administrative requirements (and associated headaches) in order to keep things 'as simple as M$'.

and 1) Hijacking threads (replying to an existing message and changing the title) is considered impolite, makes a mess for those who use threaded readers, and makes subject-oriented indexing a near impossibility. Please don't do this.
On Monday 09 June 2003 03.15, Zach Smith wrote:
Ahhhh...I knew that title would make you look;-). Seriously though, is there a "standard" that most software developers in Linux follow with regards to where programs and files are installed on the disk?
Yes, it's called the Filesystem Hierarchy Standard. You can find it at http://www.pathname.com/fhs/
The reason I ask is because SuSE installs some of its various packages in places which are different from the software developer.
This is because the FHS specifies different directories for distributors and for software the user installs on his/her own.
For example, SuSE installs Mozilla in "/opt/mozilla" whereas the non-RPM download from the Mozilla site defaults to "/usr/local/mozilla". The same holds true for OpenOffice. I upgraded to 1.0.3 and it installed in "/usr/local/OpenOffice.org 1.0.3". SuSE installed the earlier version in "/opt/OpenOffice 1.0.2".
/usr/local and /opt are places defined in the FHS as available for "local installations", i.e. installations of software you didn't get from your distributor. The FHS says that a system upgrade may wipe any software on any part of the disk, but is not allowed to touch things installed in /usr/local or /opt. For the exact wording (I'm a little inexact here) look at the standards document. The basic point is that if you want to keep the software across a system upgrade, you should put it in the directories reserved for local installs.
Evolution defaults to "/opt/gnome/share/evolution" (version 1.2.3 and below). Any beta software of evolution from Ximian defaults to "/opt/gnome2/share/evolution".
Well, evolution 1.2.x was gnome and evolution 1.3.x (the betas) are gnome2 :)
The 03.06.08 at 20:15, Zach Smith wrote:
Ahhhh...I knew that title would make you look;-). Seriously though, is
Please, next time, don't hijack a thread.
For example, SuSE installs Mozilla in "/opt/mozilla" whereas the non-RPM download from the Mozilla site defaults to "/usr/local/mozilla". The
The reason is that SuSE makes a distribution, and we do not :-) I mean, packages that are not provided by the distributor, i.e. compiled by ourselves, are put somewhere under /usr/local. Packages provided by the distributor never go there. -- Cheers, Carlos Robinson
The 03.06.08 at 18:11, Jose Sanchez wrote:
However, I want to make some files, photos, etc. from public_html in my home directory available on my web page, but when I try to open it from the web it is not available. How should I make this directory available read-only?
I prefer to use a world-readable directory, like "/srv/www/htdocs/localusers/",
for users, symlinked from "/home/user/public_html", instead. Also, edit
"/etc/httpd/suse_public_html.conf" and change the line:
Hi,
I prefer to use a directory world readable, like "/srv/www/htdocs/localusers/" for users, symlinked from "/home/user/public_html" instead. Also, edit "/etc/httpd/suse_public_html.conf" and change the line:
for
OK, thanks for the advice, I will try all these new ideas.
But, please, the next time, don't hijack a thread: create your own one.
Yes, as I explained before, it was a mistake and will not happen again. Sorry about that Jose
On Sunday 08 June 2003 2:43 pm, Patrick Shanahan wrote:
* Rich3800@aol.com
[06-08-03 16:41]: How about using a dynamic DNS service such as dyndns.org?
Tom Nelson indicates that he has a static ip address. What would he gain from dynamic DNS services?
Ummm, it's free? Hi, I'm the other "Tom", and this is exactly what I'm doing: the "osnut..." domain is actually part of the dyndns.org domain list [homelinux.net], and I *do* have a static address. I suppose I could set up my own DNS server and all that, but (a) that is a hassle, and (b) you usually have to have two DIFFERENT IPs [preferably from different netblocks as well] to fully register your own domain "directly". -- Yet another Blog: http://osnut.homelinux.net
On Tuesday 10 June 2003 01:40, Tom Emerson wrote:
Hi, I'm the other "Tom", and this is exactly what I'm doing -- the "osnut..." domain is actually a part of the dyndns.org domain list [homelinux.net], and I *do* have a static address. I suppose I could set up my own DNS server and all that, but (a) that is a hassle, and
It isn't necessary. I work for a web host/design company. While we prefer to host the sites for our customers, we have no problem providing just DNS for them if they want to host their site in-house.
(b) you usually have to have two DIFFERENT IP's [preferrably from different netblocks as well] to fully register your own domain "directly".
Not sure what you mean, here. If you mean having a primary and secondary nameserver, it's more a really good idea than a requirement. You don't even need to have them in different netblocks. -- Scott Jones (scott at exti dot net)
Tom Emerson
On Sunday 08 June 2003 2:43 pm, Patrick Shanahan wrote: Tom Nelson indicates that he has a static ip address. What would he
gain from dynamic DNS services?
Ummm, it's free?
Hi, I'm the other "Tom", and this is exactly what I'm doing: the "osnut..." domain is actually part of the dyndns.org domain list [homelinux.net], and I *do* have a static address. I suppose I could set up my own DNS server and all that, but (a) that is a hassle, and (b) you usually have to have two DIFFERENT IPs [preferably from different netblocks as well] to fully register your own domain "directly".
You don't need your own DNS server; you should be able to use the nameservers you use now. Many of us would give up our morning coffee for a static IP address. I can't imagine throwing one away by inserting a dynamic IP in the path. A couple of years ago I put up a web server on my machine (dynamic IP) using a script to upload my current IP to a page at my ISP which did nothing but redirect page hits to the current IP on my machine. It worked well. -rex
On Tuesday 10 June 2003 8:40 am, rex wrote:
Tom Emerson
[2003-06-10 07:26]: On Sunday 08 June 2003 2:43 pm, Patrick Shanahan wrote: Tom Nelson indicates that he has a static ip address. What would he
gain from dynamic DNS services?
Ummm, it's free?
[...] You don't need your own DNS server; you should be able to use the nameservers you use now.
Well, yes, I suppose I could ask pacbell to insert the appropriate records in their DNS servers to point some arbitrary domain to my IP address, but they want $100 to do that, and $50 for each and every little change that comes along...
Many of us would give up our morning coffee for a static IP address. I can't imagine throwing one away by inserting a dynamic IP in the path.
dyndns.org works just fine with static addresses (slightly better, actually, in that my address doesn't "expire" from their database; if I had a dynamic address that simply never changed, I'd still have to upload the fact that it hasn't changed every few days). If what you're balking at is having something like "homelinux" in the middle of my "domain", well, I'm just being "cheap" [unemployment does that to you ;)]. When I'm again "flush with cash", I'll probably donate some money their way. (Note also that for a small fee, much smaller than what pacbell wants, they will do a full-on "yourname.tld" style domain, should I ever feel the need...)
A couple of years ago I put up a web server on my machine (dynamic IP) using a script to upload my current IP to a page at my ISP which did nothing but redirect page hits to the current IP on my machine. It worked well.
Who is (was) the ISP? Most ISPs nowadays have a hissy fit if you even hint at "running a server"; this one apparently helped you along. -- Yet another Blog: http://osnut.homelinux.net
On Tue, 10 Jun 2003, Tom Emerson wrote:
dyndns.org works just fine with static addresses (slightly better, actually, in that my address doesn't "expire" from their database -- if I had a dynamic address that simply never changed, I'd still have to upload the fact it hasn't changed every few days) If what you're balking about is having something like "homelinux" in the middle of my "domain", well, I'm just being "cheap" [unemployment does that to you ;) ] when I'm again "flush with cash", I'll probably donate some money their way. (note also that for a small fee, much smaller than what pacbell wants, they will do a full on "yourname.tld" style domain -- should I ever feel the need...)
A lot of these services also support the inclusive namespace. I don't know about dyndns.org, but it is the case with no-ip.org, which is basically the same service. They also offer dynamic DNS for free. And if you want, all you have to do is visit www.zoneedit.com, which provides free DNS for domain.tld, including some in the inclusive namespace like the .god TLD we manage. What you do to use your own domain for free is set up DNS at a service like zoneedit, then CNAME your web server www.domain.tld to your dynamic.domain.tld host name. cheers joe Joe Baptista - only at www.baptista.god "Be assured. Baghdad is safe, protected" ... Muhammed Saeed al-Sahaf former Iraqi Information Minister
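Joe's free-domain recipe amounts to a single CNAME record at the DNS host. A hypothetical zone fragment (all names here are placeholders, not real records):

```zone
; At the free DNS service (e.g. zoneedit) for your registered domain:
www.example.com.   IN  CNAME  myhost.homelinux.net.
; Resolvers looking up www.example.com now follow the dynamic-DNS name,
; which in turn tracks the home machine's current address.
```

The registered domain stays stable while the dynamic-DNS provider handles the address updates behind the CNAME.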
Tom Emerson
On Tuesday 10 June 2003 8:40 am, rex wrote:
dyndns.org works just fine with static addresses (slightly better, actually, in that my address doesn't "expire" from their database -- if I had a dynamic address that simply never changed, I'd still have to upload the fact it hasn't changed every few days) If what you're balking about is having something like "homelinux" in the middle of my "domain", well, I'm just being "cheap" [unemployment does that to you ;) ] when I'm again "flush with cash", I'll probably donate some money their way. (note also that for a small fee, much smaller than what pacbell wants, they will do a full on "yourname.tld" style domain -- should I ever feel the need...)
Have you looked at www.mydomain.com ? They do free email & URL redirection, and you can get "yourname.tld" from them for about $10/yr. They might let you specify your fixed IP instead of a URL. Failing that, you could redirect to the page your ISP provides and redirect from there to your box. With a static IP address there would be no need to resort to something like the script I used to automatically update the page at each dialup.
A couple of years ago I put up a web server on my machine (dynamic IP) using a script to upload my current IP to a page at my ISP which did nothing but redirect page hits to the current IP on my machine. It worked well.
Who is (was) the ISP? Most ISPs nowadays have a hissy fit if you even hint at "running a server"; this one apparently helped you along.
They went out of business some time ago. :( However, there was no help from them involved, other than that they provided a free web page to each user which I used to redirect hits to my machine. They might have objected if the traffic level had become significant. As it was just a test to see if I could create a dynamic page using PHP & Octave (to do some mathematical calculations), the traffic was insignificant. -rex
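The redirect trick rex describes can be sketched in a few lines of shell. Everything here is hypothetical: how you detect the current IP and how you upload the page to the ISP are both site-specific and left out, so only the page generation is shown (rex's original used PHP and Octave; this stand-in emits a plain meta-refresh page, and the IP is a documentation address):

```shell
# Generate a tiny page that bounces visitors to the machine's
# current address. usage: make_redirect <ip> <outfile>
make_redirect() {
cat > "$2" <<EOF
<html><head>
<meta http-equiv="refresh" content="0; url=http://$1/">
</head><body>Redirecting to http://$1/ ...</body></html>
EOF
}

page=$(mktemp)
make_redirect 198.51.100.7 "$page"
grep 'url=' "$page"
```

A cron job would re-detect the IP and re-upload the page whenever it changes; with a static address, as rex notes, none of this machinery is needed at all.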
participants (19)
- Anders Johansson
- Carlos E. R.
- FX Fraipont
- gary
- Hans Forbrich
- Herman Knief
- Jerry A!
- Joe Baptista
- Jose Sanchez
- Patrick Shanahan
- rex
- Rich3800@aol.com
- Robert C. Paulsen Jr.
- Scott Jones
- Ted.Harding@nessie.mcc.ac.uk
- Thomas Jones
- Tom Emerson
- Zach Smith
- zentara