Hello,
On Fri, 18 Aug 2017, Carlos E. R. wrote:
On 2017-08-18 21:56, Marc Chamberlin wrote:
I am not sure whether I should ask these questions here or on the developers mailing list, but I will start here and move over to the other list if told that would be more appropriate. (I don't want to bother the developers with beginner's questions.)
I am attempting to build Bacula (a backup server) from source code, and in order to do so, the build configuration for Bacula requires that I give it the path to the source code files for mysql (or, in the case of openSuSE, mariadb). I figured out that I could use zypper si mariadb to fetch the source code from the openSuSE repositories. (For some reason YaST is unable to fetch source code packages, which I posted an earlier query about.)

Er... wait. You do not need the source code of mysql for this; what you need is a package that could be named mysql-devel or mariadb-devel or something like that. It should contain the header files that you need, in the proper directories for the Bacula configure script to find them.
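(A minimal sketch of that route; the exact -devel package name on Leap 42.3 is a guess on my part, so check with zypper search first:)

$ zypper se -s 'mysql*devel' 'mariadb*devel'   # list the candidate -devel packages
$ sudo zypper in libmysqlclient-devel          # or mariadb-devel, whatever the search turns up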
Try that way, and undo the other things you did ;-)

On 08/18/2017 03:48 PM, David Haller wrote:
Right! Marc, just install the needed -devel packages, and all will be fine.

Thanks for the responses Carlos, Dave Howorth, and David Haller. Normally I would agree with all of you that all I should need to compile Bacula is to install the needed -devel packages, and in fact that is exactly what I tried in my first attempts. That is the normal modus operandi. But I kept getting errors when I ran either the Bacula configure script or the Bacula make, and on investigation I discovered that Bacula's build process wants all the source .c files for mysql and is indeed trying to compile them as part of the build process for Bacula. I too was surprised by this. The -devel packages alone are insufficient; in fact the Bacula documentation says that I have to download the source code files for mysql, install them somewhere, and then tell the Bacula configure where they were installed. I don't want the source code to be inconsistent with what openSuSE installs for mariadb, so I am trying to install the source files in the YaST/zypper way.

Question - if I have to build mariadb from source, is there documentation or instructions on how to do so for Leap 42.3? I am finding generic instructions, but I am not sure they are applicable to the way the source code directories are laid out after I expanded the tar.gz file I downloaded from the openSuSE repository.

David Haller - I am totally impressed that you made such an effort and took the time to try and explain autotools to me. It will take me a while to grok everything you said, but you certainly have me intrigued! I am not sure where you are headed, though. Are you saying that mariadb/mysql was built using autotools? From my preliminary glance at the source code it seems to me that it was built using cmake, but my eyes are not trained or up to speed on C/C++ build processes any more, so take that comment with a chunk of salt! I guess another way to ask my question is - are you inferring that, because there are .h.in files in the mariadb source code directories, mariadb was built using autotools? Or are you saying that I should try to use autotools to create a build process for Bacula?

Thanks again for all your helpful suggestions, Marc...
But about those .in, .am or even the .in.in files (usually in the po/ subdir), it goes basically like this:
autoheader generates config.h.in from stuff in configure.ac,
automake and friends generate the .in files from the .am files,
autoconf generates configure (and .in from .in.in, IIRC) from configure.ac,
and configure generates the Makefiles, config.h and whatnot from the .in files.
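Spelled out as the individual tools (autoreconf just runs them all in the right order), it's roughly:

$ aclocal                  # collect the m4 macros configure.ac uses into aclocal.m4
$ autoheader               # configure.ac -> config.h.in
$ automake --add-missing   # Makefile.am -> Makefile.in (plus helper scripts)
$ autoconf                 # configure.ac (+ aclocal.m4) -> configure
$ ./configure              # all the *.in -> Makefile, config.h, ...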
Actually, for once, I've no idea what the .in stands for. It is a template though, where configure usually replaces @FOO@ with the expansion of the variable concerning foo, as it has determined it while running. In essence, it's just a big
[..]
ac_cv_FOO=foo
[..]
ac_cv_BAR=bar
[..]
sed -e "s/@FOO@/${ac_cv_FOO}/" -e "s/@BAR@/${ac_cv_BAR}/" ...
at the end of the configure script running over all AC_OUTPUT files.
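To make that concrete, a toy example (not from any real package) of what such a template and its output look like:

 === Makefile.in (template, written by automake) ===
CC = @CC@
CFLAGS = @CFLAGS@
prefix = @prefix@
 === Makefile (what configure writes out) ===
CC = gcc
CFLAGS = -g -O2
prefix = /usr/local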
All that gets put into config.h (via config.h.in) is derived from configure.ac by autoheader, as that is where you check for specific headers with the AC_CHECK_HEADER*/AC_HEADER_* macros.
So, what starts out as, e.g., three few-line files (configure.ac / Makefile.am / the actual source) can generate a _self-contained_ build supporting the classic
configure && make && make install
Heck, even a make dist works and produces a hello-0.1.tar.gz in the case shown below.
E.g.:
$ ls -l
-rw-r----- 1 dh dh 21 Aug 18 22:44 Makefile.am
-rw-r----- 1 dh dh 85 Aug 18 22:47 configure.ac
-rw-r----- 1 dh dh 77 Aug 18 22:42 hello.c

$ for f in *; do echo " === $f ==="; cat "$f"; done; echo " ==="
 === Makefile.am ===
bin_PROGRAMS = hello
 === configure.ac ===
AC_INIT([hello], [0.1])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
AC_OUTPUT(Makefile)
 === hello.c ===
#include <stdio.h>
int main(void) { printf("Hello World\n"); return 0; }
 ===
Jep, 5 lines of "build system"... Were there more source files to build, with the *.o then linked together, I'd just add _one_ extra line to Makefile.am:
hello_OBJS := $(patsubst %.c, %.o, $(wildcard *.c))
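(As an aside, the more textbook automake spelling is an explicit source list, since automake itself wants literal file names; the extra file names here are of course made up:)

hello_SOURCES = hello.c util.c parser.c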
In both cases, follow it by
$ autoreconf -fi
$ ./configure
$ make
and check it out:
$ ./hello
Hello World
$ make dist
$ tar tzf hello-0.1.tar.gz
's all as if by Magic ;)) Yep, 5-6 lines of "code" (4 "m4" macros and 1-2 make variables) is all it takes. Compare that to cmake and whatnot...
Usually, when you see a "huge" configure.ac, they're almost always reinventing the wheel somewhere, usually as something more like an octagon or something even cruder, as seen in the Flintstones cartoons... Basic rule: big configure.ac -> bad configure.ac.
Using the intended macros for finding stuff from the package via aclocal? Using the pkg-config macros? Naahh, NIH!!!!! .... they're too hip for that.
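Just to illustrate how little that takes when you do use them (library name made up):

# configure.ac
PKG_CHECK_MODULES([LIBFOO], [libfoo >= 1.2])

# Makefile.am
AM_CFLAGS   = $(LIBFOO_CFLAGS)
hello_LDADD = $(LIBFOO_LIBS)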
cmake-itis is what springs to my mind at that: each package has its own "FindFoo.cmake" bullshit, which usually does not work (too old or too new), be it on current Leap 42.2, on oS 12.1 or on bleeding-edge Gentoo. *GAH*. It only works occasionally.
Those who do not understand autotools are doomed to reinvent it. Badly... and have been doing so. That has never been truer than with all those newfangled build systems. Well, ok, I'm glad imake/xmkmf is dying out, that was crude and inflexible[0]. But apart from that? *bwah*
Yes, autotools can be a PITA, but, used right and as intended, it's very efficient and manageable, esp. for "downstream" packagers, who just need to call configure with an option, export a variable like CFLAGS for configure (or make), or some such. It's easy for downstream to mangle if need be, at least as easy as a plain Makefile would be. A clean autotools setup is much more manageable than a convoluted Makefile forest. Darn, what's that project I dug through recently? Just plain Makefiles, but with included files, some generated, some not, scattered all over the tree... Probably some Mozilla and/or LibreOffice or something else ;)
Anyway, when autotools is used correctly (e.g. using predefined or locally defined macros via aclocal), it is very efficient (for the dev and the packager; not sooo much for the user, as configure does take a bit of time firing up all those external processes to check for stuff ;)
The only real improvement I can come up with, though, is how "AC_OUTPUT()" can be defined; e.g. I miss wildcarding there (haven't looked into that for about a decade though). Then again, that list can be generated if need be via something like "find . -type f \( -name '*.in' -o -name '*.am' \) ..."
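For reference, the usual explicit-list spelling is something like this (file names purely for illustration):

AC_CONFIG_FILES([Makefile src/Makefile po/Makefile.in])
AC_OUTPUT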
Actually, to be "über"-correct, I should have tested for stdio.h availability and printf in the above minimal sample. The former would be done via AC_CHECK_HEADER, and the result of that check would land in config.h (via config.h.in).
Just try it out ;) For functions it is, not surprisingly, AC_CHECK_FUNC.
So, two more lines, but for such basic stuff as stdio.h and printf?
Hey, if _THAT_ is missing, the user has quite different problems. Well, maybe check the header, as the user might not have the -devel package installed etc.
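Roughly like so, using the plural variants AC_CHECK_HEADERS/AC_CHECK_FUNCS, which also put the HAVE_* defines into config.h (a sketch, not tested against the sample above):

# in configure.ac, after AC_PROG_CC
AC_CONFIG_HEADERS([config.h])   # so the results land in config.h instead of on the compiler command line
AC_CHECK_HEADERS([stdio.h])
AC_CHECK_FUNCS([printf])

# after ./configure, config.h then contains (among other things):
#define HAVE_STDIO_H 1
#define HAVE_PRINTF 1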
All AC_* documented in 'info autoconf', all AM_* in 'info automake', generally, the autotools use info. Use any info-reader like pinfo, emacs, tkinfo, konqueror, whatever.
And with cmake, I've been forced to patch "Module" files in /usr/{share,lib*}/cmake/modules or wherever it is, or vice versa to prune/patch package-supplied .cmake module files. It's just a mess from the get-go. With autotools, the worst case is "patch and run aclocal" (the "local" in the name, could that have a reason?? ;) (or just autoreconf to catch it all). And with autotools, you can patch _any_ stage. I've e.g. _never_ had to touch any system aclocal files. Ever. And don't get me started on qmake.
With autotools, all you'd possibly need to patch is _inside_ the target-source tarball. End of story. At times it takes a bit of digging (use mc for stuff like that!), but often it's just as trivial as exporting a variable before calling configure or make.
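A typical downstream sequence of that kind might look like this (package and patch names made up, obviously):

$ tar xf foo-1.2.tar.gz && cd foo-1.2
$ patch -p1 < ../fix-configure-ac.patch    # touch whatever stage needs touching...
$ autoreconf -fi                           # ...and regenerate from there
$ CFLAGS="-O2 -g" ./configure --prefix=/usr
$ make && make install DESTDIR=$PWD/inst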
Anyway: I understand _everybody_ who does not like autotools. But then I ask: what are the alternatives? And as both a (small-stuff, private-use) developer and a seasoned packager, I hate autotools the least. On a scale from 1 to 10 of "hate", I'd say autotools is about a 2, Perl's ExtUtils::MakeMaker and Build about a 1.75, but everything else well exceeds a 6. And cmake is right at the top, at, say, 9.5, with the .5 for the fancy progress-in-percent feature. imake/xmkmf would be an 8-ish, qmake a flat-out 10 ... You get the gist. Oh, and that über-newfangled stuff barging in along with nodejs and other crap?
That stuff would make the scale explode (20+ minimum). Just have a look at what "gradle" needs to just run ... And using nodejs is basically bending over in prison and shouting "atta boys!", without even the "dropping the soap" act... But if you like that kind of stuff, be my guest, but leave me out of it (and that's not because I can't dig "kinky"... You might consider stuff I do as quite kinky. But I'm not that kinky. Or not that kind of kinky. ;)
-dnh, building packages for well over 15 years, locally and (once it became available) in the OBS, often with stuff quite some years out of "sync" in either direction (e.g. new libs on an old platform or old stuff on a new platform), and oftentimes helping others build stuff, e.g. mangling a KDE foo thingy's qmake build to actually build as wanted. What a tiring experience that was, wildly patching .pri, .pro, .whatnot ... It must've been quite an exceptional(ly ...) mind that came up with all that...
[0] as opposed to autotools. Those might still be considered crude and "bruteforce", generating those tons of shell-code, but they're very flexible, manageable, malleable, patchable, env-overridable ...