Opened 10 years ago

Closed 3 years ago

#95 closed enhancement (fixed)

upgrade to newer autotools version

Reported by: stefan Owned by: stefan
Priority: major Component: build system
Version: trunk Keywords:
Cc: kelman@…, mlubin


It would be nice to update to a more recent autotools version, which may also fix some open issues.

Lou's thoughts:

	I vote to upgrade. Painful but waaaay overdue. Exotic environments
aside, hanging on to these ancient autotools versions is an ongoing
annoyance on mainstream flavors of unix.

	Tactics and time are serious issues. This is not something to rush
into, and it will take a while to complete. If we really want to take
advantage of the progress in autotools, we need to strip all of our
tweaks (PROG_CXX, etc.) and start clean. Then we'll need to find people willing
to test on the various exotic platforms that are represented in coin.m4
to see which tweaks need to be reinstated.

	We should give serious thought to the set of platforms we're willing to
continue to support. Much as it pains me to say it, Solaris is one of
the ones that deserves that thought. It's clear that Oracle cares only
about the server market as a vehicle to sell oracle products. Solaris on
the desktop seems to be abandoned.

Attachments (2)

get_autotools.patch (2.4 KB) - added by chm.duquesne 9 years ago.
Script to build the needed autotools
coinutils_data_sample.patch (4.6 KB) - added by kelman 8 years ago.
patch to put Data/Sample back into autotools-update branch of CoinUtils


Change History (32)

Changed 9 years ago by chm.duquesne

Script to build the needed autotools

comment:1 Changed 9 years ago by chm.duquesne

Since I am strongly in favor of an update to a more recent version of the autotools, I needed a way to modify the build system.

The patch I just submitted is a helper for getting the right version of the autotools.

comment:3 Changed 8 years ago by kelman

I just had a discussion with a knowledgeable libtool contributor about an issue with Cygwin+MSVC and the absolute paths to .la files in _LIBS makefile variables. I was using Osi as an example, since it appears to be the smallest COIN-OR project that depends on anything other than BuildTools or ThirdParty; in this case, Osi depends on CoinUtils.

In recent automake, the compile wrapper script performs some extremely useful path conversions for use with MSVC cl. After updating autotools, this should make many of the custom macros in coin.m4 that deal with MSVC paths and flags redundant. Since compile is used as a wrapper, when --enable-msvc is specified to configure we should set CXX="/path/to/compile $CXX" when CXX is cl, and do the same for CC and F77, if the macros are removed from coin.m4 going forward.

I suggest we add the compile script from the desired version of automake (should it be updated to 1.14 now?) into the autotools-update branch, and modify the following line in run_autotools:

  • run_autotools

      echo "Running autotools in $dirs"
    - autotoolsFiles="config.guess config.sub depcomp install-sh missing"
    + autotoolsFiles="config.guess config.sub compile depcomp install-sh missing"

Down the line when the autotools update is complete, the compile script would be distributed in each project's base directory just like config.guess and config.sub. It has the same GPL exception as those files.
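A sketch of the configure-time hook being described above (the case patterns and the $am_aux_dir location are illustrative assumptions, not actual BuildTools code):

```shell
# Sketch: engage automake's "compile" wrapper when the compiler is MSVC cl,
# so Unix-style paths/flags get translated for cl. The values of CXX and
# am_aux_dir below are placeholders for illustration.
CXX="cl"                      # e.g. the user ran: CXX=cl ./configure --enable-msvc
am_aux_dir="/path/to/aux"     # hypothetical directory holding the compile script

case "$CXX" in
  cl | cl.exe )
    # prefix cl with the wrapper script
    CXX="$am_aux_dir/compile $CXX"
    ;;
esac

echo "$CXX"   # -> /path/to/aux/compile cl
```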

It's beside the point of this ticket, but for BuildTools/trunk, this issue can be resolved by slightly modifying a couple of existing macros in coin.m4. The macros that are there already replace absolute paths to any .lib files, when using MSVC cl, with a CYGPATH_W version of them, but with backslashes in the path replaced by forward slashes. The same replacement can be extended to .la files to make things work in Cygwin with the existing BuildTools. This could be made a little simpler by using cygpath -m to get forward slashes instead of cygpath -w and a 20-backslash sed, but that's just a readability concern.

  • coin.m4

        # for the LIBS, we replace everything of the form "/somepath/name.lib" by "`$(CYGPATH_W) /somepath/`name.lib | sed -e s|\|/|g" (where we have to use excessive many \ to get the \ into the command line for cl)
        if test x$coin_cxx_is_cl = xtrue || test x$coin_cc_is_cl = xtrue ;
        then
    -     m4_toupper($1_LIBS)=`echo " $m4_toupper($1_LIBS) " | [sed -e 's/ \(\/[^ ]*\/\)\([^ ]*\)\.lib / \`$(CYGPATH_W) \1 | sed -e "s|\\\\\\\\\\\\\\\\\\\\|\/|g"\`\2.lib /g']`
    +     m4_toupper($1_LIBS)=`echo " $m4_toupper($1_LIBS) " | [sed -e 's/ \(\/[^ ]*\/\)\([^ ]*\)\.\(lib\|la\) / \`$(CYGPATH_W) \1 | sed -e "s|\\\\\\\\\\\\\\\\\\\\|\/|g"\`\2.\3 /g']`
        fi
        m4_toupper($1_PCREQUIRES)="$2"

      #    everything of the form "-Lpath" by "-libpath:`$(CYGPATH_W) path`
      if test x$coin_cxx_is_cl = xtrue || test x$coin_cc_is_cl = xtrue ;
      then
    -   m4_toupper($1_LIBS)=`echo " $m4_toupper($1_LIBS) " | [sed -e 's/ \(\/[^ ]*\/\)\([^ ]*\)\.lib / \`$(CYGPATH_W) \1 | sed -e "s|\\\\\\\\\\\\\\\\\\\\|\/|g"\`\2.lib /g']`
    +   m4_toupper($1_LIBS)=`echo " $m4_toupper($1_LIBS) " | [sed -e 's/ \(\/[^ ]*\/\)\([^ ]*\)\.\(lib\|la\) / \`$(CYGPATH_W) \1 | sed -e "s|\\\\\\\\\\\\\\\\\\\\|\/|g"\`\2.\3 /g']`
        m4_toupper($1_LIBS_INSTALLED)=`echo " $m4_toupper($1_LIBS_INSTALLED)" | [sed -e 's/ \(\/[^ ]*\/\)/ \`$(CYGPATH_W) \1\`/g' -e 's/ -l\([^ ]*\)/ lib\1.lib/g' -e 's/ -L\([^ ]*\)/ -libpath:\`$(CYGPATH_W) \1\`/g']`
      fi

comment:4 Changed 8 years ago by stefan

Sorry for not getting back to this earlier. But I think that Lou's initial suggestion ("strip all of our tweaks and start clean") is good.

So I merged r2552 and r2554 (the install_autotools script and run_autotools updates) from the autotools-update branch and removed almost all macros from coin.m4.

I've also updated the install_autotools/run_autotools scripts to use the most recent autotools versions and added the compile script that you mentioned.

The autotools-update branch of CoinUtils works with this BuildTools/trunk. By working I mean I can compile CoinUtils on my Linux box, but without checking any dependencies and so on.

My suggestion would be to select a small set of projects (e.g., CoinUtils, Osi, Clp, and the required ThirdParty and Data projects) and to update them to work with the current BuildTools/trunk, first on Linux (gcc), then Mac OS X (clang), then cygwin/mingw on Windows, and cl/icl last (preferably by using some wrapper script, if necessary, thus getting rid of CYGPATH_W & co.).

I don't say that everything that was in coin.m4 has to be reinvented, but before copying something back we should reevaluate what we really need and how far we can stick with the autotools defaults. This is a good point to break with our previous conventions (e.g., was pkg-config indeed a good idea?).

comment:5 Changed 8 years ago by kelman

Your progression plan sounds like the most feasible approach, I'll help where I can.

Doing this in BuildTools/trunk is a bit of a leap though, as some of the projects' trunks are currently using BuildTools/trunk as an svn:external. Either that should be fixed by changing those to stable/0.7, or we should try to do the experimentation within the autotools-update branch.

On that last note, I think pkg-config does help on balance. It makes the projects more independently manageable, allowing package managers in Debian, Fedora, Homebrew, etc. to handle them more sanely. The only downsides of pkg-config that I can see are its unavailability on MSYS (though there are potential drop-in replacements that don't have the circular dependency on glib), and I'm guessing it might not play well with the Microsoft/Intel compilers.

Last edited 8 years ago by kelman

comment:6 Changed 8 years ago by kelman

Here's a temporary "works-for-now" patch to put Data/Sample back into the autotools-update branch of CoinUtils. It would either require making a new branch of Data/Sample, or semi-breaking its trunk. I needed to put back the COIN_EXAMPLE_FILES macro into coin.m4, modified to just use AC_CONFIG_LINKS instead of depending on CHECK_VPATH, ENABLE_MSVC, etc. AC_CONFIG_LINKS has some limitations though, see the comments.

This allows the unit tests to run, and they even pass on my Red Hat machine.

Last edited 8 years ago by kelman

Changed 8 years ago by kelman

patch to put Data/Sample back into autotools-update branch of CoinUtils

comment:7 Changed 8 years ago by stefan

No project should use BuildTools/trunk in its external, unless they know what they are doing. I merged into trunk to increase the pressure a bit :-).

I'm ok with keeping pkg-config, I just wanted to give it another thought.

For Data/Sample, I created an autotools-update branch and put your patch in there. I'll do the same with Osi, Clp, and some of ThirdParty.

comment:8 Changed 8 years ago by kelman

Ted might not be happy, but anyway. Thanks for the autotools-update branch in Data/Sample.

Initial tests of CoinUtils on various platforms are mostly good. Linux, Cygwin native, MSYS-MinGW32, and MSYS-MinGW64 all pass the unit tests. The Cygwin-to-MinGW cross compile is having trouble with library paths in the wrapper executables; I might send a message to the libtool mailing list asking about that. Mac OS X 10.9 is having trouble finding isnan.

Can you add me to the cc of this ticket? It doesn't look like I have the Trac permissions to do that.

Last edited 8 years ago by kelman

comment:9 Changed 8 years ago by stefan

  • Cc kelman mlubin added

There are now autotools-update branches in place for CoinUtils, Osi, Clp, Data/Sample, and ThirdParty/Blas, Lapack, and Glpk. That should be enough to get started. I gave Tony and Miles write access to all of BuildTools and to the autotools-update branches of the mentioned projects.

Tony, your enthusiasm and contributions are very welcome, but please give your Ph.D. priority over this one. :-)

comment:10 Changed 8 years ago by kelman

Point taken, Stefan. I tracked down and fixed the Mac isnan problem. The Cygwin-to-MinGW cross-compile paths issue doesn't have anything to do with autotools; even a Cygwin-to-MinGW cross-compiled C++ hello world needs a modified PATH to find the MinGW versions of the runtime libraries.

In order to do Osi or anything more complicated, this is the place to talk about AC_COIN_CHECK_PACKAGE (and _FALLBACK). Do we like the current solution enough to keep it mostly in its current state going forward, or do we want to think about possibly cleaner solutions?

comment:11 Changed 8 years ago by stefan

Good question. What I like least about the current approach is that it blows up the compiler flags if one checks for one project after another, e.g., because they are optional. Couenne is an example where the compiler flags fill a whole page of my screen.

That is, if I do (simplifying the macro calls here) something like

    AC_COIN_CHECK_PACKAGE(A, ...)
    AC_COIN_CHECK_PACKAGE(B, ...)
    AC_COIN_CHECK_PACKAGE(C, ...)

and all of them are present, my CXXFLAGS have the compiler flags for finding the headers of A, B, and C, because our macro just concatenates what pkg-config returns for A, B, and C. If C actually depends on A and B, then the flags for A and B are repeated.

My initial suggestion to improve on this is to only record which of A, B, and C have been found, and to figure out cflags & co. in a single pkg-config call afterwards, maybe during AC_COIN_FINALIZE (which we should reintroduce anyway).

The more natural approach would be to figure out cflags & libs during the build by calling pkg-config from the Makefile, but that fails if there is no pkg-config and we want to use our fallback. An alternative might be to have our own pkg-config fallback script that implements the required logic of pkg-config and a similar behavior. That could be called during build time then.
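To make the fallback idea concrete, here is a toy sketch of such a helper (hypothetical code; real pkg-config additionally expands ${variable} definitions and resolves Requires: entries recursively, which this sketch omits):

```shell
# pc_query: print the Cflags: or Libs: field of a .pc file.
# A toy stand-in for "pkg-config --cflags" / "--libs"; no variable
# expansion and no recursive Requires: handling.
pc_query() {
  case "$1" in
    --cflags) sed -n 's/^Cflags: *//p' "$2" ;;
    --libs)   sed -n 's/^Libs: *//p'   "$2" ;;
  esac
}

# demo against a tiny hand-written .pc file
printf 'Name: demo\nCflags: -I/tmp/include\nLibs: -L/tmp/lib -ldemo\n' > /tmp/demo.pc
pc_query --cflags /tmp/demo.pc   # -> -I/tmp/include
```

Such a script could ship with BuildTools and be called at build time when no system pkg-config is found.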

I'm open to other suggestions.

comment:12 Changed 8 years ago by kelman

Hm, the Cc appears not to have worked; I didn't get an email. Maybe one of the mailing lists, BuildTools or coin-discuss, would be higher visibility and get input from a few more people.

The flags do get extremely long for big projects with lots of dependencies, like the MINLP solvers and OS. Ted was even running into a problem with libtool on MinGW when building OS in a long folder name: the link command line gets too long, and libtool tries to break it up but something breaks in the process. And the long set of flags contributes quite a bit to build time, especially on Cygwin and MSYS, where libtool is very slow to process all of the arguments.

It feels a little wrong to me for the larger projects to refer to all their dependencies' headers at locations scattered around the dependencies' source trees. If the dependencies had to be make installed first, all the headers would be put in a single place (or at most one place per dependency) for the bigger projects to include them from, right?

Ted has thrown around some ideas of getting rid of the top-level folders and configures and svn:externals as we know them, and replacing with a smarter (but hopefully lightweight) package-management style approach. Homebrew makes this easy on Mac, and there are forks of it that may work similarly for Linux or Cygwin/MinGW. Homebrew requires Ruby and Git, but maybe we could implement something that would work similarly while staying in Bash and autotools.

If a hard dependency on pkg-config saves enough trouble to be worth going for, we shouldn't have to re-implement it ourselves. There are drop-in alternatives we could choose to depend on for users on platforms where pkg-config isn't available, if that is the direction we want to go.

Last edited 8 years ago by kelman

comment:13 Changed 8 years ago by mlubin

Referring to dependencies' internal header files is also a source of bugs. I'm also in favor of getting rid of top-level folders, at least as the primary development structure. For distributing easy-to-compile source packages for users, it wouldn't be too hard to make a single Makefile that runs ./configure; make install in the subdirectories and takes care of making sure all of the projects find each other.

comment:14 Changed 8 years ago by mlubin

A related option is to keep top-level directories and svn:externals as they are but ditch the complicated recursive autotools structure and simply use a Makefile to handle the top-level logic.

Last edited 8 years ago by mlubin

comment:15 Changed 8 years ago by kelman

Good idea, I forgot about the option of expressing logic and dependencies in makefiles. One little issue there is configure arguments; my configure calls usually have a bunch of extra options tacked on to them. How about having a small top-level configure where all it does is populate the arguments it was called with into a generated Makefile that runs the subfolder configures with the same set of arguments, and their makes with all the project dependencies expressed in the makefile rules?

comment:16 Changed 8 years ago by mlubin

That would work, but we could also just have a file Make.user where you set variables, including configure arguments.

comment:17 Changed 8 years ago by kelman

I guess it depends on how much parsing of the variables/configure arguments from that file you need to do at the top level. It could be a little messy, in just a basic Makefile, to say skip configure and make for ThirdParty/Lapack if the user provided their own --with-lapack=... somewhere in Make.user. Certainly doable, but probably simpler with the help of configure and auto-generated Makefile conditionals. Or that logic could be moved from the top level into ThirdParty/Lapack's own configure, so it would know to shut itself off when --with-lapack=... is given.

Stefan, Ted, anyone (Bueller?), opinions one way or the other?

comment:18 Changed 8 years ago by mlubin

From examples I've seen in other projects, you can have overridable variables like LAPACK_LDFLAGS="-L/path/to/lapack -llapack -lblas", etc., and simple conditionals like

   CONFIGURE_FLAGS += --with-lapack="$(LAPACK_LDFLAGS)"

I'm not an expert on this, but I do know that it's feasible. Some might consider it a step backwards, though. It is also not great to provide two different interfaces for compilation options: one via these Makefile variables and the other via the configure arguments in the low-level project directories.
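A sketch of what such a Make.user-driven setup could look like (Make.user, LAPACK_LDFLAGS, CONFIGURE_FLAGS, and SKIP_SUBDIRS are illustrative names, not an existing convention):

```make
# Pull in user overrides if the file exists; otherwise silently continue.
-include Make.user

ifdef LAPACK_LDFLAGS
  # User supplied their own LAPACK: pass it down and skip ThirdParty/Lapack.
  CONFIGURE_FLAGS += --with-lapack="$(LAPACK_LDFLAGS)"
  SKIP_SUBDIRS    += ThirdParty/Lapack
endif
```

Note that -include and ifdef are GNU make features, which ties into the portability question about non-GNU make.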

comment:19 Changed 8 years ago by kelman

Remember that ifeq only exists in GNU make, which may or may not be an issue. This relates to Lou's platform question of whether we try to support anything less common than Linux, OS X, or Windows. Your Debian packages should help find any architecture issues (post-release, once the deb packages are updated) thanks to buildd, but are there users on any of the BSDs or commercial *nixes?

comment:20 Changed 8 years ago by stefan

  • Cc kelman@… added; kelman removed
  • Status changed from new to assigned

It was always very comfortable to me that I could check out a project including all its dependencies and build it with one configure && make, without having to install anything first. This is especially useful when developing tightly intertwined projects (e.g., Clp & Cbc).

However, I agree that this "feature" comes with a lot of issues for the build system. With BuildTools 0.6 (or 0.7) we introduced the possibility of building projects against installed versions of their dependencies. So I guess the next step could be to force this behavior.

What I, and I believe also some users, would like to preserve is the possibility to check out a project together with all its COIN-OR based dependencies and build & install it in one rush. That this will change to configure/make/make install for each dependency one after another, instead of configuring each project, then building each one, and finally installing each one, should be ok.

I am not convinced yet that a Makefile in the top-level directory will be sufficient to do all the checks on whether projects should be built or not. There might be more questions that need to be answered first. Is a project still going to be built & installed if there is already an installation present on the system? So far we allowed users to pass in compile/link flags via --with-prjct-... options or to set a --without-prjct option; these disabled a project's build. Can I still build in a directory other than the project source if I only have a makefile? Further, I really like the scripts.

I would very much like to require only bash and C compilers on the user's side, i.e., no dependency on package management systems. This was also the main reason for not switching to cmake or similar. Regarding pkg-config, maybe we could add one of these pkg-config alternatives to our system and build it in case no pkg-config is installed on the system? I would have preferred something that does not need to be built, but I still prefer easy building over having to rewrite it ourselves.

However, if we build only against installed projects, should it be easy again to not require pkg-config at all? A nice advantage of the current pkg-config setup is that a project can communicate all its compile and link flags up to projects that want to use it. However, installing self-contained libraries (LIBADD) and putting headers into a single directory would make it very easy again to check for the presence of a project without pkg-config. We would lose the check for version numbers (e.g., requiring coinutils >= 2.8 or so), but we have rarely used this so far anyway.

Finally, regarding the requirement of GNU make: Andreas was so far the only one who requested to be able to use standard make, but that was when he was still at IBM and his code still had to build on AIX. ;-) As long as we stay on the side of configure and automake, it shouldn't be difficult to keep compatibility with standard make. I think it is a good idea if at least the normal build works on any system. For the handwritten Makefiles of the examples (other than for Ipopt), we require GNU make now, and so far no one has complained to me about it. If the cleanup of the build system makes it easy to restore compatibility with standard make, then it might be nice to have.

comment:21 Changed 8 years ago by kelman

I agree with you that for Cbc, Bonmin, OS and the like just doing ../configure --whatever && make test && make install a single time at the top level is most convenient for everyone. Sounds like we should aim to keep that as the top-level interface if possible, but change how the dependencies are handled inside top-level configure in a way that users don't have to notice the change. I think we can improve the internal organization yet keep the overall behavior automated and about as convenient for users as it is now.

BuildTools 0.7 way:

  • Top-level configure checks for arguments, recurses into dependency configures based on user choices for enabling/disabling/using-already-installed-versions-of different parts.
  • Top-level make recurses into dependency makes.

Proposal idea A:

  • Top-level configure does similar checks for arguments as the BuildTools 0.7 top-level configure does, but handles CFLAGS and LIBS in some better way (or not at all?), and doesn't recurse into the dependency configures. Instead it generates a top-level Makefile (from a, which comes from a, so the Makefile will get put in the build directory like normal for VPATH builds).
  • The generated top-level Makefile contains the dependency logic, with certain projects disabled/skipped/using-already-installed-versions if the user asked for that. The Makefile rule for each dependency project is to cd into its project sub-build-folder, run its specific configure with arguments copied over from whatever the user provided to the top-level configure (plus the same --prefix as the top-level build), then run its make (hopefully doing parallel make in this step if the user asked for it at the top level), then its make install.

I think I've seen projects that do this, where make for the top-level project calls configure then make for some of the subprojects instead of recursively configuring from the start. I think it's the way GCC builds MPFR and the like if you put their source in the same tree?

Maybe we could create a ThirdParty folder for one of the pkg-config alternatives? Or as a subfolder of BuildTools since everything already has that in its externals? I'll try to play with the two I linked above and see which one is simpler to build and/or works better in MSYS, which should be the only place it's needed?

Or maybe you're right that if we force make install of dependencies before building larger projects (in an automated way, at least in the above proposal), then the no-pkg-config fallback is much simpler than in BuildTools 0.7.

As far as I know GNU make should be easily available on anything these days, it was just something to make a note of if we're considering writing Makefiles that are supposed to do important (and reasonably complicated) tasks. For anything generated by autotools it's already taken care of.

The Cc works now, thanks.

Edit 1: I can think of a potential downside to the above proposal. Say you have a large project, and something flags an error during the large project's own subfolder configure. You wouldn't want to have to wait for every single dependency to compile first before you realized you had to add a different flag to configure to fix some error. So maybe we have a set of PROJECT-configure makefile rules that go first, then a set of PROJECT-make rules. So for say the Clp top-level makefile, it looks like:

all: Clp-configure Clp-make

Clp-configure: Osi-configure CoinUtils-configure
    cd Clp && $(top_srcdir)/Clp/configure $(CONFIG_ARGS) --prefix=$(prefix)

Osi-configure: CoinUtils-configure
    cd Osi && $(top_srcdir)/Osi/configure $(CONFIG_ARGS) --prefix=$(prefix)

CoinUtils-configure:
    cd CoinUtils && $(top_srcdir)/CoinUtils/configure $(CONFIG_ARGS) --prefix=$(prefix)

Clp-make: Clp-configure Osi-make CoinUtils-make
    cd Clp && $(MAKE)

Osi-make: Osi-configure CoinUtils-make
    cd Osi && $(MAKE) && $(MAKE) install

CoinUtils-make: CoinUtils-configure
    cd CoinUtils && $(MAKE) && $(MAKE) install

Wrapped in automake conditionals to skip subprojects where appropriate (switching to a trivial rule that echoes why the project is being skipped).

Edit 2: Thinking out loud here, I realize this is the same as doing recursive configure.

Proposal idea B:

  • Do configure recursively, just change how we handle CFLAGS and LIBS.
  • Generated top-level Makefile works like Proposal idea A, but none of the Makefile rules call configure. Makefile rules for dependency projects do make install before larger project can start compiling.
Last edited 8 years ago by kelman

comment:22 Changed 8 years ago by kelman

Some more thoughts on this:

Using recursive make at the top level, as in BuildTools/0.7, has some advantages that would be extra work to recreate by hand (maybe not that much work, though), like the generated recursive make clean and make distclean rules, which are useful to have at the top level.

Some users want to install things into a shared system prefix, so they typically do make all && sudo make install.

Requiring dependency projects to be installed before we compile bigger projects has advantages in making CFLAGS and LIBS much simpler, but could disturb the system-prefix sudo make install workflow. Requiring sudo for make all in this case doesn't feel like the right option.

We could perhaps fix support for, and make use of, DESTDIR for a temporary staged install of the dependencies in a location we know we always have write access to, in case $prefix is not writable during a normal make.
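A sketch of that staged-install idea (the prefix and staging paths are examples, not a proposed convention):

```shell
# Staged install sketch: dependencies get installed under a writable DESTDIR
# during "make all", so no elevated permissions are needed until the final
# top-level install.
prefix=/usr/local
stage=/tmp/coin-stage

# Inside each dependency's build dir, the generated rules would run:
#   make all && make install DESTDIR="$stage"
# make install then writes into $stage$prefix/... instead of $prefix/...

# The final (possibly sudo) top-level "make install" only copies the tree:
#   cp -a "$stage$prefix/." "$prefix/"

echo "$stage$prefix"   # -> /tmp/coin-stage/usr/local
```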

Or keep brainstorming for more ideas.

comment:23 Changed 8 years ago by mlubin

Requiring only bash and a C compiler is definitely a good goal. I've built Cbc on a Blue Gene/P (cross compiled, even) without major issues, and we should keep this possible. The number of users not on Windows/Linux/OS X is definitely a small minority, though. It wouldn't be terrible to force them to install via the autotools package subdirectories if we choose to take a less platform-agnostic approach (GNU make/python scripts/etc.) for the top-level directories. At the end of the day it could still be the right choice to use autotools at the top level, but I just wanted to point this out.

comment:24 Changed 8 years ago by kelman

I'm finding some interesting things by experimenting with pkg-config in BuildTools 0.7.

First, there are more missing headers that aren't getting installed but should be: in Bonmin/src/Interfaces/, need to add BonExitCodes.hpp to includecoin_HEADERS, and similarly in Bonmin/src/Algorithms/QuadCuts/, need to add BonLinearCutsGenerator.hpp to includecoin_HEADERS (Couenne needs these to build against an installed Bonmin).

Second, if you set the environment variable PKG_CONFIG_DISABLE_UNINSTALLED=yes in configure, then pkg-config won't use the *-uninstalled.pc files; it will use the installed locations for everything. make all doesn't work in this case, but if you do make install immediately, it does. This doesn't solve the repetition issue, but it does make CFLAGS and LIBS shorter by referring to installed locations instead of build-tree locations for headers and libraries. Note that this breaks the _DEPENDENCIES mechanism described here. I believe (please correct me if I'm wrong) this mechanism is only important for executables linked with static libraries, to force a rebuild of the executable if one of its dependency libraries changes. In this situation I don't think it would be necessary to rebuild library files (either static or shared, assuming all the COIN libraries are static or all of them are shared), or executables that are linked against shared libraries.
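That experiment can be sketched as a shell session (against a generic COIN-OR checkout; the prefix is an example):

```shell
# Ask pkg-config to ignore any *-uninstalled.pc files, so all reported
# flags refer to installed locations. PKG_CONFIG_DISABLE_UNINSTALLED is an
# environment variable honored by pkg-config itself.
export PKG_CONFIG_DISABLE_UNINSTALLED=yes

./configure --prefix=/opt/coin
# "make all" alone fails here (the installed headers/libs don't exist yet),
# but an immediate "make install" works, as described above:
make install
```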

Third, moving the pkg-config call to build time in the Makefile, as Stefan mentioned in comment:11, could be a good option. One downside right now is that COIN_PKG_CONFIG_PATH_UNINSTALLED gets very long, so there's not as much net reduction in command-line length as there could be. If we decide to keep using the *-uninstalled.pc files, to avoid the sudo problem or because of the _DEPENDENCIES mechanism, would it make sense to symlink them into the top-level build folder so they're all in the same place and COIN_PKG_CONFIG_PATH_UNINSTALLED can be much shorter?

Fourth, I have a works-in-testing tweak that fixes the repetition issue. This is in BuildTools 0.7's AC_COIN_CHECK_PACKAGE. Instead of concatenating CFLAGS and LIBS for the libraries given as the third argument, I replace them with a new pkg-config call using the accumulated PCREQUIRES. It would save a little time in configure to only do these once during FINALIZE or something, as opposed to every time the same library is used as the third argument to AC_COIN_CHECK_PACKAGE.

  • coin.m4

        # augment X_PCREQUIRES, X_CFLAGS, and X_LIBS for each build target X in $3
        coin_foreach_w([myvar], [$3], [
          m4_toupper(myvar)_PCREQUIRES="$2 $m4_toupper(myvar)_PCREQUIRES"
    -     m4_toupper(myvar)_CFLAGS="$m4_toupper($1)_CFLAGS $m4_toupper(myvar)_CFLAGS"
    -     m4_toupper(myvar)_LIBS="$m4_toupper($1)_LIBS $m4_toupper(myvar)_LIBS"
    +     m4_toupper(myvar)_CFLAGS=`$PKG_CONFIG --cflags "$m4_toupper(myvar)_PCREQUIRES" 2>/dev/null`
    +     m4_toupper(myvar)_LIBS="$m4_toupper(myvar)_PCLIBS `$PKG_CONFIG --libs "$m4_toupper(myvar)_PCREQUIRES" 2>/dev/null`"
        ])
      ],
      [ m4_tolower(coin_has_$1)=notGiven

This gets rid of the duplicates in CFLAGS and LIBS (thanks to pkg-config), shortening OSLIB_CFLAGS and OSLIB_LIBS by almost a factor of 4 even with everything still referring to the build tree. Combined with PKG_CONFIG_DISABLE_UNINSTALLED=yes they're even shorter.

Oh, and there's still all the CYGPATH_W mess going on; hopefully the compile wrapper script will let us get rid of that.

Last edited 8 years ago by kelman

comment:25 Changed 8 years ago by stefan

I like the mechanism of being able to configure all in one rush (the argument that you don't want configure to fail for the last project after having built all-but-the-last one convinced me). Still, building one project against installed versions of its dependencies sounds cleaner to me than what is happening now (we could also get rid of the -D<PRJCT>_BUILD compiler flag again). However, requiring root permissions for a make (all) isn't that nice, indeed; I hadn't thought about this before. But DESTDIR sounds like a viable alternative.

So do I understand correctly that proposal idea B is to configure all projects using the usual configure-style recursion, then do make and make install in each project? If $prefix is not writable, then we do make install into a writable DESTDIR. If we used DESTDIR, then the make install at the top level copies DESTDIR into $prefix (assuming the user used sudo now); otherwise nothing needs to happen there.

Or should we use DESTDIR in any case, so that things are put into $prefix only if the user said make install? Maybe he actually does not want to install, and he wouldn't think of having to call make uninstall if he did not run make install explicitly.

I haven't thought much about make test, but as far as I see it wouldn't change much.

If we build against installed projects (either in $DESTDIR or $prefix), there isn't much work left for pkg-config to do. The main task is to find out which projects are actually present at configure time. Since dependencies have not been built & installed at that point, looking only for a library or header file will not be sufficient. The configure of a project like Osi will have to look for both CoinUtilsConfig.h and ../CoinUtils/ to see whether an installed or a to-be-built-and-installed version of CoinUtils is present. pkg-config would still simplify this.

If we first build & install dependencies (into DESTDIR), then I guess the *-uninstalled.pc file will have to point to DESTDIR/$prefix, while the other *.pc files have to point to $prefix only? Or would there be just one .pc file, with the DESTDIR passed in via --define-variable=DESTDIR=$DESTDIR when appropriate? (But how would we know when that is appropriate? Maybe not a good idea.)
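The single-.pc-file variant with --define-variable could look roughly like this. The file layout, variable name (destdir), and package name are hypothetical, not what BuildTools actually generates; the point is only that a .pc variable defaulting to empty lets the same file describe both the staged and the final location.

```shell
#!/bin/sh
# Hypothetical .pc file using an overridable destdir variable.
set -e
dir=$(mktemp -d)
cat > "$dir/coinutils.pc" <<'EOF'
destdir=
prefix=/usr/local
libdir=${destdir}${prefix}/lib
includedir=${destdir}${prefix}/include

Name: CoinUtils
Description: COIN-OR utilities (sketch)
Version: 0.0
Libs: -L${libdir} -lCoinUtils
Cflags: -I${includedir}
EOF

# With pkg-config available, pointing a consumer at the staged tree
# would then be a matter of:
#   PKG_CONFIG_PATH=$dir pkg-config \
#       --define-variable=destdir=/tmp/stage --libs coinutils
cat "$dir/coinutils.pc"
```

Without the --define-variable override, destdir expands to nothing and the file describes the plain $prefix install; that is exactly the "how to know when to pass it" problem raised above.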

Symlinking the *-uninstalled.pc files in the toplevel build dir sounds like a viable option. Maybe every project can put a symlink for its own prjct-uninstalled.pc there when finishing configure?
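The symlink idea sketched above would amount to something like the following (directory layout and file names are illustrative): each subproject drops a link to its own -uninstalled.pc into the toplevel build dir as the last step of its configure, so dependents only ever need that one directory on PKG_CONFIG_PATH.

```shell
#!/bin/sh
# Sketch: a subproject publishes its -uninstalled.pc file by symlink.
set -e
top=$(mktemp -d)           # stands in for the toplevel build dir
mkdir -p "$top/CoinUtils"
touch "$top/CoinUtils/coinutils-uninstalled.pc"

# What CoinUtils' configure would do as its final step:
ln -s "$top/CoinUtils/coinutils-uninstalled.pc" \
      "$top/coinutils-uninstalled.pc"

# What a dependent project's configure would then run:
#   PKG_CONFIG_PATH=$top pkg-config --exists coinutils-uninstalled
ls "$top"/*-uninstalled.pc
```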

Oh, and the _DEPENDENCIES should indeed only matter if one builds static libs. I wouldn't worry too much about it for now.

comment:26 Changed 8 years ago by kelman

I'm not positive exactly how -D<PRJCT>_BUILD works right now, don't know about that.

If we put back AC_COIN_MAIN_PACKAGEDIR (it's pretty harmless, standalone, and does what we want regarding --with-<prjct>, COIN_SKIP_PROJECTS, etc.) then normal autoconf recursive configure will populate $subdirs (lower case) with the appropriate subprojects at the top level.

The question is whether or not to then, at the top level, set SUBDIRS = $(subdirs). Setting upper-case SUBDIRS for automake gives us the present recursive make behavior, typically a recursive make all of everything first, then a recursive make install of everything. We could continue doing this but add extra make rules to enforce a make install of dependencies into DESTDIR before the make all of larger projects (I think I like the uniformity of always doing the same thing, and since most of make install is just copying files, I don't think we would need to rewrite it to manually copy from DESTDIR to $prefix). Or we could leave upper-case SUBDIRS unset, so automake doesn't create any recursive make targets, and put them together ourselves instead.
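The "install dependencies before building dependents" ordering boils down to a loop in dependency order. The sketch below is illustrative only (project list, staging path); in reality these would be generated make targets, not a script.

```shell
#!/bin/sh
# Illustrative toplevel driver: stage-install each subproject before
# building the next, so dependents always compile against installed
# headers and libraries.
set -e
stage=$(mktemp -d)
for prj in CoinUtils Osi; do   # dependency order: CoinUtils first
  echo "build $prj, then stage it: make -C $prj install DESTDIR=$stage"
  # The real commands would be:
  #   make -C "$prj" all && make -C "$prj" install DESTDIR="$stage"
done
```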

make test is already manually written; I doubt it would need to change at all. The difference is whether we'd have to manually write the recursive all, install, clean, distclean, uninstall, etc. rules ourselves or let automake create them.

I don't see Osi ever checking for CoinUtils headers, I only see that happening for ThirdParty projects where it isn't known whether the user actually has the source. For the internal COIN projects, I see either pkg-config or AC_COIN_CHECK_PACKAGE_FALLBACK looking for the .pc files to determine existence of the dependency projects.

There's a bit of difference here between the CoinUtils subfolder of CoinUtils, and the CoinUtils subfolder of bigger projects that depend on it. The former doesn't need to worry about this installation/DESTDIR stuff. But we'd like to come up with a mechanism that would work for both (maybe that's just doing it redundantly on the small projects even when it's not necessary). I think we'll need to look at other projects to figure out how they do it - when you install into DESTDIR, do the .pc files get modified to take that into account?

Edit 1: Or maybe we just say that for this purpose we will always use the top build dir for DESTDIR, and we modify the -uninstalled.pc files to point to things there, possibly symlink the -uninstalled.pc files into that folder too. Referring to absolute paths to the .la files on the Libs: line might even preserve the _DEPENDENCIES mechanism.

Last edited 8 years ago by kelman (previous) (diff)

comment:27 Changed 8 years ago by stefan

If it is easy to modify the behavior of make all in the top level to do a make install (with DESTDIR) in all subprojects, then keeping SUBDIRS = $(subdirs) sounds preferable to me, i.e., better to modify standard behavior in one place than to reimplement everything.

Do I understand your comment about not having to copy files from DESTDIR to $prefix correctly, in the sense that you imagine make install at the top level to simply call make install in each project without DESTDIR set? Sounds good to me.

I think what you propose in Edit 1 is similar to what I meant in comment:25 (before the Or it would be just...). I would prefer the DESTDIR to be a subdirectory of the toplevel build directory, not to be that one itself, just to keep things better separated.

comment:28 Changed 7 years ago by stefan

Short status update: CoinUtils with ThirdParty/Glpk and Data/Sample is now complete in a first version, that is, it implements the mechanism from comment:25 and onwards, and in a very default setup (Linux, shared libs) it should do what was possible with these projects before.

I doubt that static linking of executables would work, as we currently do not pass on dependencies like -ldl or -lgmp, e.g., from Glpk to CoinUtils. For shared libs, these dependencies are recorded in the library's .la file now, so there is no need to pass them on.
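The "dependencies travel with the library" point refers to libtool recording link-time dependencies in the dependency_libs line of its .la files. The .la file below is fabricated for the example (a real one is generated by libtool at link time), but the field inspected is the actual libtool one:

```shell
#!/bin/sh
# Where libtool records inter-library dependencies: dependency_libs.
set -e
la=$(mktemp)
cat > "$la" <<'EOF'
# libCoinUtils.la - a libtool library file (fabricated sample)
dlname='libCoinUtils.so.0'
library_names='libCoinUtils.so.0 libCoinUtils.so'
dependency_libs=' -lglpk -lgmp -ldl'
EOF

# For shared builds these flags ride along automatically; for static
# linking a consumer would have to extract and repeat them itself:
sed -n 's/^dependency_libs=//p' "$la"
```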

Also, all Windows workarounds have been removed, so that is unlikely to work either.

Some features from BuildTools 0.7 will be gone, e.g., --enable-debug and co, as we do not modify compiler flags anymore. However, -g -O2 seems to be the default anyway, so builds are both optimized and carry debug info. libtool is not reused from the base directory anymore. Also, tracking dependencies between projects is not possible anymore, i.e., remembering to rebuild the clp binary when the coinutils lib changed; but these only mattered for static builds anyway. And probably a number of other little things.

Next steps will be to look into the other ThirdParty projects (that should be straightforward) and to investigate static linking.

comment:29 Changed 4 years ago by stefan

We decided a while ago to go a different way to upgrade the autotools and started implementing this in BuildTools/trunk and various autotools-upgrade branches. I don't recall the details of what we're doing anymore, and we're probably not going to post further updates here; look at the commit messages for that :-).

comment:30 Changed 3 years ago by stefan

  • Resolution set to fixed
  • Status changed from assigned to closed

Development is still going on, but it is no longer documented in this ticket. It also seems that the issue is too big to cover with one ticket, so I am closing it. There is no good resolution available in Trac for this, so I'm being bold and using "fixed" :).

Note: See TracTickets for help on using tickets.