When the build process fails on a large project with a page full of dependencies spanning a wide range of supported versions, you're up shit creek regardless.
These large projects need to optimize for the common case, where you can automate and hide the complexity of configuring the build; they need something like autoconf. If that makes life harder when the build does fail, well, sucks to be you.
If you are using regular makefiles, a build failure is a piece of cake. You get a bunch of messages from the compiler telling you what is wrong: can't find an include file, can't find a function, etc. All of those are easily rectified in a few minutes using find(1), fgrep(1), and nm(1). It's the perfect example of why Ken Thompson remarked, "When in doubt, use brute force."
For example, suppose the compiler reports undefined variables. To find which header declares them:
find /usr/include -name '*.h' -exec fgrep -f missing_vars /dev/null {} \;
(The extra /dev/null makes fgrep always see two files, so it prints the name of each matching header.)
I'd have an awk script to construct missing_vars from the compiler messages.
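A minimal sketch of such a script, assuming GCC-style diagnostics (the message format and quoting vary by compiler and locale, and the script name is mine; adjust the pattern to taste):

# extract_missing.awk -- pull the quoted identifier out of lines like
#   foo.c:12:5: error: 'FOOBAR' undeclared (first use in this function)
# assumes plain ASCII single quotes (C locale); one identifier per line
/undeclared/ {
    if (match($0, /'[A-Za-z_][A-Za-z0-9_]*'/))
        print substr($0, RSTART + 1, RLENGTH - 2)
}

Run it over the compiler output and de-duplicate:

cc -c foo.c 2>&1 | awk -f extract_missing.awk | sort -u > missing_vars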
Undefined functions are handled much the same way:
find /lib /usr/lib -name 'lib*' -exec nm -Ago {} \; | fgrep -f missing_funcs
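To see only the libraries that actually define a symbol, rather than merely reference it, you can filter for defined text symbols. A sketch, assuming BSD-style nm output where a defined text symbol is flagged T, and discarding nm's complaints about files that aren't objects:

find /lib /usr/lib -name 'lib*' -exec nm -Ago {} \; 2>/dev/null | fgrep -f missing_funcs | grep ' T '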
I took on at least three 500,000+ line codebases that would not compile as I received them. The longest it took to get the mess building was about 6 weeks, and that was only because the two guys who wrote the programs hadn't checked all the code into RCS, and it usually took them a few days to find a missing file and check it in. Once I had one building, I went on to the next. I was also building the make system as I went.
The code was so buggy that I set things up so that if a user typed "foo", it looked in a table of user ids and program versions and, if there was a version for that user, ran it instead of the regular version. I had to do that because I was often fixing different bugs in the same program during the course of a single day. Once everything was tested, it became the standard version and the user-specific versions went away.
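A minimal sketch of that dispatch trick in plain sh (the table path, its format, and the fallback location are all illustrative assumptions, not the original setup):

#!/bin/sh
# foo -- per-user version dispatch (a sketch)
# /usr/local/lib/foo/versions.tab maps a login name to a binary:
#     alice  /usr/local/lib/foo/foo.alice
ver=`awk -v u="$LOGNAME" '$1 == u { print $2 }' /usr/local/lib/foo/versions.tab`
exec "${ver:-/usr/local/lib/foo/foo.stable}" "$@"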
You do need to understand the linker and library behavior of the system. On *really* old systems, before ranlib(1), when you had to use lorder(1) and tsort(1) to order the members of a library, you could run into pathological cases (circular dependencies between members, which tsort cannot order for a single-pass linker) that could only be fixed by putting two copies of the function in the library.
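For anyone who never had to do it, the classic incantation from the old manuals looked like this; lorder emits the member dependencies and tsort flattens them into an order a single-pass linker can live with:

ar cr libfoo.a `lorder *.o | tsort`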
With GNU make, all you need is
ARCH := $(shell uname -a | awk '{.....}')
include ./config/$(ARCH)
and then write the makefile to call system executables *only* through the variable names defined in the ARCH file. As you can do arbitrary string processing easily, you can patch over pretty much any system variation.
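To make that concrete, here is a sketch of what one such per-system file might look like (the file name, variable names, and paths are illustrative, not from the original setup):

# config/Linux -- everything the makefiles invoke, defined in one place
CC      = gcc
AR      = ar
RANLIB  = ranlib
AWK     = awk
INSTALL = /usr/bin/install

The makefile then says $(CC), $(AWK), and so on, never a bare command name; porting to a new system means writing one more file under ./config.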
That has a single layer of indirection. The *only* thing you need to do to configure for a new system is write the ARCH file. The problem with imake, autoconf, and the like is that there are so many transient files in play that an error in any one of them makes the whole mess fall over like the little man in the sysadmin interface of AIX circa 1994.
I started the thread as I did because it was really intended for old folks. But I've been very impressed by a lot of the comments. I just wish it were practical to read all of it. It's huge fun, but it grew too fast to keep up with.
Edit: I *think* I've managed to read all the salient comments. Naturally some things were said by several people. The greatest pleasure is that it's an intelligent discussion of the problems posed by complexity and fragmentation, which was my intent. I'm sure we have not saved the world, but it's truly wonderful to be able to converse with such a smart and diverse group of people.