I am not saying to put everything in one file, just all the functions that belong to a certain cluster or component. Otherwise you get other problems, like someone forgetting to include all the files in the next project.
So for instance your clock example should be a clock.c file with SetClockTime(..), GetClockTime(..), and perhaps SetClockTimeZone(..), GetClockTimeZone(..), etc.
Those functions should all go in the same component, i.e. the same .c file. You are NOT going to make four or more separate .c files from those functions; that would be ludicrous, and in the six companies I have worked for over the past 20+ years I have luckily never seen it. A C programmer who did that would have been fired the same week.
But cramming unrelated functionality into one bloated file that consumes masses of RAM, like the catch-all utils file in your example, is not how it should be done either.
Sidenote 1: modern compilers and linkers can optimize unused functions out; always check that this actually happens, especially in embedded projects or in your GUI bloat example.
Sidenote 2: modifying Unix/Linux libraries yourself can be a PITA, since you have to redo the work for every new release, or skip updates entirely, which might be worse.
Instead you should raise a ticket with the maintainers and ask them to separate the functionality into more than one library due to excessive memory usage.
I think we're talking about software on very different scales. For small programs it doesn't matter. And if you're using globals with file scope they have to be in the same file.
Generally I put everything in one file for small programs. At a certain size, I split things up into related pieces. If I want to use something in another program I put it in a library so there is only a single version to maintain.
Having had to maintain a 500,000-line code base which had been written by copy-and-modify, with the same bugs repeated in a dozen functions all with different names, I shudder at the idea that everything should be in one file because someone might forget to include a function. They'll find out it's missing as soon as they try to link. And if it's used in more than one place, it belongs in a library.
Big packages like a 500,000-line seismic processing system with a dozen libraries are a whole other kettle of fish. For things of that size you need to be very aware of how the linker operates and how the libraries are structured internally.
I think there's been considerable confusion in this discussion between the compilation and linking phases. I'm an old VMS and Unix programmer. I don't use Windows for programming work and I don't use IDEs for anything. I strictly use an editor, make and the command line.
If you're doing mixed-language (C, C++ and FORTRAN) programming, you have to understand exactly how the linker works, what options are required, etc. You also need to understand the operation of the compilers at a similar level of detail. During the workstation wars of the '80s and '90s, that meant I had to know how these worked on six different systems for a job porting a single package from VMS to Unix.
Seismic processing is both compute and I/O intensive. A common operation done on all data requires, in the simplest case, summing at least several hundred thousand values into each output value for many millions of output samples. I read compiler optimization books, not because I'm going to write a compiler, but because I need to know in great detail the hardware-level implications of the various optimizations. I pay attention to cache associativity, as a careless choice of memory access pattern can flush the cache line on every access. So in the case of a 16-byte cache line and a 4-byte variable, the data transferred between main memory and the cache goes up by a factor of four over what it should be. And the longer the cache line, the worse it gets.
I learned all this in the process of fixing problems in several million lines of other people's code, most of it written by scientists with few if any comments. In one case I had to find and read a journal article just to find out whether the function was operating in the time or frequency domain. Once I found out what it did, I rewrote it. It was probably 30-40 lines.
I learned about the linker behavior from a set of libraries which had every function in the library in a single source file.