If you're doing any kind of real-time systems work, I certainly hope you're not relying on compiler optimisations to make code predictable in its execution time. Making any meaningful statement about worst-case timing becomes hard, so better to avoid recursion at all costs.
And yes, you want the program to work without optimisations. Try debugging your code with -O1: half of the breakpoints and locals will be unresolvable.
And, of course, anywhere the run time is critical, learn your target machine's instruction set, and inspect the compiler's assembler output! This is also a useful pre-debugging step when the code the compiler is generating doesn't seem to make sense.
Stack usage is also a common reason to avoid recursion.
I think recursion can be useful in metaprogramming, which is more of a thing in C++ than in plain C. Especially if you use recursion in some kind of constants derivation, it can be OK if you can get the compiler to evaluate as much as possible at compile time (e.g. use constexpr effectively). The problem is that these tricks have their limits too, e.g. GCC's default maximum constexpr recursion depth is 512 (adjustable with -fconstexpr-depth).
The downside may be the generation of a crapload of executable code, much like the VHDL case with hardware. This can be helpful on machines with GB of RAM (cellphones maybe, PCs usually), but watch carefully for bloat on embedded systems.
An extreme example might be integrating a JIT compiler with your application, say to solve a particular instance of a combinatorially prohibitive execution tree. A real-world realization of this: Abrash's UT2k3 software render pipeline (search for his series of articles).
For 99.9999% of all programs, such extremes are neither necessary nor desirable (JS does this constantly, I guess, and look at how much overhead is spent on that!) -- again, it's just another tool to be aware of; another way of thinking about programming that can inspire you in simpler cases.
Tim