Two common causes (at least in my experience at a giant software company).
1. Deliberately making things complex to get promoted. It's not always done intentionally, but it's often obvious from the outside. This also covers cases where something existing *should* be reused but isn't, usually with little to no justification.
I've seen something like this, though it's arguable that it was the brainchild of one ornery old (past retirement, still working) firmware engineer, and therefore fits better into the one-off, personal-project sort of category.
(To add insult to injury, that project was -- oh gosh, in the tens of kLOC, I'm not sure exactly how much, maybe 20-50? -- of C, running on a PIC24E, and what a mess of an MCU that is!)
To some extent, some people just like making things hard for themselves. Masochists, perhaps? Consider the history of the $0.03 micro among Western practitioners.
Business pressures should mostly rule out such gimmickry, but it always surprises me how much leeway there actually is in a lot of industries. Some markets and businesses have amazingly elastic profit/cost margins. A lot of them have low competition, or captive markets, which probably hints at some deeper point to be made.
2. Many incremental changes, each made to be as small as possible. Laudable on their own, but after a while they build up into an utter mess that can take longer to untangle than it took to get into that state.
Taking the Windows example from another reply, this is quite justifiable -- all things considered, they've done an astonishing job of maintaining compatibility, only breaking things after multiple generations. Win16 lasted from, what, the late 80s, through NT (via NTVDM!), until Vista/7 finally dropped it! Various compatibility settings (minor breaks motivated by everything from improved security to, I'm sure, plain old bit-rot of modules) have been added to improve compatibility with specific versions of the API.
So it's absolutely no accident that their system is so much sausage. And don't fool yourself, Linux has the same issues -- support for too many versions, on too many platforms, over too much history, with development by too many people. No large project can possibly be coherent and, in the sense of parts versus the whole, optimized (I'm sure there's a lot of redundant code in there, sometimes for good reason, sometimes not). That's simply the way it must be -- at least until we get compilers that are much smarter at much higher levels, able to flag or fix these sorts of things. But even that is risky, because who can possibly debug such a smart system? Compiler bugs are among the most insidious.
Tim