You see this deficiency a lot in software: someone has written a function or group of functions that interact with shared state variables at various times under varying circumstances, and the result is software that usually kinda works, but sometimes does X or Y one call before or after it should, because the programmer never just drew a state diagram and coded to it, but instead poked at the problem until it mostly worked.
I enjoy watching speedruns of games, where this principle applies in game development. A huge fraction of tricks/exploits/glitches consist of entering some state, then doing a frame-perfect state change that clearly was not expected by the developers.
I feel that a lot of those, back in the day, were due to time crunch and hardware complexity: systems like the N64 were pushed to market with as much dev support as the manufacturers could put together in that time. Which, compared to today's biggest, most popular frameworks, was probably just embarrassing. (Or maybe it wasn't; it's easy to overstate the appeal of a framework I've never personally tried. I should really assume it's just as hacked together as every other framework in existence!) The standard graphics backend for the N64, for example, was at least pretty solid (basically a drop-in of the SGI tech they licensed), but tended to be slow. If you wanted to stray from the standard path, say to squeeze out as much geometry and framerate as possible, you had a tremendous amount of work ahead of you. Most famously, Rare's games: compare the gameplay and graphics of DK64 to Perfect Dark!
Mistakes continue to be made today, of course. I would argue it's more a matter of scale now: projects are so massive that there's no way anyone can design, structure, or test every combination of every component.
Well, that needs a lot of qualifying, too. Tools like Unity are available to literally everyone, from the single-dev indie to massive hundreds-of-devs "AAA" studios. The smaller studios still face the same early problems: learning the tools, programming challenges in general, and just managing to ship a complete product at all. It's the larger studios I'm referring to, of course.
Just in the last two months, Nintendo's Breath of the Wild has been broken open with wall clipping and superspeed techniques, some of which are fairly easy, and some of which have small frame and position windows to execute.
Bethesda's games are notorious for glitches. At this point, I've got to think it's at least somewhat intentional, as an endearing feature of their games/engines. Like how the physics engine in later GTAs improved considerably, but left in the glitchy playground equipment that launches actors/vehicles.
Meh. I could go on, and on and on. Suffice it to say, if there's a reason something can fail, it will. Structural, design, management, technical, "blame the tools", you name it, some combination will be relevant.
Tim