But that's kind of a backwards way of putting it, isn't it...? I mean, ideally a model would encompass every possible use case: input characteristics, output characteristics, even... time dependency of the power supplies?! But even if such models were created as a matter of course, they would be massive and run very slowly. There's just too much stuff to account for in real life that is difficult to implement in SPICE.
A too-simple model can only demonstrate what it was built to show; the textbook model of an op-amp as simply a block of very high gain (approaching infinity) won't even converge outside of DC in SPICE. A more accurate model would be an ideal integrator (which at least occasionally shows up in textbooks, once they start talking about bandwidth and filters), but that fails for both DC offset and output swing. Such an oversimplified model still fails for the example circuit in question, since the output voltage will just rise toward infinity over time. (Those three-terminal models are basically this, plus finite DC gain, and maybe a higher pole to represent phase margin near cutoff. They probably don't cover slew-rate limiting, and certainly don't cover output saturation (what voltage rails are they supposed to saturate to?) or supply current (no rails to draw current from!).)
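(For concreteness, a minimal sketch of that sort of model, with every value made up purely for illustration: roughly 100 dB of DC gain and a dominant pole giving ~10 MHz GBW, no supply pins, no saturation, no slew limit. Exact subcircuit syntax varies a little between SPICE flavors.)

.subckt OPAMP1P inp inn out
* differential input resistance
Rin inp inn 10Meg
* gm stage into the pole node: DC gain = gm*Rp = 1m * 100Meg = 100k (100 dB)
G1 0 p1 inp inn 1m
Rp p1 0 100Meg
* pole at 1/(2*pi*Rp*Cp) ~ 100 Hz, so GBW ~ 10 MHz
Cp p1 0 15.9p
* ideal unity-gain buffer to the output pin
E1 out 0 p1 0 1
.ends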
The crap of it is, either way you lose: an oversimplified model that doesn't cover your situation simply isn't going to work, while an overly complex model will run so slowly as to be useless, or have convergence issues that prevent you from using it in the first place! (Which requires one to go dabbling in those obscure SPICE simulator options, which... yep.)
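(The usual knobs, for the record; the values here are just illustrative "loosened" settings, and the defaults and exact behavior differ between SPICE flavors:)

* loosen tolerances and give the solver more iterations
.options reltol=3e-3 abstol=1e-9 vntol=1e-4
.options gmin=1e-10 itl1=500 itl4=100
* Gear integration is often more forgiving than the default trapezoidal
.options method=gear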
The best model is always one that's just detailed and accurate enough for your particular case.
So of course, in the end, the only generalization that really sticks is that the user must be wise enough to decide whether the models provided (up to and including the overall circuit) are lying or reasonable. No free lunch and all.
Tim