Well, let's see. Standard mitigations apply:
- Fuse (including PTC types, polymer or ceramic) and TVS (esp. SIDAC types)
- Series resistor and clamp (as in the schematic above, but mind the voltage range better, as noted at the end)
- Current limiter, especially back-to-back depletion-mode MOSFETs, or current sources (CCSs) plus a diode bridge for high bandwidth (ref: old Tektronix sampler input stages)
- AC/DC (MOS) type SSR + window comparator: turn it off when the input goes out of range (still needs a TVS or similar to handle the overvoltage/current during the ~ms before the switch opens)
- Custom interface: receiver could be constructed from resistors and a comparator; transmitter could use discrete transistors to achieve much higher power dissipation and voltage range while meeting specs.
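For the series-resistor-and-clamp option, the sizing arithmetic is worth doing explicitly. A rough sketch (all component and fault values here are illustrative assumptions, not recommendations):

```python
# Rough sizing for a series resistor feeding a clamp on an RS-232 input.
# All values below are illustrative assumptions.

V_FAULT = 60.0    # worst-case fault voltage on the pin (V), assumed
V_CLAMP = 15.0    # clamp voltage, just above the +/-15 V RS-232 swing (V)
R_SERIES = 10e3   # series resistor (ohms)

# Fault current through the resistor into the clamp:
i_fault = (V_FAULT - V_CLAMP) / R_SERIES      # amps
# Power the resistor must survive continuously:
p_resistor = i_fault**2 * R_SERIES            # watts
# Power dumped into the clamp:
p_clamp = i_fault * V_CLAMP                   # watts

print(f"I_fault    = {i_fault * 1e3:.2f} mA")
print(f"P_resistor = {p_resistor * 1e3:.1f} mW")
print(f"P_clamp    = {p_clamp * 1e3:.1f} mW")
```

Note the series resistor also forms a divider with the receiver's 3-7 kohm input impedance, so it can't be made arbitrarily large without eating into signal margin.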
Mind, there is of course no such thing as a transceiver where you just add external transistors; to pull off that last one, you build the whole thing discrete. Even given the analog experience I have, I'd consider that more of a last resort. Mainly because, rolling your own, the output slew rate and feedthrough of internal switching noise are not so well controlled, making it an EMI wildcard. It also needs to be proven out over temperature, and it still takes more parts and more idle power consumption (and you need a +/-15V converter, not just a charge pump -- well, maybe, but either way it's not going to be integrated into a handy MAX232, eh?), etc. Which is all testing I know how to do, and can do right here for that matter, and, YMMV of course, but needless to say it's going to be way more work than just... like... at that point, not even fuckin' using RS-232 anymore, right?
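To put a number on the idle-power point: RS-232 receivers present a 3-7 kohm load, so a discrete driver holding the line at a rail burns static power continuously. A back-of-envelope sketch (rail voltage, mid-spec load, and converter efficiency are all assumed values):

```python
# Static dissipation of a hypothetical discrete RS-232 driver output stage.
# Rail, load, and efficiency figures are assumptions for illustration.

V_RAIL = 12.0   # magnitude of each supply rail (V), assumed +/-12 V converter
R_LOAD = 5e3    # receiver input resistance (ohms), mid-spec of 3-7 kohm

i_line = V_RAIL / R_LOAD        # line current with the output at a rail
p_load = i_line**2 * R_LOAD     # power delivered into the receiver load

# The driver transistor drops little when saturated at the rail, but the
# +/-12 V converter has to supply all of it at some finite efficiency:
eta = 0.8                       # assumed converter efficiency
p_input = p_load / eta          # power drawn from the main supply

print(f"i_line  = {i_line * 1e3:.2f} mA")
print(f"p_load  = {p_load * 1e3:.1f} mW")
print(f"p_input = {p_input * 1e3:.1f} mW")
```

Per driven line, and before counting bias networks and the converter's own quiescent draw -- small, but it adds up against an integrated transceiver that's already characterized.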

Which is of course one among many non-answer answers. Others include: change the physical interface (don't use DE-9 or whatever!); stop using RS-232 (who even uses that anymore, right? Right?...); convince the customer that their interface(s) are dumb and need to change instead; etc. (Obviously, contingent on how feasible any of these actually are.)
Tim