I think this highlights a point we agree on. It's the bloody users. If you dare to make an ASCII protocol, you absolutely must be prepared to handle ALL the human cases that the users will throw at it.
Of course, and you should.
Because literally the only sensible argument for ASCII is human interoperability. But in the RealWorld^tm, you mostly come across UART parsers that fail exactly at that. Some fail more (to the point of being a colossal pain), some fail less, but almost all fail.
Typical experience with an ASCII-based embedded device (some examples that come to mind: a super expensive IMU I interfaced with a year ago; a hard disk debug interface; an HP programmable power supply interface; a CNC glue dispenser) goes like this:
* You enter commands and nothing happens
* Or maybe after you have entered two commands (wondering why the first did not work), the first of the two finally gets processed
* You Google for hours which terminal emulator programs to download and how to configure them to use one of the at least 4^2=16 line-ending combinations (CR, LF, CRLF, or LFCR, and yes, they can differ depending on direction)
* When you finally get the thing to accept a simple command at all, you start actually using it
* Because it's a terminal, meant to be accessed by a human, you type, use backspace, arrow keys, DEL - and none of that works at all, or worse, characters in the wrong places get deleted, or most commonly, what you see on the screen does not match what is in the device's internal buffer
* Then you again Google for hours which terminal emulator programs to use and how to configure them to send the correct control codes or escape sequences. The number of combinations explodes into the millions.
* With the IMU mentioned above, I had to find a specific version of a specific terminal program and compile it from source to get the desired conversions - just to configure that IMU to enable its BINARY MODE, after which it is usable.
* You think you have finished typing a command and press Enter, and the thing does something completely different or crashes, because the internal buffer did not match your screen: your backspace presses were processed differently at the two ends.
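Getting the device side right is not even hard - tolerating all four line endings and handling backspace/DEL is a tiny state machine. A minimal sketch in Python (the function and its behavior are my own illustration, not any particular device's firmware; real firmware would do the same in C):

```python
def make_line_reader():
    """Accumulate received bytes into lines, tolerating CR, LF, CRLF and
    LFCR terminators, and applying backspace/DEL edits to the buffer."""
    buf = bytearray()
    last_eol = [0]  # last end-of-line byte seen, to swallow the second
                    # half of a CRLF or LFCR pair

    def feed(byte):
        """Feed one received byte; return a completed line (str) or None."""
        if byte in (0x0D, 0x0A):              # CR or LF
            if last_eol[0] and last_eol[0] != byte:
                last_eol[0] = 0               # CRLF/LFCR second half: ignore
                return None
            last_eol[0] = byte
            line = buf.decode("ascii", errors="replace")
            buf.clear()
            return line
        last_eol[0] = 0
        if byte in (0x08, 0x7F):              # backspace or DEL
            if buf:
                buf.pop()                     # edit the buffer, not the echo
            return None
        buf.append(byte)
        return None

    return feed
```

With this, `AB<backspace>C` followed by any of the four terminators yields the line `"AC"`, and the buffer the device parses always matches what the user edited.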
You can of course decide that the UART interface is not to be used by humans. But there goes the #1 argument for the thing. If you have to generate commands and parse responses with a Python script anyway (which is a great idea, to be honest), then you could just as well have used a binary protocol and had the Python script make it human readable!
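That last point is a few lines of code. A toy sketch, assuming an entirely made-up binary frame format (sync byte, command id, payload length, payload, XOR checksum - the frame layout and command names here are invented for illustration, not any real device's protocol):

```python
# Hypothetical frame: 0xAA sync | cmd id | payload length | payload | XOR checksum
CMD_NAMES = {0x01: "GET_TEMP", 0x02: "SET_RATE"}

def decode_frame(frame: bytes) -> str:
    """Turn one binary frame into a human-readable line."""
    sync, cmd, length = frame[0], frame[1], frame[2]
    if sync != 0xAA:
        return "bad sync byte 0x%02X" % sync
    payload = frame[3:3 + length]
    checksum = frame[3 + length]
    calc = 0
    for b in frame[:3 + length]:              # checksum covers header + payload
        calc ^= b
    if calc != checksum:
        return "checksum mismatch"
    name = CMD_NAMES.get(cmd, "CMD_0x%02X" % cmd)
    return "%s payload=%s" % (name, payload.hex() or "(none)")
```

The device side stays a trivial fixed-format parser with no line endings, echo, or editing to get wrong, and the human readability lives entirely in the script, where it is cheap to fix.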
The ASCII-controlled RS-232 devices of the 1980s have all but disappeared. What remains are simple, application-specific UART links, often binary because it makes more efficient use of tight resources and is easier to deal with.