I've run into annoying trouble with python-serial (pyserial) and similar so many times that I nowadays recommend using non-blocking I/O (O_NONBLOCK when opening the device, preferably with O_CLOEXEC so as not to leak the descriptor to child processes), and using termios to set the character device to pure 8-bit raw mode.
Python 3 does have termios support built in (see pydoc3 termios), and you'll need the built-in fcntl and select modules too (see pydoc3 fcntl and pydoc3 select) to implement the non-blocking I/O, so it's not impossible; but I do prefer doing it in C anyway.
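In Python, the setup I mean looks roughly like this minimal sketch. The raw-mode flag clearing follows the usual cfmakeraw(3) recipe; the demo uses a pseudo-terminal so it runs without hardware, but with a real adapter you'd pass its device path (e.g. "/dev/ttyUSB0") and also set the baud rate:

```python
import os
import pty
import select
import termios

def open_raw(path: str) -> int:
    """Open a serial character device non-blocking, in 8-bit raw mode."""
    # O_NOCTTY: don't become the controlling terminal; O_CLOEXEC: don't
    # leak the descriptor to child processes.
    fd = os.open(path, os.O_RDWR | os.O_NOCTTY | os.O_NONBLOCK | os.O_CLOEXEC)
    iflag, oflag, cflag, lflag, ispeed, ospeed, cc = termios.tcgetattr(fd)
    # Disable all input translation and software flow control.
    iflag &= ~(termios.IGNBRK | termios.BRKINT | termios.PARMRK |
               termios.ISTRIP | termios.INLCR | termios.IGNCR |
               termios.ICRNL | termios.IXON)
    oflag &= ~termios.OPOST                    # no output post-processing
    lflag &= ~(termios.ECHO | termios.ECHONL | termios.ICANON |
               termios.ISIG | termios.IEXTEN)  # no echo, signals, line editing
    cflag &= ~(termios.CSIZE | termios.PARENB)
    cflag |= termios.CS8                       # 8 data bits, no parity
    # For a real device you would also set the baud rate (ispeed/ospeed) here.
    termios.tcsetattr(fd, termios.TCSANOW,
                      [iflag, oflag, cflag, lflag, ispeed, ospeed, cc])
    return fd

# Demonstrate on a pseudo-terminal so this runs without hardware.
master, slave = pty.openpty()
fd = open_raw(os.ttyname(slave))
os.write(master, b"hello\r")                   # raw mode: no CR/NL mangling
readable, _, _ = select.select([fd], [], [], 1.0)
data = os.read(fd, 4096) if readable else b""
print(data)                                    # b'hello\r' (CR untranslated)
```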
For USB Serial in a Python application, it can make sense to use two Python threads instead, one for writing and another for reading, both blocking (not non-blocking), optionally with a third thread for coordination between the two; and to communicate with the UI using Queues (one per direction). This is thread-safe, and since the threads are almost always inside a blocking I/O call, the fact that the current Python interpreter only executes one thread of Python code at a time is perfectly acceptable.
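The two-worker model might be sketched like this; LoopbackPort is a hypothetical in-memory stand-in for the real serial port object, just so the example runs without a device:

```python
import queue
import threading

def reader(port, rx_q: queue.Queue):
    """Blocking read loop: push incoming chunks toward the UI via a queue."""
    while True:
        data = port.read(4096)        # blocks until data arrives
        if not data:
            break                     # EOF: the device went away
        rx_q.put(data)

def writer(port, tx_q: queue.Queue):
    """Blocking write loop: pop outgoing messages queued by the UI."""
    while True:
        msg = tx_q.get()              # blocks until the UI queues a message
        if msg is None:
            break                     # sentinel: shut down cleanly
        port.write(msg)

class LoopbackPort:
    """Hypothetical stand-in for a serial port; echoes writes back as reads."""
    def __init__(self):
        self._buf = queue.Queue()
    def write(self, data):
        self._buf.put(data)
    def read(self, n):
        return self._buf.get()        # blocks, like a real blocking read

port = LoopbackPort()
rx_q, tx_q = queue.Queue(), queue.Queue()
threading.Thread(target=reader, args=(port, rx_q), daemon=True).start()
threading.Thread(target=writer, args=(port, tx_q), daemon=True).start()

tx_q.put(b"ping")                     # the UI side sends...
echoed = rx_q.get(timeout=1)          # ...and receives the loopback
print(echoed)                         # b'ping'
```

Since the UI only ever touches the two Queues, it never blocks on the device itself, which is the whole point of the arrangement.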
For simple graphical user interfaces controlling a USB Serial microcontroller widget with a native USB interface developed in the Arduino environment, like a Teensy, this works perfectly. The serial interface can be implemented separately for each OS (in Python), and for Linux and POSIX-type OSes, the basic threaded two/three-worker model works very well and allows a completely asynchronous interface to the device, while being so simple that even a completely new programmer can get it right.
(Again, note that in Linux, this involves only packages available in package management, so when distributed as a standard package (with sensible library version requirements!), there are no dependency issues; it Just Works.)
The completely asynchronous interface means that instead of the traditional query-response ping-pong approach, the host and the device have essentially two separate data streams that do not need to interact at all: one from the host to the device, and the other from the device to the host. Given a native USB interface, the MCU can almost always support multiple endpoints, so that you can have, for example, a dedicated data stream and a dedicated control stream. (I'd like to switch to bulk USB transfers, but thus far haven't found a way for really new programmers to grasp all of that as easily as USB Serial.) A common combination is a gamepad or joystick, a keyboard, and USB Serial.
Because Human Interface Devices (HID) do not require OS drivers or privileges, HID is a superior choice for low-bandwidth applications (much less than 64000 bytes per second, with 1 ms latency per message), as it completely avoids the OS driver support mess and Just Works.
Even the communications protocol between the application and the microcontroller then uses the event-based approach. If you have queries and responses, you associate each with an identifier, so that more than one thing can be in flight at the same time; there is no specific order to things, as they are matched by their identifiers and can occur in whatever order. This can help a lot with latencies and delays, and can boost the achievable throughput (say, for a data acquisition device where lots of data flows in to the host, and only occasionally some control commands go out to the device), avoiding the annoying "glitches" (drops in data) during command parsing. You know, useful stuff, and not at all difficult to grasp when you have an open mind and are willing to learn new stuff.
Whether the interface to such a device is implemented in Python or in C, even when the user interface is in Python, is a bigger-picture question. If the device is not tightly coupled to the software, and the vendor doesn't mind clients developing new applications for the device, then a pure Python implementation makes sense.
If the device and the software are tightly coupled together and both the device and the processing of the data to/from the device involve Top Sekrit Sauce, then C or C++ makes a lot more sense.
When interfacing with Python, I do prefer to use C and not C++, because the runtime dependencies are so much simpler for C. In Linux, for example, basically all dynamically linked executables, no matter what programming language they use, have a runtime dependency on the standard C library. So, if one limits oneself to C only, the dependencies are minimized. Others prefer the power of expression and approaches in C++, and consider the added dependencies worth the packaging effort.
As this shows once again, I don't personally use Python for its language features or its libraries, but because it has a gentle learning curve for new programmers and allows them to do modifications without a development environment, and because it is so easy to interface to native libraries, having only to write Python code for the interface; and because its nature as a scripting language provides a perfect licensing separation for more complex licensing schemes. AIUI, all other widely used scripting languages like Ruby and Lua do require additional native code for efficient bindings.