There is no one-size-fits-all. If you develop your own projects, chances are you pick one slightly overkill MCU and stick with it; that way you can reuse code. If you're even more restricted on time, get a platform that is supported by a community, like Arduino, Mbed, etc.
I've avoided 8-bit PICs for a long time. Their core is too old. They bolt on some funky peripherals these days, but meh. Their compiler does not support C99; end of story.
16-bit PICs are quite fun to use with PPS (Peripheral Pin Select), as it makes 2-layer board designs so much nicer to do. I haven't run into any silicon issues on them.
32-bit PICs feel a bit outdated compared to the ST or NXP ARM offerings. The ADC is only 10-bit, and there are no DACs.
Nevertheless, setting up a driver for the built-in Ethernet MAC was not a big deal; it only took a few hours to write from scratch.
Probably the most hilarious errata for me was that of the ENC424J600, a 100 Mbit SPI/parallel Ethernet chipset.
Its datasheet boasts, at the third bullet point: hardware security acceleration engines.
Errata: "5. Module: AES: At room temperature, the AES module may compute valid results. However, across voltage, temperature, and ordinary part-to-part device variation, the AES engine will compute incorrect results or fail to finish computations."
Solution: use software AES.
It always seems like Microchip really wants to push untested peripherals to market. The PIC32MZ is filled with bugs because so many peripherals were redesigned: the ADC, USB, SQI, SPI at 50 Mbit/s, I2C (unless you want to poll in software), etc.
The same thing happened with the SPI FIFOs on PIC24F parts, and now it's happening again.
Makes me wonder if it would really hurt to put a team of firmware engineers on a new part for two weeks, writing peripheral code to exercise the hardware functions before release.