I think I know the answer to this, but am looking for additional opinions. I'm designing an SPI slave device whose shift register length is 11 bits. I'm concerned that some I/O chips and their drivers won't properly handle a register length that isn't a multiple of 8 bits. In the spirit of engineering for the lowest common denominator, am I better off making it 16 bits total, with 5 dead bits?
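For what it's worth, padding to 16 bits costs almost nothing on the host side if you shift MSb-first: the 5 dead bits go in first and simply fall off the far end of an 11-bit shift register before chip select deasserts. A minimal sketch in C -- the function names and the bit placement are mine, not from any particular driver:

```c
#include <stdint.h>

/* Pack an 11-bit payload into a 16-bit SPI frame. Shifted out MSb-first,
 * the 5 zero pad bits lead; by the time /CS deasserts they have been
 * pushed out the far end of the slave's 11-bit shift register. */
static uint16_t pack_frame(uint16_t payload11)
{
    return payload11 & 0x07FFu; /* top 5 bits are dead/zero */
}

/* Recover the 11 payload bits from a received 16-bit frame. */
static uint16_t unpack_frame(uint16_t frame16)
{
    return frame16 & 0x07FFu;
}
```

The same trick works in the other direction if the slave pads its output, so a byte-oriented SPI peripheral on the host never has to know the real register is 11 bits wide.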
For shits and grins, look at the Linear Tech LTC1598 octal ADC. It's advertised as "I/O Compatible with QSPI, SPI, MICROWIRE™, etc.," but it's batshit daft. The channel select is shifted in with the chip select
high, and that select is the last four bits clocked in before chip select goes low. Sampling starts after the falling edge of chip select; two clocks later the part goes into hold mode, at which point you start clocking out the conversion result MSb first. Since it's a 12-bit converter, it takes 14 clocks to get the result, and after bit 0 shifts out, if you keep clocking (with chip select still asserted low), it shifts the result back out in LSb-first order! The data sheet "helpfully" includes MC68HC05 assembly code to read the converter, which is amusing -- the result has to be shifted right by a bit to realign it.
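That realignment is what you get stuck with on any byte-oriented host: grab 16 clocks' worth of data after the falling edge of chip select, and the 12-bit result lands at a non-byte-aligned offset that has to be shifted back down. A hedged sketch in C -- `extract12` and the `lead = 2` alignment are my guesses from the description above, not from the datasheet, so check the actual timing diagram before trusting the shift count:

```c
#include <stdint.h>

/* Given a 16-clock capture (MSb of the first clock in bit 15) and the
 * number of dead clocks before the result's MSb, pull out the 12-bit
 * conversion result. lead = 2 is an assumption from the sample/hold
 * description above; the HC05 code in the datasheet ends up doing an
 * equivalent realignment. */
static unsigned extract12(uint16_t raw, unsigned lead)
{
    return (raw >> (16u - lead - 12u)) & 0x0FFFu;
}
```

The point being: once the result isn't byte-aligned, every host pays the tax somewhere, whether as an assembly shift or a masked C expression.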
I read this thing from an FPGA, so the weird shifting means I can design the "SPI" interface to do what I need. Shit like this is why I don't use an "SPI core," and instead just design the interface to meet the needs of the particular slave device. After all, SPI is just a couple of shift registers ...
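"A couple of shift registers" really is the whole protocol. A toy model in C, assuming MSb-first shifting and ignoring clock phase/polarity details:

```c
#include <stdint.h>

/* Full-duplex SPI in miniature: each clock, master and slave each shift
 * out their MSb and shift in the other's. After `clocks` edges they have
 * exchanged that many bits; for clocks == width, the registers have
 * swapped contents entirely. */
static void spi_exchange(uint16_t *master, uint16_t *slave,
                         unsigned width, unsigned clocks)
{
    for (unsigned i = 0; i < clocks; i++) {
        unsigned mosi = (*master >> (width - 1)) & 1u; /* master's MSb */
        unsigned miso = (*slave  >> (width - 1)) & 1u; /* slave's MSb  */
        *master = (uint16_t)(((*master << 1) | miso) & ((1u << width) - 1u));
        *slave  = (uint16_t)(((*slave  << 1) | mosi) & ((1u << width) - 1u));
    }
}
```

Set `width` to 11 and `clocks` to whatever the slave actually needs, and the "non-multiple-of-8" problem disappears -- which is exactly why rolling the interface in the FPGA beats fighting a canned core.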