I wish some MCU vendor would put some real intelligence into the I2C peripheral to genuinely offload the CPU. I like the way CAN peripherals are usually done: no need to poll status bits all the time.
http://www.st.com/content/ccc/resource/training/technical/product_training/2d/5a/12/09/9e/7e/42/b3/STM32L4_Peripheral_I2C.pdf/files/STM32L4_Peripheral_I2C.pdf/jcr:content/translations/en.STM32L4_Peripheral_I2C.pdf
You mean more than setting up a DMA transfer, going to sleep, and waking up on an interrupt when it finishes? You can't really brew coffee with I2C.
Yes, you really need to do a lot more, unfortunately. Try it and you'll see.
Yes, you can't brew coffee with I2C, and that's the ridiculous part: it should be trivial. It should work exactly as you describe. The I2C peripheral should be there to help.
DMA doesn't help much, since you have to babysit (practically bit-bang) every step: "send a start bit now", "is the start bit ready?", "can I send an address now?", "now write the address", "did I get an ACK?" - and only then can you finally DMA a few bytes of payload, whoop-de-doo.
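To make that concrete, here's roughly what the babysitting looks like before DMA can even take over, shown in polled form for brevity and assuming an STM32F1/F4-style I2C block with CMSIS register names; a real driver would do the same dance from the event interrupt and add timeouts and error handling on top:

    /* Flag-babysitting needed before DMA can move any payload bytes.
       Sketch only: assumes an STM32F1/F4-style I2C peripheral, no timeouts,
       no NACK/bus-error handling. */
    #include "stm32f4xx.h"   /* or the device header for your part */

    static void i2c_start_write(I2C_TypeDef *i2c, uint8_t addr7, uint8_t reg)
    {
        i2c->CR1 |= I2C_CR1_START;              /* "send a start bit now"      */
        while (!(i2c->SR1 & I2C_SR1_SB)) { }    /* "is the start bit ready?"   */

        i2c->DR = (uint32_t)(addr7 << 1);       /* "now write the address"     */
        while (!(i2c->SR1 & I2C_SR1_ADDR)) { }  /* "did I get an ACK?"         */
        (void)i2c->SR2;                         /* reading SR2 clears ADDR     */

        i2c->DR = reg;                          /* device register address     */
        while (!(i2c->SR1 & I2C_SR1_TXE)) { }

        i2c->CR2 |= I2C_CR2_DMAEN;              /* only now can DMA take over  */
        /* ... then configure and enable the DMA stream for the payload ... */
    }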
A proper minimum implementation of what could be called "CPU-offloading hardware I2C" would let you configure one register with the I2C address and point a DMA channel at the data, TX or RX; it would then generate the START, the address, and the data bytes automatically. A better implementation would also let you configure a device register address beforehand, so it would run the full I2C-address, write-device-register-address, read/write-payload cycle off a single DMA setup (roughly like the hypothetical sketch below). Unfortunately the STM32 doesn't manage even the minimum version, and having briefly looked at some others, it seems to be equally bad everywhere.
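For illustration only, a purely hypothetical register layout for such a peripheral could look like this; none of these names exist in any real vendor's part, they're made up to show the idea:

    /* Hypothetical register map for the "CPU-offloading" I2C described above.
       Every name here (TARGET, REGADDR, CTRL, ...) is invented for illustration. */
    #include <stdint.h>

    typedef struct {
        volatile uint32_t TARGET;   /* 7-bit slave address                       */
        volatile uint32_t REGADDR;  /* optional device register address to send  */
        volatile uint32_t COUNT;    /* payload length, bytes moved by DMA        */
        volatile uint32_t CTRL;     /* direction, "prepend REGADDR" flag, GO bit */
        volatile uint32_t STATUS;   /* done / NACK / arbitration lost            */
    } OFFLOAD_I2C_TypeDef;

    /* Intended use: set TARGET and REGADDR, point a DMA channel at the payload,
       set GO in CTRL, go to sleep, and take exactly one interrupt when STATUS
       reports completion or an error. */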
The STM32 implementation also has extra quirks and poorly documented features/bugs, which is why it's widely regarded as unreliable. It took several days of full-time work to get it running smoothly and reliably, using an interrupt-driven management state machine (7 interrupts are generated and have to be handled correctly just to read one byte of actual data from a device!). Bit-banging the whole I2C bus in a timer interrupt would have been both easier and possibly more efficient, unless high bit rates are needed.
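For anyone curious what that state machine ends up looking like, here is a stripped-down sketch of a single-byte register read, assuming an STM32F1/F4-style peripheral with the event interrupts (ITEVTEN/ITBUFEN) enabled and CMSIS names; the error interrupt (AF/BERR/ARLO) still needs its own handler on top of this, and the exact number of interrupts you see depends on configuration:

    /* Sketch of an interrupt-driven I2C read state machine, one byte from one
       device register. Error handling, timeouts and bus recovery omitted. */
    #include "stm32f4xx.h"

    enum rd_state { IDLE, SENT_START, SENT_WADDR, SENT_REG,
                    SENT_RESTART, SENT_RADDR, DONE };

    static volatile enum rd_state state;
    static volatile uint8_t dev7, reg, result;

    void I2C1_EV_IRQHandler(void)
    {
        uint16_t sr1 = I2C1->SR1;

        switch (state) {
        case SENT_START:                         /* SB: start condition done    */
            if (sr1 & I2C_SR1_SB) {
                I2C1->DR = dev7 << 1;            /* address + write             */
                state = SENT_WADDR;
            }
            break;
        case SENT_WADDR:                         /* ADDR: address ACKed         */
            if (sr1 & I2C_SR1_ADDR) {
                (void)I2C1->SR2;                 /* SR1+SR2 read clears ADDR    */
                I2C1->DR = reg;                  /* device register address     */
                state = SENT_REG;
            }
            break;
        case SENT_REG:                           /* TXE: register address out   */
            if (sr1 & I2C_SR1_TXE) {
                I2C1->CR2 &= ~I2C_CR2_ITBUFEN;   /* avoid a TXE interrupt storm */
                I2C1->CR1 |= I2C_CR1_START;      /* repeated start              */
                state = SENT_RESTART;
            }
            break;
        case SENT_RESTART:                       /* SB again                    */
            if (sr1 & I2C_SR1_SB) {
                I2C1->CR1 &= ~I2C_CR1_ACK;       /* NACK the single byte        */
                I2C1->DR = (dev7 << 1) | 1u;     /* address + read              */
                state = SENT_RADDR;
            }
            break;
        case SENT_RADDR:                         /* ADDR, then RXNE             */
            if (sr1 & I2C_SR1_ADDR) {
                (void)I2C1->SR2;
                I2C1->CR1 |= I2C_CR1_STOP;       /* stop after the one byte     */
                I2C1->CR2 |= I2C_CR2_ITBUFEN;    /* re-arm the RXNE interrupt   */
            } else if (sr1 & I2C_SR1_RXNE) {
                result = (uint8_t)I2C1->DR;
                state = DONE;                    /* wake the waiting task here  */
            }
            break;
        default:
            break;
        }
    }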