I like Arduino. I can bang out projects large or small ten times faster (or more) than when I was using PIC and assembly - and I'd consider myself "master level" at PIC assembly. But this is a good lesson: don't assume library code is perfect, does things the right way, or is even correct. Look at the code and see how, and how well, it does what it does. Trust it? Use it. Otherwise, dump it and find another way. If the library code can't be trusted, then the entire project isn't trustworthy.

Within seconds of looking at delayMicroseconds(), I could see it's based on a simple busy-wait loop. That makes it garbage for anything other than very short delays (roughly under 20 us), and even then only as "at least this much delay", never "exactly this much delay". The function is still very useful, for example when a device requires >= 2 microseconds between one operation and the next. I can also see that micros() is very different: it reads a running hardware timer and returns a microsecond timebase, so a delay measured against it isn't stretched by interrupts the way a busy-wait loop is. Building a longer delay (dozens to hundreds of microseconds) on top of it is easy - see the first sketch below.

But this should get anyone thinking... is an inline delay the best way to generate timing for a waveform? It's the easiest, but rarely the best way; the second sketch below shows one alternative.
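A micros()-based wait is only a few lines. Here's a minimal sketch of the idea (delayUs is my own name for it, not anything in the Arduino core):

```cpp
// Busy-wait against micros() instead of counting cycles.
// Unsigned subtraction makes the comparison safe across micros() rollover.
// Note: on a 16 MHz AVR, micros() ticks in 4 us steps, so this is only
// sensible for delays of dozens of microseconds and up.
static inline void delayUs(unsigned long us) {
  unsigned long start = micros();
  while (micros() - start < us) {
    // spin; interrupts can fire here without adding to the measured delay
  }
}
```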
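And for a waveform, the same timebase can schedule edges without blocking at all. A rough sketch (the pin number and period are made-up values, just to show the shape of it):

```cpp
const uint8_t wavePin = 8;               // hypothetical output pin
const unsigned long halfPeriodUs = 250;  // 2 kHz square wave, for example

unsigned long nextEdge;

void setup() {
  pinMode(wavePin, OUTPUT);
  nextEdge = micros() + halfPeriodUs;
}

void loop() {
  // Toggle when the scheduled edge time arrives. Advancing nextEdge by the
  // half period (instead of re-reading micros()) keeps the average frequency
  // from drifting even if an individual edge is a little late.
  if ((long)(micros() - nextEdge) >= 0) {
    digitalWrite(wavePin, !digitalRead(wavePin));
    nextEdge += halfPeriodUs;
  }
  // other work can go here without stalling the waveform
}
```

Even this still jitters with whatever else loop() and the interrupts are doing; for a really clean fixed-frequency output, a hardware timer or PWM peripheral beats either approach.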