Author Topic: Arduino mega tone() messing up with delayMicroseconds()  (Read 2594 times)


Offline Bendba (Topic starter)

  • Regular Contributor
  • *
  • Posts: 216
  • Country: au
Arduino mega tone() messing up with delayMicroseconds()
« on: May 27, 2017, 01:06:29 pm »
Hi,

I am trying to write some simple IR remote code for an Arduino Mega board: I generate the 38 kHz carrier on one pin with tone() and the mark/space timing on another pin, using an array for the timing data and delayMicroseconds().
The delay/array part works alright on its own, but when I enable tone(), the timing is way out, slowed down by a factor of approximately 2.32. The 38 kHz tone is spot on though.
I tried using different pins to have the two functions using two different timers but the result is always the same.

Any idea of what I'm doing wrong?

Thanks
Stop dreaming your life, start leaving your dreams.
 

Offline Moondeck

  • Regular Contributor
  • *
  • Posts: 142
  • Country: dk
  • i really like compilers
Re: Arduino mega tone() messing up with delayMicroseconds()
« Reply #1 on: May 27, 2017, 02:54:13 pm »
No code, no idea.
I'm selling 100ml bottles of free energy, PM me for pricing.
 

Online NorthGuy

  • Super Contributor
  • ***
  • Posts: 3146
  • Country: ca
Re: Arduino mega tone() messing up with delayMicroseconds()
« Reply #2 on: May 27, 2017, 03:18:58 pm »
I guess the delayMicroseconds() function loops the CPU for a certain number of cycles to get the specified delay.

At the same time, tone() may use timer interrupts. The cycles spent in the ISR for the timer interrupt are not counted by delayMicroseconds(), so it takes longer for it to complete. This produces longer delays.
 

Offline macboy

  • Super Contributor
  • ***
  • Posts: 2254
  • Country: ca
Re: Arduino mega tone() messing up with delayMicroseconds()
« Reply #3 on: June 01, 2017, 11:20:53 pm »
Quote from: NorthGuy on May 27, 2017, 03:18:58 pm
  I guess the delayMicroseconds() function loops the CPU for a certain number of cycles to get the specified delay.

  At the same time, tone() may use timer interrupts. The cycles spent in the ISR for the timer interrupt are not counted by delayMicroseconds(), so it takes longer for it to complete. This produces longer delays.

This is exactly what is happening. Look in hardware/arduino/avr/cores/arduino/wiring.c for the source to delayMicroseconds(). It's a dumb delay. I hate those.

The OP can instead implement his own delay function that waits until the return value of micros() has advanced by the desired delta. micros() is calculated from the system timer (the same one used to generate the once-per-millisecond interrupt), so it keeps tracking real time even while other ISRs are stealing CPU cycles. I've seen a granularity of 4 microseconds, which hopefully isn't too much of an issue. This could change according to clock speed and the specific Arduino MCU used (I was using an Uno at the time).
 

Offline Moondeck

  • Regular Contributor
  • *
  • Posts: 142
  • Country: dk
  • i really like compilers
Re: Arduino mega tone() messing up with delayMicroseconds()
« Reply #4 on: June 02, 2017, 08:16:19 am »
Stupid Ardooino is stupid. How about just using interrupts and/or timers like a sane human being? Because makers.
I'm selling 100ml bottles of free energy, PM me for pricing.
 

Offline macboy

  • Super Contributor
  • ***
  • Posts: 2254
  • Country: ca
Re: Arduino mega tone() messing up with delayMicroseconds()
« Reply #5 on: June 02, 2017, 01:52:40 pm »
I like Arduino. I can bang out projects large or small ten times faster (or more) than when I was using PIC and assembly - and I'd consider myself "master level" at PIC assembly.

But this is a good lesson: don't assume library code is perfect and does things the right way, or even correctly. Look at the code, see how and how well it does what it does. Trust it? Use it. Otherwise, dump it and find another way. If library code can't be trusted, then the entire project isn't trustworthy.

Within seconds of looking at delayMicroseconds(), I could see it's based on a simple delay loop. That means it is garbage for anything other than very small delays (as in ~<20 us), and even then only as "at least this much delay", never "exactly this much delay". The function is still very useful, for example when a device requires >=2 microseconds between this and that happening.

I can also see that micros() is very different: it returns a microsecond timebase derived from a hardware timer, so it stays correct regardless of how much CPU time interrupts are stealing. Building a longer delay (dozens to hundreds of microseconds) on top of it is easy. But this should get anyone thinking... is an inline delay the best way to generate timing for a waveform? It's the easiest, but rarely the best way.
 

Offline mac.6

  • Regular Contributor
  • *
  • Posts: 225
  • Country: fr
Re: Arduino mega tone() messing up with delayMicroseconds()
« Reply #6 on: June 07, 2017, 07:04:01 am »
Quote from: Moondeck on June 02, 2017, 08:16:19 am
  Stupid Ardooino is stupid. How about just using interrupts and/or timers like a sane human being? Because makers.

It's not stupid, it's a simple busy-wait API that doesn't use any dedicated hardware, just like the equivalents in almost every OS/framework on earth.
I guess the Arduino guys could be more precise in their API reference (i.e. it's a best-effort API that only guarantees at least x us of delay), but at least they do specify that interrupts are not disabled when calling it.
 

