Author Topic: R2/R DAC function generator - advice/am I insane?  (Read 3246 times)


Offline TriodeTiger

  • Regular Contributor
  • *
  • Posts: 200
  • Country: ca
R2/R DAC function generator - advice/am I insane?
« on: September 30, 2012, 12:55:00 am »
I hadn't built many beginner projects, and I thought it was time to make an R-2R ladder DAC. Alright.

Built the networks, 6 in total = 2^6 = 64 steps, and wired an op amp as a voltage follower in a moment (last time I had great trouble finding which pin was where, so great!)

Figured I'd just use an Arduino I had on hand, for binary output.

--------------------Functions!----------------------------

sawtooth: for (i = 0; i < 64; i++) PORTD = i;
sawtooth inverse: for (i = 63; i >= 0; i--) PORTD = i;
triangle: both of the above (clever!)
square: on and off...

sine wave: ugly table of integers, alright.

(PS: nice picture of what it looked like before I went insane below.)
---------------------------------------------------
Now to the good stuff: it can pump out clean waves at 1-10 kHz with µs delays on each step, or dirty but fast ones at ~30-60 kHz with no delays. Now how do I... say... order it "give me a 4 kHz sawtooth!"? ...or a 4.5 kHz sine?

----------------------------------------------------

I'm sometimes poor at math but it greatly interests me, so I just spewed out crazy possibilities:

Target: 1 kHz = 1000 µs period, and the bare sawtooth loop takes 330 cycles (330 / 16 MHz = 20.625 µs, verified on the delayed timebase and in AVR Studio's simulator to match! Yay!)

So... (1000 µs - 20.625 µs) / 64 = 15.303 µs of delay required per loop iteration (of i) to create a roughly 1 kHz sawtooth.

Okay...
so...

Code: [Select]
for (int i = 0; i < 64; i++) {
    PORTD = i;
    _delay_us(15);
}

Uh oh... 303 ns * 64 = ~19 µs skew

So ...there's no _delay_ns? okay... make one!

Code: [Select]
void _delay_ns(int delay) {
    /* careful: 1/F_CPU*1e9 does the integer division first and gives 0,
       so compute the per-NOP step (1e9 / F_CPU ns) the other way round */
    for (int i = 0; i < delay; i += (int)(1e9 / F_CPU)) {
        asm volatile("nop");
    }
}

Assuming NOP is one cycle... say F_CPU is 16 MHz, that's 62.5 ns granularity? But 0.062 µs is better than 1 µs.

So... this time I calculate exactly what I need... of course this depends on the 330 cycles staying static.
Code: [Select]
float delay_per_cycle = ((1.0 / freq) - (330.0 / F_CPU)) / 64 * 1000000.0;  /* µs per step */
int delay_per_cycle_us = (int)delay_per_cycle;
int delay_per_cycle_ns = (int)((delay_per_cycle - delay_per_cycle_us) * 1000);

I'm too tired to verify whether that is correct, or whether this even works on an 8-bit AVR with avr-gcc... but okay.

Code: [Select]
for (int x = 0; x < 64; x++) {   /* careful: 2^6 in C is XOR, i.e. 4, not 64 */
    PORTD = x;
    _delay_us(delay_per_cycle_us);
    _delay_ns(delay_per_cycle_ns);
}

Bam! Now my simulator test seems to make this a ~200-300 Hz signal rather than 1 kHz. Might be math errors somewhere; I'm not sure how to check what the numbers are in AVR Studio... guess I'll have to figure out the debugger a little better...

But is this insane?

How does a DDS control frequency while it's pumping out the waveform, from (I assume) a table in EEPROM? (Which I technically am doing for some of it.) But this way I can at least create functions based on... functions, such as my for loops, although I have to calculate their total cycles each time.

I'd be upset if I spent 5 hours on this project and no one thought it was crazy, or at least interesting, for a beginner (I've never touched AVR Studio or PORTD before!). :-)


EDIT: I realised the loop itself has more overhead than the NOP! That explains the lag, and brings the real granularity of that delay to ~300 ns, just like what I've read _delay_us(float < 1) et al. gives... so ideas for a better way would be neat.

Alexander.
« Last Edit: September 30, 2012, 01:00:59 am by MmCoffee »
"Yes, I have deliberately traded off robustness for the sake of having knobs." - Dave Jones.
 

Offline Pentium100

  • Frequent Contributor
  • **
  • Posts: 258
  • Country: lt
Re: R2/R DAC function generator - advice/am I insane?
« Reply #1 on: September 30, 2012, 01:35:29 am »
So ...there's no _delay_ns? okay... make one!

Code: [Select]
void _delay_ns(int delay) {
    /* careful: 1/F_CPU*1e9 does the integer division first and gives 0,
       so compute the per-NOP step (1e9 / F_CPU ns) the other way round */
    for (int i = 0; i < delay; i += (int)(1e9 / F_CPU)) {
        asm volatile("nop");
    }
}

Assuming NOP is one cycle... say F_CPU is 16 MHz, that's 62.5 ns granularity? But 0.062 µs is better than 1 µs.
EDIT: I realised the loop itself has more overhead than the NOP! That explains the lag, and brings the real granularity of that delay to ~300 ns, just like what I've read _delay_us(float < 1) et al. gives... so ideas for a better way would be neat.
Instead of making this a function, add this to your output loop (syntax will most likely be wrong):
Code: [Select]
while (i<del) i++;
Find out how many cycles one iteration takes. You can also try to use asm to optimize it (I won't, since I could only do it in x86 asm). Also, calculate "del" (the number of iterations you need) before starting the output.

Quote
How does a DDS control frequency when it's pumping out the wave form, from I assume EEPROM (which I technically am doing for some of it), but this way I can at least create functions based on...functions, such as my for loops, although I have to calculate its total cycles each time.
By skipping some samples. My USB DDS (Hantek 3x25 IIRC, too lazy to check now) has a few fixed sample rates (say 100 MHz, 200 MHz and some lower ones) and 4K of sample memory. The wanted frequency is made by changing the sample rate (in large steps), the number of cycles in the memory (at a 100 MHz sample rate the whole memory is read out at ~25 kHz, so one cycle in memory gives 25 kHz output, two cycles give 50 kHz), and how much of the memory is read (but only by a little, so that only whole cycles get read). When I set the output frequency, the DDS (or the API) tells me the number of cycles I should put in the memory and how much space I get to put them in.
 

Offline cloudscapes

  • Regular Contributor
  • *
  • Posts: 198
Re: R2/R DAC function generator - advice/am I insane?
« Reply #2 on: September 30, 2012, 02:46:57 am »
I'd use wavetables for all shapes. never worry about timing between them.
 

Offline AndreasF

  • Frequent Contributor
  • **
  • Posts: 251
  • Country: gb
    • mind-dump.net
Re: R2/R DAC function generator - advice/am I insane?
« Reply #3 on: September 30, 2012, 09:30:08 am »
You sort of already answered your own question: the delay loop takes longer than NOP, but I'm not sure if you realize why that's the case. Try to put yourself in the processor's shoes and think about what needs to happen on every iteration of the loop:

1) check the loop condition (i<delay)
2) do nothing for (presumably) one cycle
3) increment the iterator i (depending on whether the compiler was clever or not this may actually require that the size of the increment, 1/F_CPU*1e09, is calculated on every iteration as well, though probably not)

So the whole loop will take considerably longer. Steps 1 and 3 should of course always take the same amount of time, so if you have a way of determining that time you could simply adjust your delay calculation. However, your minimum delay length (and delay resolution) is still considerably longer than one CPU cycle.

Another option would be to use a timer with its interrupts. You'll still have some overhead (cycles to set up the timer and react to the interrupt), which limits your minimum delay time, but the available delay resolution should be considerably better than with a for or a while loop.

my random ramblings mind-dump.net
 

Offline miceuz

  • Frequent Contributor
  • **
  • Posts: 374
  • Country: lt
    • chirp - a soil moisture meter / plant watering alarm
Re: R2/R DAC function generator - advice/am I insane?
« Reply #4 on: September 30, 2012, 10:53:57 am »
There are a few more things the processor does when you call a function like _delay_ns: it pushes the current instruction address onto the stack, then pushes the function's argument onto the stack, then jumps to the function's address, where the argument is popped off the stack. After the delay loop it pops the return address off the stack and jumps back to it.

You can avoid this by declaring the function "inline": the compiler will then copy the function body into every place the function is called. Note that your code size will grow.

You can also try writing the delay routine as a macro, so the compiler puts lots of NOPs wherever a delay is required, but I can't remember the specifics right now.

alm

  • Guest
Re: R2/R DAC function generator - advice/am I insane?
« Reply #5 on: September 30, 2012, 12:10:39 pm »
I'm quite sure that _delay_us does a decent job of compensating for any overhead as long as optimization is turned on and you only pass it constant values. Don't expect to get better results by rolling your own. You can probably find tests on avrfreaks.net.
 

