Author Topic: DAC input Buffer?!  (Read 1887 times)


Offline xzswq21Topic starter

  • Frequent Contributor
  • **
  • Posts: 295
  • Country: 00
DAC input Buffer?!
« on: December 16, 2017, 10:32:11 pm »
Hello
I want to send digital data from an FPGA to a high-speed DAC.
As usual, the digital input data is noisy.
Does this affect the performance of the DAC? Is it necessary to place a buffer between the FPGA and the DAC?

Thanks
❤ ❤
 

Offline rhb

  • Super Contributor
  • ***
  • Posts: 3483
  • Country: us
Re: DAC input Buffer?!
« Reply #1 on: December 16, 2017, 10:55:17 pm »
A buffer won't help if the *data* are noisy.  If the signal is noisy, you need to look at the board layout.  You *might* need a driver buffer if the DAC is a long way from the FPGA.  If that's the case, a redesign should be considered.
 
The following users thanked this post: xzswq21

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 7738
  • Country: ca
Re: DAC input Buffer?!
« Reply #2 on: December 17, 2017, 12:08:38 am »
Which FPGA?
One extra parallel DFF buffer at the output pins in your FPGA code can clean things up dramatically, parallel skew timing included, especially if you instruct your compiler/fitter to use the IO-pin registers for that DFF.  Also look at the output drive current and slew-rate settings for those outputs; that can help too, since you may have an FPGA with 250 MHz-capable IOs but only be outputting data at 20 MHz or less.  And when clocking your FPGA, make sure you use a global clock network or an internal PLL.
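To see why the slew-rate and drive settings matter even at low data rates, remember that the spectral content of a digital edge is set by its rise time, not the bit rate. A quick sketch (the function name, the ~0.5/t_r "knee frequency" rule of thumb, and the example edge times are my own illustration, not from any datasheet):

```python
def knee_frequency_hz(rise_time_s: float) -> float:
    """Rule-of-thumb 'knee' of a digital edge's spectrum: f_knee ~= 0.5 / t_r.
    Above this frequency the edge's spectral content rolls off quickly."""
    return 0.5 / rise_time_s

# A fast IO left at full slew might produce ~0.5 ns edges even when
# the data only toggles at 20 MHz:
print(knee_frequency_hz(0.5e-9) / 1e9)  # spectral content out to ~1 GHz
print(knee_frequency_hz(2e-9) / 1e6)    # slowed to 2 ns edges: ~250 MHz
```

So backing off the drive current and slew rate trades unneeded edge speed for a quieter board.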

I've yet to make an Altera FPGA design in the past 10 years with any dirty outputs.  Now, if the noise is created by your PCB design, that is something else you need to take care of, and you still shouldn't need any sort of buffer unless you are converting something like a 1.2 V FPGA output to a 5 V signal for the DAC input.
 
The following users thanked this post: xzswq21

Offline xzswq21Topic starter

  • Frequent Contributor
  • **
  • Posts: 295
  • Country: 00
Re: DAC input Buffer?!
« Reply #3 on: December 17, 2017, 06:36:30 am »
Quote from: BrianHG on December 17, 2017, 12:08:38 am
Which FPGA?
One extra parallel DFF buffer at the output pins in your FPGA code can clean things up dramatically, parallel skew timing included, especially if you instruct your compiler/fitter to use the IO-pin registers for that DFF.  Also look at the output drive current and slew-rate settings for those outputs; that can help too, since you may have an FPGA with 250 MHz-capable IOs but only be outputting data at 20 MHz or less.  And when clocking your FPGA, make sure you use a global clock network or an internal PLL.

I've yet to make an Altera FPGA design in the past 10 years with any dirty outputs.  Now, if the noise is created by your PCB design, that is something else you need to take care of, and you still shouldn't need any sort of buffer unless you are converting something like a 1.2 V FPGA output to a 5 V signal for the DAC input.

I'm using a Xilinx Zynq.
The DAC is 200 MSPS.
I have followed many layout tips on my PCB to reduce noise and EMI.
I said the data is noisy because I used an ordinary switching power supply for the FPGA.
Now, following your answer: should I use one extra parallel DFF buffer at the output pins in my FPGA code?

Is it necessary to use a low-noise voltage regulator for the FPGA IO?
❤ ❤
 

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 7738
  • Country: ca
Re: DAC input Buffer?!
« Reply #4 on: December 17, 2017, 07:10:04 am »
Quote from: xzswq21 on December 17, 2017, 06:36:30 am
I'm using a Xilinx Zynq.
The DAC is 200 MSPS.
I have followed many layout tips on my PCB to reduce noise and EMI.
I said the data is noisy because I used an ordinary switching power supply for the FPGA.
Now, following your answer: should I use one extra parallel DFF buffer at the output pins in my FPGA code?

Is it necessary to use a low-noise voltage regulator for the FPGA IO?
Depending on your code, you can set the output pins to use their dedicated registers.  If this slows your FMAX below what you need, add the extra clocked DFF, though remember that your DAC data will then be pipeline-delayed by one clock.  The idea is to get your output registers right at the IO pins on the die, rather than having the outputs routed from somewhere else on the die, as that can make the parallel timing across the data lines look unrefined on the scope.  IE, an ugly eye pattern differing from one data bit to the next...

Any old regulator is OK for the IO pins as long as you are using a solid, unbroken ground plane and a decoupling cap at each VDDIO/VCCIO pin.  The only exception is that you might need a cleaner supply for the analog PLL supply pins, but usually you can get away with an extra series inductor/ferrite and a large cap for that pin...

Also, read up here: https://www.google.ca/search?source=hp&ei=EBg2Wu-AEKfJjwSDnLXoBA&q=Xilinx+ZYNQ+pll&oq=Xilinx+ZYNQ+pll&gs_l=psy-ab.3..35i39k1l2.2613.2613.0.3908.3.2.0.0.0.0.77.77.1.2.0....0...1c.2.64.psy-ab..1.2.157.6...80.7CVgFFV5iZs
Make sure that your noisy data isn't caused by noisy PLL settings generating jitter.  Also use the dedicated PLL clock output to feed your DAC clock, as well as using the internal PLL output to feed your logic.
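For a feel of how much clock jitter matters at these speeds, the standard jitter-limited SNR formula can be evaluated directly (a sketch; the function name is mine):

```python
import math

def jitter_limited_snr_db(f_in_hz: float, rms_jitter_s: float) -> float:
    """SNR ceiling imposed by sampling-clock jitter alone, for a full-scale
    sine at f_in: SNR = -20*log10(2*pi*f_in*t_jitter)."""
    return -20.0 * math.log10(2.0 * math.pi * f_in_hz * rms_jitter_s)

# 1 ps RMS jitter caps a 40 MHz output at roughly 72 dB SNR;
# 10 ps drops that to roughly 52 dB:
print(jitter_limited_snr_db(40e6, 1e-12))
print(jitter_limited_snr_db(40e6, 10e-12))
```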
« Last Edit: December 17, 2017, 07:12:24 am by BrianHG »
 
The following users thanked this post: xzswq21

Offline xzswq21Topic starter

  • Frequent Contributor
  • **
  • Posts: 295
  • Country: 00
Re: DAC input Buffer?!
« Reply #5 on: December 17, 2017, 12:10:21 pm »
Quote from: BrianHG on December 17, 2017, 07:10:04 am
Depending on your code, you can set the output pins to use their dedicated registers.  If this slows your FMAX below what you need, add the extra clocked DFF, though remember that your DAC data will then be pipeline-delayed by one clock.  The idea is to get your output registers right at the IO pins on the die, rather than having the outputs routed from somewhere else on the die, as that can make the parallel timing across the data lines look unrefined on the scope.  IE, an ugly eye pattern differing from one data bit to the next...

Any old regulator is OK for the IO pins as long as you are using a solid, unbroken ground plane and a decoupling cap at each VDDIO/VCCIO pin.  The only exception is that you might need a cleaner supply for the analog PLL supply pins, but usually you can get away with an extra series inductor/ferrite and a large cap for that pin...

Also, read up here: https://www.google.ca/search?source=hp&ei=EBg2Wu-AEKfJjwSDnLXoBA&q=Xilinx+ZYNQ+pll&oq=Xilinx+ZYNQ+pll&gs_l=psy-ab.3..35i39k1l2.2613.2613.0.3908.3.2.0.0.0.0.77.77.1.2.0....0...1c.2.64.psy-ab..1.2.157.6...80.7CVgFFV5iZs
Make sure that your noisy data isn't caused by noisy PLL settings generating jitter.  Also use the dedicated PLL clock output to feed your DAC clock, as well as using the internal PLL output to feed your logic.

I think your idea is similar to a debouncing technique :) For input push buttons I have usually used, for example, 8 DFFs with NOT/AND gates to debounce the input command.
Anyhow,
I have used a low-jitter XO on my board, the supply for the clock distributor is very low-noise, and the ground plane is unified and solid.
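The DFF-chain debouncer mentioned above can be modeled in a few lines of software to check its behavior (a sketch of the general technique, not the exact gate arrangement):

```python
from collections import deque

class Debouncer:
    """Model of an N-stage DFF debouncer: the output changes to 1 only after
    N consecutive 1 samples (the AND of the shift register), and back to 0
    only after N consecutive 0 samples (the NOR)."""
    def __init__(self, stages: int = 8):
        self.history = deque([0] * stages, maxlen=stages)
        self.state = 0

    def clock(self, sample: int) -> int:
        self.history.append(sample)  # shift in the new sample
        if all(self.history):
            self.state = 1
        elif not any(self.history):
            self.state = 0
        return self.state

# A bouncy press settles only once 4 consecutive samples agree:
d = Debouncer(stages=4)
print([d.clock(s) for s in [1, 0, 1, 1, 1, 1]])  # → [0, 0, 0, 0, 0, 1]
```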
❤ ❤
 

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 7738
  • Country: ca
Re: DAC input Buffer?!
« Reply #6 on: December 17, 2017, 09:56:48 pm »
Maybe your problem is scoping.  Are you using JFET low-capacitance active probes when probing?  Are you using a tiny ground spring at the probe tip instead of the ground lead?  In the world above 200 MHz, these things count.

If you are getting bit errors at the DAC, try inverting or changing the phase of your CLK signal.  This is a common fix for basic designs in the 200 MHz region.
 

Offline xzswq21Topic starter

  • Frequent Contributor
  • **
  • Posts: 295
  • Country: 00
Re: DAC input Buffer?!
« Reply #7 on: December 18, 2017, 07:20:02 am »
Quote from: BrianHG on December 17, 2017, 09:56:48 pm
Maybe your problem is scoping.  Are you using JFET low-capacitance active probes when probing?  Are you using a tiny ground spring at the probe tip instead of the ground lead?  In the world above 200 MHz, these things count.

If you are getting bit errors at the DAC, try inverting or changing the phase of your CLK signal.  This is a common fix for basic designs in the 200 MHz region.

Thanks for your attention and remarks.
Actually, the DAC is 200 MSPS, the clock is LVDS, and the DAC output driver is near the DAC. The maximum output frequency is 40 MHz.
My concern is noise coupling between the FPGA and the DAC: when I analyze the SNR of the output signal, I want it to be high.
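As a sanity bound on what SNR is even attainable, the ideal quantization-limited figure for an N-bit converter is worth keeping in mind (a standard rule of thumb; the function name and the 14-bit example are my own assumptions, since the DAC's resolution isn't stated here):

```python
def ideal_snr_db(bits: int) -> float:
    """Quantization-limited SNR of an ideal N-bit converter for a
    full-scale sine: SNR = 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

# A 14-bit DAC can do no better than roughly 86 dB, before clock
# jitter and digital noise coupling eat into it:
print(ideal_snr_db(14))
```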
❤ ❤
 

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 7738
  • Country: ca
Re: DAC input Buffer?!
« Reply #8 on: December 18, 2017, 08:21:38 pm »
Quote from: xzswq21 on December 18, 2017, 07:20:02 am
Thanks for your attention and remarks.
Actually, the DAC is 200 MSPS, the clock is LVDS, and the DAC output driver is near the DAC. The maximum output frequency is 40 MHz.
My concern is noise coupling between the FPGA and the DAC: when I analyze the SNR of the output signal, I want it to be high.

Ahhhh, are you asking about keeping digital switching noise from making it from the FPGA through your DAC to the output, and you are only operating at 40 MHz?  I thought you were getting data errors from pushing data and clock speed right to the edge of your components' capabilities.

I've been answering with the wrong advice if this is your true goal.

I would start by setting your FPGA outputs to their minimum drive current if you are only driving 40 MHz directly between the FPGA and DAC.  This will soften everything up on your scope immediately.
Secondly, if your FPGA still hammers the data lines too hard, I would add series resistors, something like 100 Ω, on each data line and the clock, as close to the FPGA as possible.  This will greatly reduce EMI in the area between the FPGA and DAC, so signal bounce due to the DAC's internal input capacitance will no longer be fed by the fast rise and fall spikes coming out of the FPGA.

This also alleviates reflection spikes/overshoot on the data lines caused by really fast outputs on the FPGA side.
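Sizing the series resistor can be sketched as simple source termination: the driver's output impedance plus the resistor should roughly match the trace impedance (the function name and the ~15 Ω driver figure are illustrative assumptions, not measured values):

```python
def source_termination_ohms(trace_z0: float, driver_z: float) -> float:
    """Series source-termination resistor: R_s = Z0 - R_driver, so the
    driver plus resistor together match the trace's characteristic impedance."""
    return max(trace_z0 - driver_z, 0.0)

# A 50-ohm microstrip driven by a ~15-ohm FPGA output wants ~35 ohms in
# series; 33 or 47 ohms are the usual nearby standard values:
print(source_termination_ohms(50.0, 15.0))  # → 35.0
```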

Try keeping the extra DFFs at the output; this still helps at low speed as well.

« Last Edit: December 18, 2017, 10:28:53 pm by BrianHG »
 
The following users thanked this post: xzswq21

Offline xzswq21Topic starter

  • Frequent Contributor
  • **
  • Posts: 295
  • Country: 00
Re: DAC input Buffer?!
« Reply #9 on: December 18, 2017, 10:43:03 pm »

Quote from: BrianHG on December 18, 2017, 08:21:38 pm
I would start by setting your FPGA outputs to their minimum drive current if you are only driving 40 MHz directly between the FPGA and DAC.  This will soften everything up on your scope immediately.

Why?
I want to produce a signal from a few kHz up to 40 MHz, and the SFDR and SNR parameters are very important to me (it's a medical application).

Quote from: BrianHG on December 18, 2017, 08:21:38 pm
Secondly, if your FPGA still hammers the data lines too hard, I would add series resistors, something like 100 Ω, on each data line and the clock, as close to the FPGA as possible.  This will greatly reduce EMI in the area between the FPGA and DAC, so signal bounce due to the DAC's internal input capacitance will no longer be fed by the fast rise and fall spikes coming out of the FPGA.

This also alleviates reflection spikes/overshoot on the data lines caused by really fast outputs on the FPGA side.

Try keeping the extra DFFs at the output; this still helps at low speed as well.

Yes, I agree with you; I have seen this technique on Analog Devices and TI evaluation boards.
❤ ❤
 

Offline BrianHG

  • Super Contributor
  • ***
  • Posts: 7738
  • Country: ca
Re: DAC input Buffer?!
« Reply #10 on: December 18, 2017, 11:19:50 pm »

Quote from: xzswq21 on December 18, 2017, 10:43:03 pm
Quote from: BrianHG on December 18, 2017, 08:21:38 pm
I would start by setting your FPGA outputs to their minimum drive current if you are only driving 40 MHz directly between the FPGA and DAC.  This will soften everything up on your scope immediately.

Why?
I want to produce a signal from a few kHz up to 40 MHz, and the SFDR and SNR parameters are very important to me (it's a medical application).

Note that this is a software equivalent of using series resistors, though not as flexible, and some FPGAs just can't do it.  Only give each IO the drive it needs to cleanly achieve your operating FMAX under all temperature conditions.  For example, if I were using an Altera Cyclone IV to drive a 200 MHz DAC at 200 MHz, with nothing else on the data bus, I would use 4 or 8 mA drive, not 2, 16 (the default), or 24 mA.  I would use 16 or 24 mA when driving the data lines of one or two 500 MHz DDR2/3 RAM chips.

As for the series resistors: yes, 90% of Analog Devices' high-speed converters clocked at 150 MHz and above, especially ADCs, do have series resistors on all their eval boards to mitigate signal bounce crossing from the digital side to the analog side.  The resistors should go as close to the digital output as possible.  For a 200 MHz clock, I would use 47 Ω resistors instead of 100 Ω.  You may use 4x series resistor packs, and you may also tune the resistor value to your design after the PCB is built.

 
The following users thanked this post: xzswq21

