Author Topic: Audio line-out into an AMP, attenuation, impedance and other fun stuff


Offline PeteAU (Topic starter)

  • Contributor
  • Posts: 15
Hi,

I'm trying to design a simple little audio circuit, but there's more to it than meets the eye (at least for me).

I have an audio DAC outputting 2.1 Vrms at full scale, followed by an amplifier that cannot tolerate such a high signal.
I'm trying to attenuate the signal to around 0.42 Vrms using just a resistor divider and a cap to ground as a low-pass filter.

Specs:
DAC: Outputs 2.1 Vrms, with 1K source impedance.
Amp: About 55K input impedance (I think!); I measured how much it "loads" down my signal and calculated this from that. Needs 0.42 Vrms.

What do you think, will it work? Will there be any unexpected effects, e.g. from impedance mismatch?
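Here's the back-of-envelope math I'm working from, as a quick Python sketch. The R1/R2/C values are just examples I'm playing with (not a final design), and the 55K amp impedance is only my estimate:

import math

V_DAC  = 2.1       # Vrms, DAC full-scale output
R_SRC  = 1_000.0   # ohm, DAC source impedance
R_LOAD = 55_000.0  # ohm, my estimate of the amp's input impedance
R1     = 7_500.0   # ohm, series resistor (example value, not final)
R2     = 2_200.0   # ohm, shunt resistor (example value, not final)
C      = 1e-9      # F, shunt cap (example value, not final)

def parallel(a, b):
    return a * b / (a + b)

# Divider gain including the DAC source impedance and the amp loading
r_shunt = parallel(R2, R_LOAD)
gain = r_shunt / (R_SRC + R1 + r_shunt)

# -3 dB corner of the RC low-pass: the cap sees the Thevenin
# resistance at the shunt node
r_thev = parallel(R_SRC + R1, r_shunt)
f_c = 1.0 / (2.0 * math.pi * r_thev * C)

print(f"gain = {gain:.3f} -> {V_DAC * gain:.3f} Vrms into the amp")
print(f"low-pass corner ~ {f_c / 1000:.1f} kHz")

With those example values the amp gets about 0.42 Vrms and the corner comes out around 94 kHz, well above the audio band.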
 

Offline TMM

  • Frequent Contributor
  • Posts: 471
  • Country: au
Attenuating like this can introduce distortion, because the amp's input current draw might not be completely linear with the signal voltage, and that nonlinear current flowing through the divider's output impedance modulates the signal.

The ideal solution is to place a low-output-impedance buffer between the attenuator and the amp. However, if you don't require ultra-low distortion (much lower than even high-end speakers are capable of resolving), the plain divider should work just fine.
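To put a rough number on that, here's a quick Python sketch. The ±10% swing in input impedance over the signal cycle is a made-up figure just to show the mechanism, and the divider values are the example ones from the first post:

R_SRC, R1, R2 = 1_000.0, 7_500.0, 2_200.0  # ohm, example divider values
R_LOAD_NOM = 55_000.0                      # ohm, nominal amp input impedance

def parallel(a, b):
    return a * b / (a + b)

def gain(r_load):
    # Divider ratio seen by the amp for a given amp input impedance
    r_shunt = parallel(R2, r_load)
    return r_shunt / (R_SRC + R1 + r_shunt)

g0 = gain(R_LOAD_NOM)
for swing in (0.9, 1.1):
    g = gain(R_LOAD_NOM * swing)
    print(f"R_in = {R_LOAD_NOM * swing / 1000:.1f}k: gain shifts by {100 * (g - g0) / g0:+.2f}%")

Even that fairly large hypothetical impedance swing only moves the divider ratio by a few tenths of a percent; a buffer takes it to essentially nothing.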
 

Offline Richard Crowley

  • Super Contributor
  • Posts: 4317
  • Country: us
  • KJ7YLK
I don't understand the use of any of the capacitors in your proposed circuit. You didn't present any need for frequency shaping in your description.

I would simply use a 10K log-taper ("audio") pot on the input of the amplifier. Then it would handle a wide range of levels, and it would be operationally adjustable. That is actually the more "normal" way of handling this.

With a 1K source impedance, any further impedance buffering seems superfluous to me.
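For a feel for where the wiper would sit with your numbers, here's a quick Python sketch. The positions are fractions of the pot's resistance track (electrical travel), not knob rotation, since a log taper maps rotation to resistance non-linearly, and the 55K amp impedance is just your estimate:

R_SRC, R_POT, R_LOAD = 1_000.0, 10_000.0, 55_000.0  # ohm

def parallel(a, b):
    return a * b / (a + b)

def gain(x):
    # x = wiper position, 0..1, measured from the grounded end of the pot
    r_bot = parallel(x * R_POT, R_LOAD)
    r_top = (1.0 - x) * R_POT
    return r_bot / (R_SRC + r_top + r_bot)

for x in (0.10, 0.23, 0.50, 1.00):
    print(f"wiper at {x:.2f}: gain = {gain(x):.3f} -> {2.1 * gain(x):.2f} Vrms")

The 0.42 Vrms you want lands around a quarter of the way up the track, with plenty of adjustment range either side.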
« Last Edit: June 22, 2014, 06:27:22 am by Richard Crowley »
 

