Hi all,
I came across a very nice video on YT where a simple VU meter driver circuit is presented.
My meters are very similar with regard to their full-scale current (around 0.5 mA, i.e. 500 uA), but I want them to be accurate in their actual VU indication (e.g. 1.23 Vrms should read 0 VU).
As shown in the video, the 0 VU mark on these meters is around 240-250 uA, and the maximum reading (around +5 VU) is about 500 uA, which should correspond to roughly a 2.2 Vrms input signal.
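Here's the quick arithmetic I did as a sanity check (Python; this assumes the meter current tracks the input Vrms linearly through the rectifier, which may not be exactly true):

```python
# Sanity check: if full scale (500 uA) lands at 2.2 Vrms and the
# rectified meter current is proportional to input Vrms, what
# current should a 0 VU input (1.23 Vrms) produce?
FULL_SCALE_UA = 500.0    # meter full-scale current, from the video
FULL_SCALE_VRMS = 2.2    # input level that should give full scale

def meter_current_ua(vrms):
    """Meter current in uA for a given input Vrms (linear assumption)."""
    return FULL_SCALE_UA * vrms / FULL_SCALE_VRMS

print(round(meter_current_ua(1.23)))  # -> 280
```

That lands near the 240-250 uA 0 VU mark, so the scaling seems at least plausible.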
What I want is my input signal to be displayed correctly on the meters as per this table:
http://www.cranesong.com/Volts%20to%20dBu%20to%20VU%20Comparison.pdf
In the video circuit, the signal needs to be around 200 mV for the meter to reach full scale. Since 200 mV is about 10 times less than my input signal will be, I thought the circuit would work as-is if I adjusted the second opamp's gain.
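For reference, the table values come straight from the dBu definition (0 dBu = 0.775 Vrms, and 0 VU = +4 dBu on a standard VU meter), which is easy to check:

```python
def dbu_to_vrms(dbu):
    """dBu is dB relative to 0.775 Vrms (1 mW into 600 ohms)."""
    return 0.775 * 10 ** (dbu / 20)

# 0 VU corresponds to +4 dBu:
print(round(dbu_to_vrms(4), 3))   # -> 1.228 Vrms
# Top of my meter's scale (~+5 VU, i.e. +9 dBu):
print(round(dbu_to_vrms(9), 2))   # -> 2.18 Vrms
```

So the 1.23 Vrms and ~2.2 Vrms figures above line up with the table.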
I changed it to attenuate the signal instead (the original video has a gain of 1 + 2 = 3), so I initially tried to make the level about 10 times lower: I put a resistor from the positive terminal of the second opamp to v_gnd to form a voltage divider at its input, and made the negative-feedback path resistors equal.
I used 1k to form a 1/10 signal attenuation network before the signal hits the non-inverting opamp.
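This is the divider math I was working from (quick Python check; the 9k top-leg value is my assumption for getting 1/10 with a 1k bottom leg, and it ignores loading from the peak detector's source impedance):

```python
def divider_ratio(r_top, r_bottom):
    """Vout/Vin for an unloaded resistive divider."""
    return r_bottom / (r_top + r_bottom)

# For 1/10 attenuation with a 1k bottom leg, the top leg must be 9k:
print(divider_ratio(9_000, 1_000))   # -> 0.1
# If both legs were 1k, the attenuation would only be 1/2:
print(divider_ratio(1_000, 1_000))   # -> 0.5
```

The opamp's non-inverting input barely loads the divider, so the unloaded formula should be close.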
But the circuit did not work as expected: I was getting way too much current on the output.
I then thought VCC could be higher if needed, since my meter's bulbs need about 10 V to light up, so I raised VCC to 10 V.
Again, the needle deflects further than it should for a given signal level.
I'm confused as to where the actual gain control in this circuit is.
Is it the second stage?
Or should I do something before the signal even hits the peak detector?
Since I will be using a single supply I can't think of any way to do that without clipping the opamp.
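To put numbers on the clipping worry (Python; this assumes a sine input and a virtual ground at mid-rail):

```python
import math

def vpeak(vrms):
    """Peak voltage of a sine wave with the given RMS value."""
    return vrms * math.sqrt(2)

# Maximum input (~+5 VU, 2.2 Vrms) as a peak voltage:
print(round(vpeak(2.2), 2))   # -> 3.11 V peak
```

With a single 6 V supply and v_gnd at 3 V there is only about 3 V of swing on each side, so a 3.11 V peak would already clip before any gain, which is why I figure the attenuation has to happen before the signal hits the peak detector.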
Can anyone help?
Can I use this circuit to display true VU values?
Can I also use it with say a single +12 supply instead of 6V?
I attached the circuit in question and the changes I performed.