Hi everyone,
I work in a laser lab measuring short light pulses with a photodiode (PD). The setup is simple: a laser shines a train of light pulses onto the PD; each pulse lasts 50 fs and the next pulse arrives at the PD 0.2 ms later (so something like a PWM signal at 5 kHz with a 2.5e-10 duty cycle). The PD is reverse-biased with an internal battery (manual linked below), so it should put out a short current spike for every light pulse. To read the spikes, we plug the PD into a scope with a 2 m RG58 50 Ω coax cable and use the scope's input impedance as the termination resistor.
As expected, there are some differences in signal shape between the 50 Ω and 1 MΩ input-impedance settings, and yes, I'd expect some low-pass filtering when you combine the junction and cable capacitance with a big termination resistor, but not like this:
Fig. 1: 50 Ω input impedance. Yellow trace; pulses marked with vertical cursors for visibility.
Fig. 2: 1 MΩ input impedance. Yellow trace.
The first image shows the waveform for a 50 Ω input impedance and the second for 1 MΩ. As expected, the second one has a higher level at the expense of a very long decay time, but what rubs me the wrong way is that the rise time is almost instant in comparison! I would expect a low-pass filter to be independent of "direction" (i.e., rising vs. falling). What do you think? My guess is that the photodiode is playing some role here, kind of like how an attack/release synthesizer module uses diodes to charge and discharge a capacitor through different resistors, but the specifics are lost on me.
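To make that guess concrete, here's a quick numerical sketch of my mental model (every component value below is a guess for intuition, not taken from the manual): treat the PD as a current source that dumps a packet of charge into the total capacitance almost instantly, then let that capacitance discharge through the scope's input resistance.

```python
from math import exp

# All values are guesses for intuition, not from the PD manual:
C_junction = 2e-12           # PD junction capacitance (guess)
C_cable    = 100e-12 * 2     # 2 m of RG58 at ~100 pF/m
C_scope    = 15e-12          # scope input capacitance (guess)
C = C_junction + C_cable + C_scope

R = 1e6                      # scope input impedance (1 MOhm setting)
Q = 1e-12                    # charge delivered per light pulse (pure guess)

tau = R * C                  # discharge time constant through the termination
V_peak = Q / C               # the 50 fs pulse charges C essentially instantly
print(f"tau = {tau * 1e6:.0f} us, V_peak = {V_peak * 1e3:.1f} mV")

# How much of the spike is left when the next pulse arrives 0.2 ms later?
for t in (0, 50e-6, 100e-6, 200e-6):
    v = V_peak * exp(-t / tau)
    print(f"t = {t * 1e6:5.0f} us: v = {v * 1e3:.2f} mV")
```

With those numbers τ = RC is a couple hundred µs, comparable to the 0.2 ms pulse spacing, so if that picture is right it would explain both the near-instant rise (the charge lands on C with essentially no series resistance) and the long tail in Fig. 2.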
Anyway, my team and I would appreciate any insight into this phenomenon and how to solve it, since in the future we'll have to use 10 m cables and I fear cable capacitance will make the problem worse.
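For scale, this is the back-of-envelope that worries me (a lumped-element approximation, assuming the usual ~100 pF/m for RG58; values worth double-checking):

```python
# Lumped-element back-of-envelope; assumes ~100 pF/m for RG58
for length_m in (2, 10):
    C_cable = 100e-12 * length_m
    for R in (50, 1e6):
        tau = R * C_cable
        print(f"{length_m:2d} m into {R:>7.0f} ohm: tau = {tau:.1e} s")
```

At the 1 MΩ setting, a 10 m cable would push τ to roughly 1 ms, i.e., five pulse periods, so the spikes would pile up completely if this lumped picture holds.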