I'm trying to build a voltage clipping circuit that limits my signal voltage using diodes. Essentially, if the signal swings too high or too low, the corresponding diode turns on and clips the voltage at its turn-on value. The goal is to suppress voltage spikes from environmental noise. In my case, I would like to clip the signal to a +/-2 V range.
Now, looking online, the concept seems simple: a pair of antiparallel diodes clips the signal at their forward voltage. This can be done with or without a DC bias voltage. I'm trying to do it without a bias because I don't have one available, so I'm relying on the diodes' forward voltage alone.
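For reference, the unbiased topology I'm describing is essentially the following (a simplified sketch, not my actual schematic; the source, resistor, and model values are placeholders):

```spice
* Unbiased antiparallel diode clipper (simplified sketch)
V2 in 0 SINE(0 5 1k)   ; 5 V amplitude sine, 1 kHz (placeholder values)
R1 in IN2 1k           ; series resistance so the diodes can drop the excess voltage
D1 IN2 0 DX            ; conducts on positive excursions, clamps IN2 high
D2 0 IN2 DX            ; conducts on negative excursions, clamps IN2 low
.model DX D            ; placeholder diode model
.tran 0 5m
.end
```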
However, I'm having issues with my circuit simulation. I've attached the LTspice schematic below, along with a picture of my test circuit. I've also included the SPICE model of the diode I'm using. The diode is the BY203-20STR, with a maximum forward voltage of 2.4 V at 200 mA. Based on that, the 5 V sinusoidal signal from V2 should, after passing through my filter, be clipped at node IN2 to a maximum/minimum of +2.4 V and -2.4 V respectively. Instead, after the filter, the voltage gets clipped to -0.73 V and +0.73 V, much less than I was expecting. Is it possible that I built my circuit incorrectly? I don't understand why it isn't working the way it should. I've tried this with several other SPICE models, but none of them seem to work properly either.
For reference, the SPICE macromodel consists of two diodes in parallel (from my understanding), one modeling the forward-bias behavior and the other the reverse-bias behavior. I copied the .MODEL statements from the macromodel directly into my schematic so I wouldn't have to build a model myself. I've also used .MODEL statements for other diodes in this schematic, and those seemed to work, though I'm taking them with a grain of salt: I extracted the model parameters from the datasheets myself, so they may not be completely accurate.
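To show how I wired the copied statements together, the macromodel's structure as I understand it is roughly this (the subcircuit name and the parameter lists are placeholders, not the vendor's actual values; I replaced the hyphen with an underscore since hyphens aren't valid in SPICE names):

```spice
* Rough structure of the vendor macromodel as I understand it
.SUBCKT BY203_20STR A K
DF A K DFWD          ; forward-bias diode, anode to cathode
DR K A DREV          ; antiparallel diode modeling reverse-bias behavior
.MODEL DFWD D(...)   ; parameters copied from the vendor's .MODEL statement
.MODEL DREV D(...)
.ENDS BY203_20STR
```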