Not sure if this is a beginner's question, but I certainly feel embarrassed asking it.
4 years of engineering school and I have no idea what could be causing this behaviour!
Anyway, as mentioned in the title, I've simply tried to measure the capacitance of a "47nF" ceramic cap, by connecting it up to a 3.3V square wave through a resistor, measuring RC, then dividing by R.
And, really oddly, my measurements showed that the time constant was either not the direct product of R and C, or that C was changing significantly depending on R.
I took multiple measurements using multiple methods, multiple capacitors, and multiple resistors, over multiple days. The methods were: automated software that detects the time between the 2.5 V and 1 V crossings and divides it by R·ln(2.5); doing the same calculation by hand; and timing how long the signal took to fall to 37% of its initial value after it first started dropping.
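For reference, the maths behind that threshold-timing method boils down to the sketch below (in Python, with a hypothetical helper function and made-up example numbers, not my actual measurement script):

```python
import math

def capacitance_from_decay(dt, r, v1=2.5, v2=1.0):
    """Estimate C from the time dt (seconds) an exponential decay takes to
    fall from v1 to v2 through a series resistance r (ohms).

    V(t) = V0 * exp(-t / (R*C)), so the time between the two threshold
    crossings is dt = R*C*ln(v1/v2), giving C = dt / (R * ln(v1/v2)).
    With v1 = 2.5 V and v2 = 1 V this is the "divide by R*ln(2.5)" step.
    """
    return dt / (r * math.log(v1 / v2))

# Made-up example: an ideal 47 nF cap with R = 10 kOhm has tau = 470 us,
# so the 2.5 V -> 1 V fall time would be tau * ln(2.5) ~= 430.7 us.
dt = 470e-6 * math.log(2.5)
c = capacitance_from_decay(dt, 10e3)
print(c)  # recovers ~47e-9 F
```

The 37%-of-initial-value check is the same idea with v1/v2 = e, i.e. dt is one time constant, so C = dt/R directly.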
Here is an example.
All of them showed the same thing:
The higher the resistance went, the lower the capacitance went!
The measured capacitance would be roughly the same as the rated capacitance when R = 50k to 100k, and it would be about double when R = 1k to 10k.
I understand that the components are non-ideal (they're voltage-dependent, have a tolerance, and there's parasitic L and R, at least), but can I really expect the measured capacitance to double just by swapping a 50k series resistor for a 5k one? Is that a normal thing that happens, or am I going insane!?
Thanks,
~Chris
(P.S. Yes, the resistor is in series with the cap, and the cap's other leg goes to GND.)