I tried out a new 3.3V LCD display, and the datasheet says around 0.1V is a good voltage for the contrast pin.
https://www.newhavendisplay.com/nhdc0220aafswftw-p-3886.html

The datasheet shows the contrast voltage being subtracted from VCC, which I find odd.
My typical method for testing contrast on an LCD is to blink a cursor. If the contrast isn't ideal, you can usually see it this way: depending on the contrast voltage, the cursor takes too long to transition from unlit to lit, or too long from lit to unlit.
I hooked the contrast pin up to my AWG and played around with it; the ideal seems to be around -100mV. 0V is fine. At +100mV the cursor starts to look more off than on, +200mV is worse, and at +300mV pixels in other characters that aren't changing start to dim. At -200mV the cursor looks more on than off, and at -300mV unlit pixels start to darken.
So -300mV to +300mV is the range of what I would call a valid adjustment. Ideally one would have a pot covering that range, or perhaps I could generate the voltage with a PWM and RC network like I did on another project.
Is there an easy way to make -0.3V to +0.3V from a 3.3V VCC? I'm wondering if it's worth the effort. Maybe just grounding the contrast pin would be a suitable solution most of the time...