Here's my try at a TRNG. Note this is not for any larger endeavor. I'm merely pulling at the threads of entropy for further understanding.
The majority of the circuit (attached) is an LM324 amplifying its own noise, which is fed into a basic PIC micro for packaging and shipping. Initially I was simply collecting the LSB from each ADC sample, but that produced a fair amount of bias. I tried a chip with a 10-bit ADC and got similar results.
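To make the LSB bias concrete, here's a minimal sketch of that first approach. The sample values are synthetic stand-ins (the real ones come off the PIC's ADC), and the function names are mine, not from the actual firmware:

```python
def lsb_bits(samples):
    """Extract the least significant bit from each ADC sample."""
    return [s & 1 for s in samples]

def bias(bits):
    """Fraction of ones minus 0.5; zero means a perfectly balanced stream."""
    return sum(bits) / len(bits) - 0.5

# Synthetic, deliberately skewed readings standing in for real ADC output.
samples = [10, 11, 10, 10, 12, 11, 10, 13]
bits = lsb_bits(samples)
print(bias(bits))  # a non-zero value here is exactly the bias problem
```

On real hardware the skew comes from the ADC's transfer function and the noise source's DC offset, so simply taking more samples doesn't make it go away.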
So I wrote a program which generates each random bit by comparing the ADC sample to a pre-calculated threshold, effectively using the ADC as a comparator. But this on its own is not enough. So the program also keeps track of bias and adjusts a PWM output accordingly, giving the noise circuit's virtual ground a very slight nudge until it's roughly balanced.
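The feedback loop can be simulated in a few lines. This is not the PIC firmware, just a hedged sketch of the idea: Gaussian noise stands in for the amplified LM324 output, the threshold and window size are assumptions, and the "PWM" is modelled as a direct DC offset on the virtual ground:

```python
import random

THRESHOLD = 512   # assumed midpoint of a 10-bit ADC
PWM_STEP = 1      # size of each virtual-ground nudge (hypothetical)
WINDOW = 256      # bits between bias checks (hypothetical)

def run(n_bits, dc_offset=30):
    """Threshold-compare noise samples, nudging a PWM to cancel a DC offset."""
    pwm = 0
    ones = count = 0
    bits = []
    for _ in range(n_bits):
        # Simulated noisy sample; the PWM offset pulls the mean back to centre.
        sample = random.gauss(THRESHOLD + dc_offset - pwm, 40)
        bit = 1 if sample >= THRESHOLD else 0
        bits.append(bit)
        ones += bit
        count += 1
        if count == WINDOW:          # periodic bias check
            if ones > WINDOW // 2:
                pwm += PWM_STEP      # too many ones: lower the virtual ground
            elif ones < WINDOW // 2:
                pwm -= PWM_STEP      # too many zeros: raise it
            ones = count = 0
    return bits, pwm

random.seed(0)
bits, pwm = run(50_000)
print(sum(bits) / len(bits))  # drifts toward 0.5 as the PWM converges
```

The nice property of this scheme is that the correction is mean-reverting: if the PWM overshoots, the resulting bias pushes it straight back, so it hunts around the true offset rather than wandering off.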
The result is a bit stream which still has small localised bias, but tends toward a perfect 50/50 as more is collected. It's also important to note there is no post-processing done on the bit stream (whitening, von Neumann, etc.). I use a couple of Python scripts to do some very basic analysis on the output (tally bits and hex characters, basic patterns). Nothing approaching the diehard tests, which require a few GB of sample data and me knowing where to start with them.
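For what it's worth, the tally part of such an analysis script is only a few lines. This is my own reconstruction of that kind of check, not the author's actual script; it takes the stream as a string of '0'/'1' characters:

```python
from collections import Counter

def bit_tally(bitstream):
    """Count zeros and ones in a string of '0'/'1' characters."""
    c = Counter(bitstream)
    return c['0'], c['1']

def hex_tally(bitstream):
    """Group bits into nibbles and count each resulting hex character."""
    nibbles = [bitstream[i:i + 4] for i in range(0, len(bitstream) - 3, 4)]
    return Counter(format(int(n, 2), 'x') for n in nibbles)

stream = "1101001010110100"
print(bit_tally(stream))  # (zeros, ones)
print(hex_tally(stream))  # each hex digit's frequency
```

A flat hex histogram is a slightly stronger check than the raw bit tally, since it also catches some short-range correlations that a perfect 0/1 balance can hide.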