Wow, amazing.
My patent was a lot simpler and less clever. It was just a photon counter. Well, to be honest it was a way of using up CCD sensors that had failed inspection. A CCD from a (then) high-resolution camera would be put beneath a diffuser. The diffuser, and so the CCD, would be exposed to the light source. The contents of the device would then be clocked out into a capacitor, and the voltage on the capacitor read by a comparator. The process would continue until the threshold was passed, at which point the count would stop and provide the digital value: a value that was essentially a count of the photons. Not individual photons, granted, given the noise performance of the CCD sensors, but that is the essence. Dodgy cells could be skipped, which was why reject CCDs could be used. The patent has long since expired, and I have no idea if the company ever made one. It is irrelevant now anyway; I would assume there aren't surplus reject high-quality CCDs anymore.
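If it helps to picture it, here is a rough toy model of the readout in Python -- all numbers made up, and it glosses over the analog details (charge transfer efficiency, comparator offsets and so on):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model of the scheme described above: the diffuser spreads the light
# over the whole CCD, each usable well collects some charge, and wells are
# clocked out one at a time onto a summing capacitor until a comparator
# threshold is crossed.  The number of clock cycles needed is the output.

n_wells = 100_000
bad_fraction = 0.03                  # reject CCD: a few dodgy wells
mean_charge_per_well = 0.2           # proportional to the light level
threshold = 1_000.0                  # comparator trip point (arbitrary units)

good = rng.random(n_wells) > bad_fraction
charge = rng.poisson(mean_charge_per_well, n_wells).astype(float)
charge = charge[good]                # dodgy wells are simply skipped at readout

def clocks_to_threshold(charge, threshold):
    """Clock wells onto the capacitor until the comparator trips."""
    cap = 0.0
    for count, q in enumerate(charge, start=1):
        cap += q
        if cap >= threshold:
            return count
    return None                      # never tripped: not enough light

print("clock count at threshold:", clocks_to_threshold(charge, threshold))
```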
Ahh, I see. So there's a way in which it could be said to skip an A-to-D by being digital in the first place -- but since it's still going through a capacitor it isn't, really, and "wouldn't it be cool if it were?" is more where you were going with this thread, I guess?

Yeah, I can see that.
Such systems would be better called "quantum", since QM isn't really digital itself but stochastic in practice: not sharply timed and digital, but noisy and uncertain, and those are really just alternative ways of saying "analog noise" in the limiting (smallest-signal) case, give or take the exact physics of the system in question (not all systems exhibit full shot noise, for example).
Put another way: digital is a strict subset of analog where we define thresholds for '1' and '0' (or any other values), but if we dial those thresholds down into the noise floor it hardly matters any more; down there, counting bit statistics tells us as much as the analog spectrum does. Quantum is a superset of analog, I suppose, is the point.
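A throwaway sketch of that point, if you want to see it in numbers: threshold two nearby analog levels buried in noise and watch the bit statistics (Gaussian noise and made-up numbers, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

noise_rms = 1.0
n = 1_000_000

for level in (0.0, 0.3):                       # two analog signal levels
    samples = level + noise_rms * rng.standard_normal(n)
    for threshold in (5.0, 1.0, 0.0):          # comparator setting
        ones = (samples > threshold).mean()
        print(f"level {level}, threshold {threshold}: fraction of 1s = {ones:.4f}")

# Far above the noise the comparator just says '0' for both levels; with the
# threshold down in the noise floor, the fraction of 1s (a Bernoulli
# statistic) still distinguishes the two analog levels -- the information is
# just expressed as counting statistics instead of a voltage.
```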

There are indeed methods nowadays where, for example, an avalanche photodiode array can be paired with individual (per-pixel) counters, resolving photon counts directly (assuming illumination low enough that the chance of overlapping events is arbitrarily small). In the long-exposure limit this gives an image nearly identical to a regular one, but the interesting part is what's enabled by more involved statistical methods, like L1-norm filtering. I still have to read up on that, but loosely speaking, and as I understand it, it hasn't been done much before because it's computationally expensive; in the meantime we've trivialized that, so we can now get shockingly good image formation from rather modest photon counts.
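Since I haven't read the actual papers yet, the following is only a generic illustration of the flavour of thing, not the real algorithms: simulate per-pixel Poisson photon counts for a toy 1-D scene, stabilise the variance, and soft-threshold coefficients in a sparsifying basis (soft-thresholding being the proximal step of an L1 penalty). The Fourier basis, the Anscombe transform and the threshold value are all my own stand-ins:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 1-D "scene": a smooth intensity profile, a handful of photons per
# pixel on average, as you might get from per-pixel counters at low light.
n = 256
x = np.linspace(0.0, 1.0, n)
scene = 5.0 * (1.0 + np.sin(2.0 * np.pi * 3.0 * x)) + 1.0   # true photon rate

counts = rng.poisson(scene)          # each pixel's photon count

# Anscombe transform: approximately stabilises Poisson variance to ~1.
y = 2.0 * np.sqrt(counts + 3.0 / 8.0)

# L1-regularised denoising in a sparsifying basis (here: Fourier).  The
# proximal operator of lam*||c||_1 is elementwise soft-thresholding of the
# coefficients -- the basic step inside most L1 reconstruction schemes.
C = np.fft.rfft(y, norm="ortho")
lam = 1.5
mag = np.abs(C)
C_shrunk = C * np.maximum(mag - lam, 0.0) / np.maximum(mag, 1e-12)
y_hat = np.fft.irfft(C_shrunk, n=n, norm="ortho")

# Invert the Anscombe transform to get back to estimated photon rates.
rate_hat = (y_hat / 2.0) ** 2 - 3.0 / 8.0

print("RMS error, raw counts:  ", np.sqrt(np.mean((counts - scene) ** 2)))
print("RMS error, L1-denoised: ", np.sqrt(np.mean((rate_hat - scene) ** 2)))
```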
Tim