Characterization of Clocks and Oscillators - NIST
https://www.nist.gov/sites/default/files/documents/calibrations/tn1337.pdf
Parsing some of the materials. Just for the docs: found it in full (the quoted link holds the ToC):
https://tf.nist.gov/general/pdf/868.pdf
https://nvlpubs.nist.gov/nistpubs/Legacy/TN/nbstechnicalnote1337.pdf

So far, some categories emerged (from all the suggestions):
- Quartz crystals
- Re-entrant cavities
- Coaxial and Helical resonators
Then, there are generic topics like
- Waves, Resonators, etc. (e.g. Feynman's Lectures on Physics)
- Clocks and oscillators
- Filters
- Crystal filters
Then, there is simulation vs. hands-on experiments. Well, at least here I have some progress, though it is not the experiment I would have wanted, but a simulation instead.
For a demonstration, I would stick with lumped L and C in an appropriate frequency range, since variable Cs are easy to find and connect. Crystals have very high Q, but a much narrower tuning range. In the demonstration example I discussed above, it was useful to tune the free resonant frequency of one L-C pair over a relatively wide frequency range.
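As a sanity check on the tuning range, the resonant frequency of a lumped L-C pair follows f0 = 1/(2π√(LC)). The component values below are just illustrative assumptions, not the ones from the actual demo:

```python
import math

def f0(L, C):
    """Resonant frequency of a lumped L-C pair, in Hz."""
    return 1 / (2 * math.pi * math.sqrt(L * C))

# Assumed values for illustration: a 10 uH coil with a 10..100 pF variable cap.
lo = f0(10e-6, 100e-12)   # maximum C -> lowest frequency, ~5.03 MHz
hi = f0(10e-6, 10e-12)    # minimum C -> highest frequency, ~15.9 MHz
print(round(lo / 1e6, 2), round(hi / 1e6, 2))
```

A 10:1 capacitance swing gives a √10 ≈ 3.16:1 frequency swing, which is the "relatively wide range" that a crystal, with its fixed frequency and tiny pulling range, can't match.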
I made a tool that takes the data file produced by an LTspice simulation, and turns the ".step" directives from the schematic into a live slider that I can drag with the mouse.
For each position of the slider, a new .step dataset of results is taken from the LTspice simulation results and slapped on the chart. Since the datasets were pre-calculated by LTspice, switching datasets on the plotted chart is very fast, much like changing the capacitor or the coupling in an experimental setup while watching the circuit's response live on a spectrum analyzer.
As a result, there is a slider for "Cvar" and there is a slider for "Coupling factor". I found it very educational to see live, by dragging a slider, how the frequency peaks slide left and right when changing C, or move apart when changing K. Looks crude if not lame, but that slider plot was just enough for me to get a grasp of the phenomena. Some 3D plots might make sense, too, for better visualization.
I'm not a software developer, but IMO it would be a very useful thing to polish the .step-to-slider tool and make it a standalone visualization tool that can plot interactive parametric simulations from LTspice.
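For anyone tempted to take a stab at it, a minimal sketch of the parsing half might look like the code below. It assumes the plain-text export you get from LTspice's File > Export data as text, where each .step run is introduced by a "Step Information:" line; the function name is mine, and the complex/dB columns of an AC export are simply skipped in this sketch:

```python
from collections import OrderedDict

def parse_ltspice_steps(text):
    """Split an LTspice ASCII data export into one dataset per .step run.

    Returns an ordered mapping: step label -> list of (x, y) floats.
    Assumes the usual layout: a header line, then blocks each introduced
    by a 'Step Information: ...' line.
    """
    datasets = OrderedDict()
    label = "run 1"                         # exports without .step have no marker lines
    datasets[label] = []
    for line in text.splitlines()[1:]:      # skip the header line
        if line.startswith("Step Information:"):
            label = line[len("Step Information:"):].strip()
            datasets[label] = []
            continue
        parts = line.split()
        if len(parts) >= 2:
            try:
                datasets[label].append((float(parts[0]), float(parts[1])))
            except ValueError:
                pass                        # complex/dB entries are skipped in this sketch
    return OrderedDict((k, v) for k, v in datasets.items() if v)
```

With this in hand, the "live slider" is one matplotlib Slider whose callback swaps the plotted line's data for the dataset at the selected index. No re-simulation happens, just swapping pre-computed runs, which is why it feels instantaneous.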
That being said about quantum computing, I don't see why, or where exactly, a quantum particle or quantum behavior is mandatory. It should work with macroscopic objects just fine, and this is what I want to try.
You can't reproduce superposition. There is no macroscopic 2 state system that can be in both of those states at once.
I would be curious to know what made you say so. Is this because this is what you were told it is, or is it some conclusion you drew for yourself? And how did you arrive at adopting your current position? (These questions are not an attempt to intimidate you, or to start a flame war; I'm asking because I am genuinely interested in what makes people in general, and you in particular, agree with something or not. Again, not curious about you, but about the process of accepting an explanation or not. Please try to answer those questions only if you don't mind, of course, or maybe in a PM if you don't want to make that public.)
Back to the superposition, I'll say I can have that. For example, when I speak, my voice produces many frequencies at once, a full spectrum of them. That's a superposition of many tones. In fact, I'll dare to say that most macroscopic systems are in many states at once.
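To make the voice analogy concrete, here is a classical signal in "two states at once" in exactly this sense: two tones summed, both plainly present in the spectrum at the same time (the frequencies and amplitudes are arbitrary picks):

```python
import numpy as np

fs = 1000                          # sample rate, Hz
t = np.arange(0, 1, 1 / fs)        # one second of samples
# Two tones at once: a classical superposition of states.
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.abs(np.fft.rfft(x)) / len(x)
freqs = np.fft.rfftfreq(len(x), 1 / fs)
peaks = freqs[spectrum > 0.1]      # both components show up simultaneously
print(peaks)                       # both 50 Hz and 120 Hz appear
```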
The magic and the fundamental difference to macroscopic systems is that you can do tricks to select results from this combination of all possible results in a somewhat deterministic way.
My problem is that I fail to see any magic when I look under the hood, but rather misinterpretation and/or poetic/philosophical wording of well-known physics: the physics of waves.
Then, there is the misinterpretation of imaginary numbers from math. They are used a lot when it comes to waves and oscillations, but there's nothing imaginary there. The physical forces (e.g. in a mechanical oscillator), or voltages (in an electrical oscillation), or whatever, those physical measures are real. In the real world there's no such thing as the imaginary part of a force, or the imaginary part of an electric current, in the way a complex number X = a + bi has an imaginary part imag(X) = b.
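This point, that the complex number is bookkeeping rather than physics, can be checked numerically: the measurable waveform is just the real part of the rotating phasor, and the "imaginary part" never shows up in it (amplitude, phase, and frequency below are arbitrary):

```python
import numpy as np

A, phi = 2.0, np.pi / 4            # amplitude and phase, arbitrary
w = 2 * np.pi * 60                 # angular frequency, arbitrary
t = np.linspace(0, 0.1, 500)

phasor = A * np.exp(1j * (w * t + phi))   # complex bookkeeping device
signal = A * np.cos(w * t + phi)          # the real, measurable quantity

# The physics lives entirely in the real part of the phasor.
assert np.allclose(phasor.real, signal)
```

The imaginary part is still useful, of course: it carries the phase information that makes the algebra of adding and differentiating oscillations easy. But nothing in the lab ever reads it out directly.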
I've tried to dig into the historic aspect of all the bamboozling related to QM. Leaving aside the Copenhagen interpretation vs. Bohm's guiding wave (some say the Copenhagen interpretation won only because Bohm's political orientation did not align well with those times, but I won't go there), the final debate that "settled" the dispute about quantum mechanics was whether entanglement was really "spooky action at a distance", or just "a predefined pair of gloves", so nothing acts at a distance and entanglement is nothing but a prearranged pair of something that was predetermined from the very beginning.
Well, there were many experiments trying to settle that, some quite complicated, with bamboozling terms like "time erasing" and so on. At some point there came what is now called the Bell test, based on Bell's theorem. Bottom line: if you measure straight lines, then entanglement is a "predetermined pair of gloves"; if you see a curved shape, like a bell (sic), then entanglement is "spooky action at a distance".
Source: Wikipedia
https://en.wikipedia.org/wiki/Bell%27s_theorem

Well, they measured the curved blue line, so it was "settled" that the quantum weirdness is "for real". Well, not everybody was happy with that reasoning. Some showed how that blue curve can arise just as well if you mix some noise into the classical "pair of gloves", so entanglement's spooky action at a distance is not necessary at all to explain the blue curve that was measured experimentally.
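For reference, the two shapes being argued about can be written down directly. In the photon-pair version of the plot from the Wikipedia article, the quantum prediction for the correlation vs. analyzer-angle difference is a cosine curve, while the simplest local "pair of gloves" model gives a straight line; they agree only at the endpoints:

```python
import numpy as np

theta = np.linspace(0, np.pi / 2, 91)   # analyzer angle difference, 0..90 deg
quantum = np.cos(2 * theta)             # the curved QM prediction
classical = 1 - 4 * theta / np.pi       # straight-line local hidden-variable model

# Both models agree at 0 and 90 degrees...
assert np.isclose(quantum[0], classical[0])     # both +1
assert np.isclose(quantum[-1], classical[-1])   # both -1
# ...but differ in between; Bell-test experiments measure which shape is real.
gap = np.max(np.abs(quantum - classical))
print(round(gap, 2))                    # -> 0.21
```

The whole experimental argument boils down to which of these two curves the measured correlations trace out, and whether a noisy classical model can mimic the cosine.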
So yeah, right now I'm inclined to believe superposition is as common as the superposition of a head/tail coin while in the air, and the entanglement is nothing but synchronous oscillations with some phase noise.
Apart from that, another thing I disagree about is how a quantum computer is considered advantageous over a digital one for certain algorithms, and where this advantage is supposed to come from. I think it is calculated all wrong, but that's a different story.
That's a billion Dollar statement right there.
Not sure I understand what you're aiming at there. My disagreement was about how Big O was applied to quantum computing in order to compare various algorithms run on a QC with equivalent algorithms run on a digital computer.
The analog nature of the number representation in a quantum computer was ignored, and so was its parallelism. Then it was compared with a run on a digital computer, which is essentially a binary and serial machine. That seems an unfair comparison, making digital computers appear much worse than they really are. If they were to have analog bits instead of binary digits, and were to perform all the operations at once instead of in a serial manner, then quantum computing wouldn't be that advantageous after all.
The thing is, vintage analog computers were doing just that: they were operating with real numbers instead of bits and serial arithmetic, but that doesn't scale well because of the limited resolution and the noise present in analog processing. That's why we all use digital computers now, even though analog computers were invented first.
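The scaling problem has a simple back-of-the-envelope form: with full-scale range R and RMS noise n, an analog value carries only about log2(R/n) bits of information, so every extra bit of precision demands halving the noise. The numbers below are illustrative assumptions:

```python
import math

def analog_bits(full_scale, noise_rms):
    """Rough effective resolution of a noisy analog quantity, in bits."""
    return math.log2(full_scale / noise_rms)

# Assumed: a 1 V signal range with 1 mV of RMS noise.
print(round(analog_bits(1.0, 1e-3), 1))   # -> 10.0
```

Getting ten more bits would require microvolt-level noise across the whole machine, while a digital machine just adds more (perfectly noise-immune) bits. That, in a nutshell, is why digital won.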
Later edit: a few typo fixes