
Digital clocks and A/D converters


Circlotron:
Most digital clocks have a one-minute display resolution. That means that even when set right they can be at times  :) up to 60 seconds in error. I propose that all digital clocks should be set with a 30 second offset so they click over at the half-minute and are therefore at most plus or minus 30 seconds in error. Same deal as an A/D converter having a 1/2 LSB offset so it spans either side of any given value. Who is with me?
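
A quick sketch of the arithmetic behind the proposal (Python; the numbers and names are purely illustrative, not from the thread): a display that truncates to the minute lags the true time by anywhere from 0 to 60 s, while one set with a 30 s offset stays within plus or minus 30 s, analogous to the 1/2 LSB offset in an A/D converter.

    import random

    def truncating_error(t):
        # Display shows the minute that has already started, so it lags
        # the true time by 0..60 s.
        return t - (t // 60) * 60

    def offset_error(t):
        # Display rolls over at the half-minute, so it is never more than
        # 30 s away from the true time (positive = display behind).
        return ((t + 30) % 60) - 30

    samples = [random.uniform(0, 3600) for _ in range(100_000)]
    trunc = [truncating_error(t) for t in samples]
    offs = [offset_error(t) for t in samples]

    print("truncating:  max %.1f s, mean %.1f s" %
          (max(abs(e) for e in trunc), sum(abs(e) for e in trunc) / len(trunc)))
    print("30 s offset: max %.1f s, mean %.1f s" %
          (max(abs(e) for e in offs), sum(abs(e) for e in offs) / len(offs)))

Truncation gives an error that is always in the same direction and averages about 30 s; the half-minute offset both halves the worst case and centres the error around zero.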

tom66:
If the average error is 30 seconds in both cases, who loses from either option?

tggzzz:
I propose they should all be Vetinari clocks.

Benefit: that will get people to instinctively understand the difference between precision and accuracy, and that all instruments lie to a greater or lesser extent.

Examples:
https://entertaininghacks.wordpress.com/2015/02/23/vetinari-digital-clock/
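
As an aside, a minimal sketch of why such a clock is a nice teaching aid (assumed behaviour, not taken from the linked project): the individual tick intervals are erratic (poor precision), but they are generated so that the long-run rate is exactly one tick per second (good accuracy).

    import random

    def vetinari_intervals(n, period=1.0, jitter=0.4):
        # Draw intervals in compensating pairs (period + d, period - d) so the
        # clock's accumulated time never drifts from true time by more than
        # `jitter` seconds, however erratic each individual tick looks.
        intervals = []
        for _ in range(n // 2):
            d = random.uniform(-jitter, jitter)
            intervals += [period + d, period - d]
        return intervals

    ticks = vetinari_intervals(10_000)
    print("mean interval: %.6f s" % (sum(ticks) / len(ticks)))            # accurate: ~1.000000
    print("min/max interval: %.3f / %.3f s" % (min(ticks), max(ticks)))   # imprecise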

BrokenYugo:
All you need to set one of those to within a few seconds is a calibrated clock with a seconds readout; I use time.gov or a cell phone. Set the clock a minute or two behind, then advance it to the correct time just as the new minute begins on the cal clock, and quickly exit set mode.

SiliconWizard:
I dunno what kind of digital clock you've been using. Many these days display seconds, which can also be set.
But even when only the minutes can be set, the non-stupid ones just start at 0 seconds when you validate the new setting. Meaning you only need to validate the setting right as the minute changes on your reference clock, and you end up with an error of only a couple hundred milliseconds, unless your reflexes are compromised.
