Author Topic: Frequency standard use in audio equipment  (Read 6349 times)


Offline tkamiyaTopic starter

  • Super Contributor
  • ***
  • Posts: 2178
  • Country: us
Frequency standard use in audio equipment
« on: May 11, 2021, 07:31:21 pm »
I'm asking because I'm genuinely curious....

I have been seeing reference oscillators of all kinds, including Rb, OCXO, and GPSDO, marketed for audio use.  Some reports say they can "clearly hear the difference."  I'm thinking that for syncing digital recording, etc., a regular crystal or maybe a TCXO is sufficient, and that the human ear can't possibly distinguish the difference in phase and frequency stability.

Is this yet another case of snake oil?  Some of these go for US$4k or so.
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
Re: Frequency standard use in audio equipment
« Reply #1 on: May 11, 2021, 07:48:37 pm »
It's no different from fancy expensive speaker cables and other audiophool nonsense. The placebo effect is real: if you believe some gadget will make your audio sound better, then it will sound better to you. A reference oscillator would have to be REALLY bad to have any sort of audible effect though, and it's unlikely the oscillators in the gear used to record and master the audio were anything exotic either.
 
The following users thanked this post: xrunner

Offline Gyro

  • Super Contributor
  • ***
  • Posts: 9505
  • Country: gb
Re: Frequency standard use in audio equipment
« Reply #2 on: May 11, 2021, 07:52:55 pm »
Master clocks and distribution are used in recording studios for equipment synchronisation I believe. I don't know to what accuracy though.

For consumer audio, the reference oscillators you mentioned are total snake oil. What does matter is noise and jitter. Better quality gear tends to have better internal clock oscillators, starting from the basic on-chip inverter oscillator through to better quality canned ones and those using discrete components. These (all things being equal) do seem to improve sound quality. Different DACs have different jitter rejection, and likewise the clock recovery in things like S/PDIF receiver ICs.

Absolute frequency accuracy is totally inaudible though, even with the cheapest and most basic crystal oscillator.
Best Regards, Chris
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11260
  • Country: us
    • Personal site
Re: Frequency standard use in audio equipment
« Reply #3 on: May 11, 2021, 08:10:31 pm »
Clearly hearing the difference is not enough. Can you measure it with measurement equipment? If not, then you are a full-on audiophool.

And for most audio applications a $1 crystal is sufficient.
Alex
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14475
  • Country: fr
Re: Frequency standard use in audio equipment
« Reply #4 on: May 11, 2021, 08:29:37 pm »
Master clocks and distribution are used in recording studios for equipment synchronisation I believe. I don't know to what accuracy though.

Yes, but synchronizing several digital sources and the frequency stability of a single source are two completely different things.

The typical use case for an end-user is just a single source. So there's usually nothing to synchronize.

Of course, as said above, for a single source, jitter matters much more than absolute frequency. We'd need to know whether those obscenely expensive oscillators at least have much better jitter figures than modest ones. We can suppose so, but that's really the main point to consider.

Now whether the typical jitter of even a modest crystal-based oscillator (as long as it's not pure crap) has any measurable impact on distortion (harmonic/phase), that remains to be seen. It may be measurable if you directly measure the output of a DAC with a very clean output stage, but once it goes through some kind of amplifier and speakers (or headphones), all bets are almost certainly off. The latter will have a much more severe effect on phase and harmonic distortion than the digital clock.

And even when it's actually measurable, which as said above is likely not the case with a typical audio system, whether it can actually be "heard" by normal ears is yet another question. The most sensible answer is probably: not at all.

But as we say on a regular basis, as long as said equipment has measurable technical benefits, even if it makes no difference to the ears, then it's just a matter of luxury IMO. If you want luxury equipment, that's your call. OTOH, if there is absolutely NO measurable technical difference, then it's snake oil, which is a different category, although certainly some people won't be able to tell the difference. But I like the "luxury" argument. If you're buying a $20k watch made with very good engineering and great materials, it may not give you much more accurate time than a $20 watch, but it IS luxury. Now if you're buying a $20k watch that is really just a rebranded $20 one, that is just a rip-off.

 

Offline tkamiyaTopic starter

  • Super Contributor
  • ***
  • Posts: 2178
  • Country: us
Re: Frequency standard use in audio equipment
« Reply #5 on: May 11, 2021, 08:55:33 pm »
Even on jitter, though, a regular plain crystal oscillator has decent figures.  Can one actually hear the difference if, say, the 32 kHz tuning-fork quartz (the watch kind) or whatever they use in cheap gear were replaced with a good quality one?  (Or vice versa: replace a good one with a bad one?)

If jitter were the problem, certain Rb units wouldn't be usable.  I can even see their jitter using a simple oscilloscope.
 

Offline Gyro

  • Super Contributor
  • ***
  • Posts: 9505
  • Country: gb
Re: Frequency standard use in audio equipment
« Reply #6 on: May 11, 2021, 09:03:25 pm »
For audio gear, you tend to be talking at least 11.2896 MHz, 16.9344 MHz, 27 MHz (AV), 48 MHz or faster depending on sample rate, not down at 32 kHz. At those sorts of frequencies, it's more down to the oscillator design and decent RF layout; you'd never hear the difference between individual crystals (unless faulty).
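Those master-clock figures are simply the sample rate times a standard multiplier; a quick sanity check in Python (the 256x/384x multipliers are the usual convention, not taken from any particular device in this thread):

```python
# Common audio master clocks are standard multiples of the sample rate
# (256x and 384x are typical choices for DAC/ADC master clocks).
for fs in (44100, 48000):
    for mult in (256, 384):
        print(f"{fs} Hz x {mult} = {fs * mult / 1e6:.4f} MHz")

# The 44.1 kHz family reproduces the figures quoted above:
assert 44100 * 256 == 11_289_600   # 11.2896 MHz
assert 44100 * 384 == 16_934_400   # 16.9344 MHz
```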

I mentioned above "(all things being equal)". Of course they never are. Better gear tends to have more attention to PCB layout and routing, component quality, grounding, PSU noise etc., in addition to 'better' clock circuits.

Yes, some of the snake oil alternatives are clearly inferior in the parameters that matter.
« Last Edit: May 11, 2021, 09:08:23 pm by Gyro »
Best Regards, Chris
 

Offline tkamiyaTopic starter

  • Super Contributor
  • ***
  • Posts: 2178
  • Country: us
Re: Frequency standard use in audio equipment
« Reply #7 on: May 11, 2021, 09:05:03 pm »
So....  having bad crystals doesn't turn Waltz into Polka?  :scared:
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11260
  • Country: us
    • Personal site
Re: Frequency standard use in audio equipment
« Reply #8 on: May 11, 2021, 09:05:35 pm »
Jitter of the regular crystal oscillators should be perfectly acceptable. Again, as soon as a person does not accept a double-blind test, they can be dismissed as an audiophool. Let them pay $4K for a clock source and $8K for a mains cable. Suckers need to be milked.
Alex
 

Offline tkamiyaTopic starter

  • Super Contributor
  • ***
  • Posts: 2178
  • Country: us
Re: Frequency standard use in audio equipment
« Reply #9 on: May 11, 2021, 09:07:56 pm »
Master clocks and distribution are used in recording studios for equipment synchronisation I believe. I don't know to what accuracy though.
But as we say on a regular basis, as long as said equipment has measurable technical benefits, even if it makes no difference to the ears, then it's just a matter of luxury IMO. If you want luxury equipment, that's your call. OTOH, if there is absolutely NO measurable technical difference, then it's snake oil, which is a different category, although certainly some people won't be able to tell the difference. But I like the "luxury" argument. If you're buying a $20k watch made with very good engineering and great materials, it may not give you much more accurate time than a $20 watch, but it IS luxury. Now if you're buying a $20k watch that is really just a rebranded $20 one, that is just a rip-off.

I used to collect watches.  One of the things about luxury watches is that they'll always have some value.  Most will depreciate but some appreciate.  Like you say, cheap OCXO housed in a fancy box is probably worth nothing.  Unless of course it gets sold to another like minded people.
 

Offline penfold

  • Frequent Contributor
  • **
  • Posts: 675
  • Country: gb
Re: Frequency standard use in audio equipment
« Reply #10 on: May 11, 2021, 09:13:16 pm »
Was it not the introduction of dither on the sampling clock which made CD audio viable? Or am I wrong on that?
I seem to recall that the measurable sampling noise in audio-type signal digitisation is worse without dither... and a little drift never hurt anyone.

As a poignant point on audio pragmatism: it is currently lashing down with rain outside, with constant rumbles of thunder, and I can barely tell whether my CD player is switched on, let alone what type of oscillator it contains.
 

Offline Bassman59

  • Super Contributor
  • ***
  • Posts: 2501
  • Country: us
  • Yes, I do this for a living
Re: Frequency standard use in audio equipment
« Reply #11 on: May 11, 2021, 09:30:55 pm »
Was it not the introduction of dither on the sampling clock which made CD audio viable? Or am I wrong on that?

No. It was better mastering for the medium and better converters.

Dither is used when taking, say, 24-bit words down to 16 bits, instead of simply truncating.
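A minimal sketch of that requantization step (hypothetical code for illustration, using plain TPDF dither; real mastering tools typically use noise-shaped dither, as discussed later in the thread):

```python
import random

LSB16 = 1 << 8  # one 16-bit LSB, expressed in 24-bit counts (256)

def to_16_bit(sample24, dither=True):
    # Reduce a 24-bit sample to 16 bits. Adding ~1 target-LSB of
    # triangular-PDF (TPDF) dither before rounding turns the quantization
    # error into benign noise instead of signal-correlated distortion.
    if dither:
        sample24 += random.randint(-LSB16 // 2, LSB16 // 2) \
                  + random.randint(-LSB16 // 2, LSB16 // 2)
    # Round to nearest and clamp to the signed 16-bit range.
    return max(-32768, min(32767, (sample24 + LSB16 // 2) >> 8))
```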

 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14475
  • Country: fr
Re: Frequency standard use in audio equipment
« Reply #12 on: May 11, 2021, 11:41:15 pm »
Was it not the introduction of dither on the sampling clock which made CD audio viable? Or am I wrong on that?

No. It was better mastering for the medium and better converters.

Dither is used when taking, say, 24-bit words down to 16 bits, instead of simply truncating.

Yep. And this is what needs to be done when mastering CDs these days, as most recording is now 24-bit. Of course, a lot more than this needs to be done, such as pretty "drastic" compression, so that the end result can still be listened to on small speakers/crap headphones.
 

Offline vk6zgo

  • Super Contributor
  • ***
  • Posts: 7588
  • Country: au
Re: Frequency standard use in audio equipment
« Reply #13 on: May 12, 2021, 12:37:01 am »
Human hearing is remarkably forgiving.
For years, movie soundtracks recorded at 24 f.p.s. film rates were replayed on TV at 25 f.p.s., shifting all audio frequencies by around 4.2%.
Did anyone notice?
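That shift is easy to put in musical terms; a quick calculation (the semitone figure is my own arithmetic, not from the post):

```python
import math

speedup = 25 / 24                    # 24 fps film replayed at 25 fps
pct = (speedup - 1) * 100            # frequency shift in percent
semitones = 12 * math.log2(speedup)  # the same shift as a musical interval

print(f"{pct:.2f}% faster, about {semitones:.2f} semitones sharp")
```

So the 25/24 speedup raises everything by roughly 4.17%, or about 0.7 of a semitone.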
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14475
  • Country: fr
Re: Frequency standard use in audio equipment
« Reply #14 on: May 12, 2021, 02:07:30 am »
Human hearing is remarkably forgiving.
For years, movie soundtracks recorded at 24 f.p.s. film rates were replayed on TV at 25 f.p.s., shifting all audio frequencies by around 4.2%.
Did anyone notice?

For that much shift, you would definitely notice when there is music and you have "absolute pitch", which is a pretty rare thing. Otherwise, nope.

Speaking of music, even without absolute pitch, if you're playing along with some music on an instrument, then you would definitely get very annoyed by such a large shift. But even a basic crystal oscillator gets you around 100 ppm or so... so, nah. You'd have to be very good to hear a difference with a 100 ppm frequency shift.
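For scale, here is the 100 ppm crystal error next to the 4.2% telecine shift, expressed in cents (my own comparison; 100 cents = 1 semitone, and the usual rule of thumb puts the just-noticeable pitch difference somewhere around 5-10 cents):

```python
import math

def cents(ratio):
    # Pitch interval of a frequency ratio, in cents (100 cents = 1 semitone).
    return 1200 * math.log2(ratio)

print(f"100 ppm crystal error: {cents(1.0001):.2f} cents")  # far below audibility
print(f"25/24 telecine shift:  {cents(25 / 24):.1f} cents")  # most of a semitone
```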

 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 21611
  • Country: us
Re: Frequency standard use in audio equipment
« Reply #15 on: May 12, 2021, 02:43:53 am »
Human hearing is remarkably forgiving.
For years, movie soundtracks recorded at 24 f.p.s. film rates were replayed on TV at 25 f.p.s., shifting all audio frequencies by around 4.2%.
Did anyone notice?

Today a lot of reruns of older TV shows are sped up slightly; they do that in addition to editing out bits and pieces in order to squeeze in more commercials. It's annoying and noticeable if you've seen the show before.
 
The following users thanked this post: SeanB

Offline artag

  • Super Contributor
  • ***
  • Posts: 1070
  • Country: gb
Re: Frequency standard use in audio equipment
« Reply #16 on: May 12, 2021, 02:10:33 pm »
Today's methods of changing speed do it without a pitch shift, so may be less obvious.
 

Offline Haenk

  • Super Contributor
  • ***
  • Posts: 1090
  • Country: de
Re: Frequency standard use in audio equipment
« Reply #17 on: May 12, 2021, 03:16:37 pm »
Jitter of the regular crystal oscillators should be perfectly acceptable. Again, as soon as a person does not accept a double-blind test, they can be dismissed as an audiophool. Let them pay $4K for a clock source and $8K for a mains cable. Suckers need to be milked.

Issues might arise with a separate source and DAC, when the transferred data is mixed together (e.g. via S/PDIF). I've never experienced this myself, but it could happen (according to "others"). I'm not sure if that really leads to serious sound issues.
This should never happen when the data/clock is directly connected over a short distance (like in a CD player).

However, one should still remember the channel-switching method of the very early days, when a single-channel DAC like the 1540 was constantly switched between left and right channels. That's probably where all the badmouthing about awful CD sound quality came from. Because it was an awful method :)

 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8646
  • Country: gb
Re: Frequency standard use in audio equipment
« Reply #18 on: May 12, 2021, 03:45:26 pm »
Jitter of the regular crystal oscillators should be perfectly acceptable.
There are certainly some crystal oscillators whose jitter is sufficient to degrade audio, but it doesn't require anything particularly fancy to be a good clock source.

At 20kHz a 1 bit error in 16 bits is equivalent to about 500ps of jitter, so it takes a pretty nasty crystal oscillator to affect things there. However, at 20kHz a 1 bit error in 20 bits (more relevant to recording than to replay) is equivalent to about 30ps of jitter. A lot of crystal oscillators give that. Maybe you aren't too concerned about the errors (or the noise they manifest as) at such high frequencies, but to be really clean you are looking for jitter levels in the ballpark of what a good crystal oscillator gives.
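Those figures follow from how quickly a full-scale sine can slew through one LSB. A sketch of the arithmetic (this uses one common back-of-the-envelope form, t = 1/(2^(N-1)·pi·f), which reproduces the ~500 ps and ~30 ps figures above; full-scale conventions differ between authors by a factor of about 2):

```python
import math

def jitter_for_one_lsb(bits, f_hz):
    # Jitter that slews a full-scale sine at f_hz by about one LSB,
    # using the rule of thumb t = 1 / (2^(bits-1) * pi * f).
    return 1 / (2 ** (bits - 1) * math.pi * f_hz)

print(f"16 bits @ 20 kHz: {jitter_for_one_lsb(16, 20e3) * 1e12:.0f} ps")
print(f"20 bits @ 20 kHz: {jitter_for_one_lsb(20, 20e3) * 1e12:.0f} ps")
```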
 
The following users thanked this post: WattsThat

Offline PlainName

  • Super Contributor
  • ***
  • Posts: 6844
  • Country: va
Re: Frequency standard use in audio equipment
« Reply #19 on: May 12, 2021, 04:12:12 pm »
Can you measure it with measurement equipment? If not, then you are a full-on audiophool.

You need to know what to measure before you can say that a null reading is meaningful.

Sometimes it's not obvious. The brain can use the minute difference in arrival time of sound at each ear to determine direction. If you just looked at that time interval in isolation you might well consider it pretty much irrelevant compared to the frequency range of the human ear.

And don't forget eyes: movies at 25 fps are fine, but gaming screens talk about 120 Hz. There are frequencies where something can flicker and we can't tell it is flickering, yet if it moves, it's very obvious that it is (roadwork beacons and LED rear lights on older cars are good examples). What would you measure there? In fact, it took a long time before the problem was accepted to be real, so would you even have bothered to think about measuring anything?

We could go on and note that making an object flicker at a specific frequency can change its perceived colour. If someone said film displayed at that frequency was 'more colourful', would you write them off as videophools?

I am not suggesting that any of the stuff laughed at here is pukka in any way, but I am suggesting that a knee-jerk thoughtless dismissal is unwarranted.
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11260
  • Country: us
    • Personal site
Re: Frequency standard use in audio equipment
« Reply #20 on: May 12, 2021, 04:47:42 pm »
For stuff having to do with human perception, a double-blind test is the way to go. But that is usually denied even more vigorously by those types of people. They know they are full of it.
Alex
 
The following users thanked this post: Cubdriver

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8646
  • Country: gb
Re: Frequency standard use in audio equipment
« Reply #21 on: May 12, 2021, 11:07:23 pm »
For stuff having to do with human perception, a double-blind test is the way to go. But that is usually denied even more vigorously by those types of people. They know they are full of it.
Double blind testing is very important. However, it takes a lot of testing to reach a meaningful conclusion. It takes a lot of different material to flush out every audible quirk a system may have. If you've ever worked on lossy compression you will know this VERY well.
 

Offline Dubbie

  • Supporter
  • ****
  • Posts: 1115
  • Country: nz
Re: Frequency standard use in audio equipment
« Reply #22 on: May 12, 2021, 11:38:56 pm »
When I was into audio as a teenager, I ran a blind test on myself and was absolutely shocked at how high the THD could get before I could reliably detect any distortion. After that exercise, I realised that most audio "tweaks" were a waste of time.

With enough clean amp power and decent speakers and room, tweaks further up the chain, such as DACs, were generally a waste of time for me.
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11260
  • Country: us
    • Personal site
Re: Frequency standard use in audio equipment
« Reply #23 on: May 13, 2021, 12:06:58 am »
Double blind testing is very important. However, it takes a lot of testing to reach a meaningful conclusion.
"clearly hear the difference"  should not take that long. If the differences are so minor that it takes forever to detect them reliably, then is the upgrade really worth the money?
Alex
 

Offline Circlotron

  • Super Contributor
  • ***
  • Posts: 3180
  • Country: au
Re: Frequency standard use in audio equipment
« Reply #24 on: May 13, 2021, 12:15:40 am »
One way to avoid all this clock jitter stuff is to only listen to obviously "superior" vinyl...

LOL.
 

Offline JohnnyMalaria

  • Super Contributor
  • ***
  • Posts: 1154
  • Country: us
    • Enlighten Scientific LLC
Re: Frequency standard use in audio equipment
« Reply #25 on: May 13, 2021, 02:33:18 am »
Human hearing is remarkably forgiving.
For years, movie soundtracks recorded at 24 f.p.s. film rates were replayed on TV at 25 f.p.s., shifting all audio frequencies by around 4.2%.
Did anyone notice?

True for PAL in 50Hz countries. For NTSC in 60Hz, 24 fps films are (were?) broadcast with 3/2 pull-down which doesn't require shifting.
 

Offline vk6zgo

  • Super Contributor
  • ***
  • Posts: 7588
  • Country: au
Re: Frequency standard use in audio equipment
« Reply #26 on: May 13, 2021, 02:46:40 am »
Today's methods of changing speed do it without a pitch shift, so may be less obvious.

It wasn't obvious to most people, just as an SSB received signal can be slightly off "zero beat" without sounding strange.

Strangely, some people seem more sensitive to phase errors between stereo channels, than absolute frequency.

I'm not at all sure how that works, but it seems to be those that listen in very quiet surroundings, or with headphones.

The ABC FM transmitter at Mawson, Western Australia, was (I'm not sure if it still is) fed with discrete L&R signals, via a combination of microwave bearers & landlines.
(The community FM transmitters that shared the TVW7 site in my later job used composite stereo on a single Programme line.)

Unbeknownst to us at the site, part of the landline section had a failure of one stereo channel, so the phone techs rerouted that channel only, via another couple of hundred km, to restore the service.

One of our techs, who was very much into Classical music, swore he could hear a phase error between the two channels.

We listened on speakers in the quite noisy Tx control room & "Told him he was dreaming."
He persisted, so "just to shut him up" we checked it.

In their wisdom, the EEs at the ABC & Telecom Australia had instituted a pilot system to operate Programme fail alarms.
This pilot was in the form of a low level 15kHz tone on each channel,  R & L.

The alarm box  at the TX end of the programme lines very kindly supplied an output of these tones, so comparing them with an Oscilloscope was easy.

Of course, he was right! The pilot tones were well out of phase.
The phase error at lower frequencies would obviously be less, but to a "golden eared" person, it was discernible.

On another occasion, at a different place, another audio enthusiast told us that there was intermittent noise on the stereo sound signal from a Commercial TV Tx.

Again, we all listened, and hung a 7L12 spectrum analyser off the aural exciter, but could neither hear it nor see it.
At that juncture I had a couple of days off, & returned to find that the guy standing in for me had been called out when the exciter went crazy, spilling white noise right across the sound channels.

Again, the person who was seriously listening in a quiet environment warned about a problem that was not evident to people in a noisier situation, even though they were actively listening for it to happen.

I think the 7L12 wasn't quite up to the job, although I used it later to find interference on a composite FM stereo signal.




 

Offline vk6zgo

  • Super Contributor
  • ***
  • Posts: 7588
  • Country: au
Re: Frequency standard use in audio equipment
« Reply #27 on: May 13, 2021, 03:12:54 am »
Human hearing is remarkably forgiving.
For years, movie soundtracks recorded at 24 f.p.s. film rates were replayed on TV at 25 f.p.s., shifting all audio frequencies by around 4.2%.
Did anyone notice?

True for PAL in 50Hz countries. For NTSC in 60Hz, 24 fps films are (were?) broadcast with 3/2 pull-down which doesn't require shifting.

I avoided saying "PAL", as 50 Hz B&W systems were around for years prior to PAL, & the same thing was done with SECAM systems.

One thing that happened everywhere, though, was replaying vintage 16 (& two thirds! ;D) fps movies at 24 or 25 fps.
The Kaiser's armies didn't look fittingly ominous in history programmes, when they seemed to be prancing along.
Modern systems have restored the gravitas of that formidable war machine.
 
The following users thanked this post: JohnnyMalaria

Offline Ultrapurple

  • Super Contributor
  • ***
  • Posts: 1027
  • Country: gb
  • Just zis guy, you know?
    • Therm-App Users on Flickr
Re: Frequency standard use in audio equipment
« Reply #28 on: May 13, 2021, 08:25:00 am »
I came to this discussion a bit late, but I think a rather obvious point has been overlooked.

Audiophoolery should be checked with a double deaf test...
Rubber bands bridge the gap between WD40 and duct tape.
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8646
  • Country: gb
Re: Frequency standard use in audio equipment
« Reply #29 on: May 13, 2021, 01:24:19 pm »
Double blind testing is very important. However, it takes a lot of testing to reach a meaningful conclusion.
"clearly hear the difference"  should not take that long. If the differences are so minor that it takes forever to detect them reliably, then is the upgrade really worth the money?
Value for money is a different issue. We frequently accept compromises in performance of all sorts of products because of price. The audiophoolery issue is whether there is a perceivable difference at all.

Frequently, when you find a performance issue in audio equipment, the right recording makes it show up very badly, every time you play that recording. It might be a bass line that hits some horrible resonance, or a transient that provokes some horrible effect. In things like codec research you build a portfolio of recordings which produce horrible noises through existing codecs to see how they perform with the codec you are working on, and some of those issues are far from subtle.
 
The following users thanked this post: SiliconWizard

Offline dmills

  • Super Contributor
  • ***
  • Posts: 2093
  • Country: gb
Re: Frequency standard use in audio equipment
« Reply #30 on: June 02, 2021, 04:03:40 pm »
The best codec snafus happen in video however....

There was one of the early motion-estimating Snell & Wilcox scalers that, during development, was notoriously prone to mistaking people's heads for footballs when the input was a game with a panned shot.

Also, grass that looked very, very strange was a common artifact with compression codecs.

There is some evidence that close-in phase noise on the modulator clocks can be audible. It should not really be a surprise: sampling is basically mixing in the multiplication sense, so noise sidebands should be expected.

Jitter was a problem, and it is rather telling that the early Pro Tools rigs sounded BETTER with an external clock (an internal XO should always beat a VCO), but that was 25 years ago. Today this is something you can fuck up, but you would have to work to get it really wrong.
 
The following users thanked this post: SeanB

Online bdunham7

  • Super Contributor
  • ***
  • Posts: 7856
  • Country: us
Re: Frequency standard use in audio equipment
« Reply #31 on: June 02, 2021, 04:52:31 pm »
I'm asking because I'm genuinely curious....

I have been seeing reference oscillators of all kinds, including Rb, OCXO, and GPSDO, marketed for audio use.  Some reports say they can "clearly hear the difference."  I'm thinking that for syncing digital recording, etc., a regular crystal or maybe a TCXO is sufficient, and that the human ear can't possibly distinguish the difference in phase and frequency stability.

Is this yet another case of snake oil?  Some of these go for US$4k or so.

Just saw this topic, so here's my $.02

Jitter in the digital sample clock usually wouldn't show up as the 'wow and flutter' of the magnetic tape days (although you could perhaps simulate it), so you typically wouldn't hear a pitch or phase difference.  Sample clock jitter simply translates to an amplitude error: in the file if the jitter occurs during recording, in the audio output if it is in the DAC clock.  Uncorrelated (random) jitter shows up as noise and is less perceptible.  Correlated (periodic) jitter shows up differently and depends on the audio signal, because that is what it is modulating.

There have been many arguments on this topic. The audiophool salespeople will claim that picoseconds of jitter are audible, unless they are selling you products with 5 orders of magnitude more than that, in which case they claim it either isn't an issue or that it is the 'right kind' of jitter.  I remember looking at this closely once and concluding that 5 ns or less of uncorrelated jitter just couldn't be audible, even theoretically, and people who do experiments generally find much higher thresholds.  Correlated jitter would be harder to make a definitive statement about, but I suspect that this is rarely an issue outside of devices that are broken or badly designed.  Just keep in mind that having a nice clock doesn't necessarily eliminate jitter; it can be reintroduced by power supply noise, interference or other things that may vary the trigger threshold of the clock signal somewhere in the chain.  This is one reason why 'master clocks' and synchronous data in an audio reproduction system are sort of idiotic--you want asynchronous data transfer to a buffer and then one clock right at the DAC.
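The textbook limit for uncorrelated jitter backs this up: the best-case SNR of a full-scale sine at frequency f sampled with RMS jitter sigma is -20*log10(2*pi*f*sigma). A quick check (my numbers, not from the post above):

```python
import math

def jitter_snr_db(f_hz, sigma_s):
    # Best-case SNR of a full-scale sine at f_hz, sampled with
    # sigma_s seconds of RMS uncorrelated clock jitter.
    return -20 * math.log10(2 * math.pi * f_hz * sigma_s)

print(f"5 ns   @ 20 kHz: {jitter_snr_db(20e3, 5e-9):.0f} dB")
print(f"500 ps @ 20 kHz: {jitter_snr_db(20e3, 500e-12):.0f} dB")
```

Even 5 ns of random jitter leaves roughly 64 dB of SNR at 20 kHz, the worst-case frequency, which is consistent with the audibility thresholds mentioned above.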

http://archimago.blogspot.com/2018/08/demo-musings-lets-listen-to-some-jitter.html

Dither, which can be equated to uncorrelated jitter, can be introduced to mask distortions caused by truncation (reducing bit depth) and decimation, but can also be used to lower the apparent noise floor by pushing it up the spectrum.  So on a typical CD, the dither is not uniform, but is shaped so that most of its energy is in the upper octaves where it is less audible.

This stuff has all been well known to competent audio engineers for many years now, but not all products take advantage of that knowledge, especially those at the very low end and, ironically, at the very high end.  Of course, on the high end, all of their silly design mistakes can be compensated for with expensive exotic wood component stands and speaker cables starting at $10,000.
« Last Edit: June 02, 2021, 04:54:28 pm by bdunham7 »
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 

Offline jonpaul

  • Super Contributor
  • ***
  • Posts: 3366
  • Country: fr
Re: Frequency standard use in audio equipment
« Reply #32 on: June 03, 2021, 06:55:49 am »
Hello all, and bonjour to SiliconWizard...

Been reading the thread with great interest.

We have been working on digital audio since the 1970s (Dolby, THX, Eventide).

I was also on the AES standards committees for digital audio transmission and digital microphones from the 1980s to 2017 (AES-42, AES-13, I think...). I wrote many AES papers, and SMPTE and NAB ones as well.

A few notes:

Master clocks usually live at the studios that create the media. Nowadays a GPS-disciplined 10 MHz reference is very accurate, low jitter and low cost, e.g. the Leo Bodnar.

Most consumer devices do playback, not media creation, so clock and data jitter is much more important than absolute clock frequency.

Digital audio transmission (AES/EBU, S/PDIF, AES3id) is a single serial stream, biphase-mark (Manchester-type) encoded with an embedded clock. A PLL is used for clock recovery.

Thus the DAC used for reconstruction depends on a clock recovered by the PLL.

The noise-to-jitter issue was well researched back in the 1980s; see especially our old friend Dr Steve Harris's (Crystal/Cirrus) classic papers (1987?) as well as the fine work of Dr Julian Dunn (RIP) (Cambridge, AP).

Most of the jitter is due to common-mode HF asynchronous noise contaminating the clock and PLL.

We designed special TX and RX reference circuits to reduce the jitter issues, by greatly increasing CMRR and reducing primary-to-secondary capacitance in the required transformers.

The fine results were confirmed by both lab and listening tests.

Beware that any fair listening test needs a double-blind setup, like the 1970s-1980s ABX relays with random hidden selection.

The audiophools will object, but the ABX was a wonderful Occam's razor.

The ABX boxes are long gone, but one is easy to build with a flip-flop, an oscillator and a gold-contact relay.


Hope this sparks some interesting response !


Jon





« Last Edit: June 03, 2021, 07:01:23 am by jonpaul »
Jean-Paul  the Internet Dinosaur
 

Online bdunham7

  • Super Contributor
  • ***
  • Posts: 7856
  • Country: us
Re: Frequency standard use in audio equipment
« Reply #33 on: June 03, 2021, 01:14:55 pm »
Most consumer devices do playback, not media creation, so clock and data jitter is much more important than absolute clock frequency.

You'd be surprised how many consumer devices have ADC inputs.  All of the signal processing, including tone and volume, is often done digitally in a 32-bit processing environment, so any analog line-in, microphone or phono inputs may be converted to PCM.

Quote
Digital audio transmission (AES/EBU, S/PDIF, AES3id) is a single serial stream, biphase-mark (Manchester-type) encoded with an embedded clock. A PLL is used for clock recovery.

Thus the DAC used for reconstruction depends on a clock recovered by the PLL.

The noise-to-jitter issue was well researched back in the 1980s; see especially our old friend Dr Steve Harris's (Crystal/Cirrus) classic papers (1987?) as well as the fine work of Dr Julian Dunn (RIP) (Cambridge, AP).

Most of the jitter is due to common-mode HF asynchronous noise contaminating the clock and PLL.

Yes, those synchronous formats use the embedded clock, and top-quality systems nowadays use a dual PLL to recover it--and that is certainly more than good enough provided the signal is good.  Synchronous transfer is needed for video, but playback from music servers nowadays--both local and remote--often uses asynchronous transfer to a buffer, and the DAC uses a local clock.  It either works or it doesn't--there's no possibility of degrading the clock with this setup.  On my system, when I stream a radio station, play a FLAC or MP3 file, or even play a CD (depending on how I do it), the streams are all buffered and clocked locally--no embedded clock and no PLL.
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 

Offline MazeFrame

  • Contributor
  • Posts: 34
  • Country: de
  • = != ==
Re: Frequncy standard use in audio equipment
« Reply #34 on: July 15, 2021, 08:06:25 pm »
I'm asking because I'm genuinely curious....

I have been seeing reference oscillators of all kinds, including Rb, OCXO, and GPSDO for audio use.
Word clocks (as used in studios with lots of conversions and digital-to-digital interconnects) are distributed over 75 Ohm BNC cables, either at the sample rate or as a 10 MHz reference.

There are also "re-clockers" that help the less timing-stable interfaces (e.g. TOSLINK) keep pace, mostly because some manufacturers cheap out on the inputs.

This may make sense for PA installations at concerts or bigger recording-studio setups (= your job depends on the result sounding good). At home, there's no point to it.
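To see why a shared clock matters when gear is interconnected digitally: two free-running converters at the same nominal sample rate still drift apart at a rate set by their crystal tolerance. A rough Python sketch (the 50 ppm figure is just a typical XO tolerance, not a spec from any particular product):

```python
def seconds_per_sample_slip(fs_hz: float, ppm_offset: float) -> float:
    """Time until two free-running converters, nominally at fs_hz but
    differing by ppm_offset parts per million, drift one full sample apart."""
    delta_hz = fs_hz * ppm_offset * 1e-6   # frequency difference between them
    return 1.0 / delta_hz                  # seconds per one-sample slip

# Two 48 kHz devices with crystals 50 ppm apart:
t = seconds_per_sample_slip(48_000, 50)
print(f"One sample slip every {t:.2f} s")   # -> every ~0.42 s
```

At roughly one dropped or repeated sample every half second, unsynchronized digital interconnects click audibly, which is exactly what house sync prevents.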
Never Forgive, Always Forget.
Perpetually Angry and Confused!
 

Online EPAIII

  • Super Contributor
  • ***
  • Posts: 1062
  • Country: us
Re: Frequncy standard use in audio equipment
« Reply #35 on: July 16, 2021, 06:33:51 pm »
TV movies played at 25 fps? Really?

I spent 45+ years in TV engineering. I designed and built TV stations. I installed the film projectors on what is called a "film island" which usually consisted of two 16mm/35mm film projectors, a 35mm slide projector, a TV camera, and the necessary optics and electronics to select one of the projectors to be seen by the camera.

The film projectors used were always run at the correct film fps rate. ALWAYS. Some of these film projectors relied on the AC power for their speed while others used the TV station's synchronizing generator. But in all cases the film was run at the same fps speed that it was recorded with. And that means that the sound track was also being run at that speed, usually 24 fps, not 25 fps. If they were run at 25 fps, the film program or commercial would have ended before its stated run time and that would have caused problems with air time that would need to be filled. But a 50 minute film did run for a full 50 minutes, not the 48 minutes which a 25 fps speed would have produced. Likewise for other run lengths.

There is a fundamental mismatch between the 24 fps speed of film footage and the 50 or 60 Hz field rates of the various analog TV standards used around the world. This was solved by mechanical shutters that chopped the light path at a rate several times the film frame rate. The actual pull-down from one film frame to the next occurred in the dark interval between two of these periods of light passage. A small time lag in the camera's pick-up tubes smoothed the transitions, and each TV frame received the same amount of chopped light from the projector. In NTSC video this process is called a "3:2" pull-down. In countries that used PAL or other TV standards with a 50-field rate, the pull-down was "2:2" AND the film being shot for TV was shot at 25 fps, not 24, so again the speeds matched. This is how the incompatible TV vs. film frame rates were handled: NOT by running the film projectors at a different frame rate.

I am sure that some 24 fps films were shown in those 50 fps countries but that was not the standard. For the most part, no one noticed a shift in the audio frequencies because there was NONE. And when there was a frame rate difference, I am sure that some, probably most people did notice. 4% is not a trivial difference when sound is involved. It probably drove the music lovers nuts.

More on this:
https://en.wikipedia.org/wiki/Telecine#Frame_rate_differences



Human hearing is remarkably forgiving.
For years, movie soundtracks recorded at 24 f.p.s. film rates were replayed on TV at 25 f.p.s., shifting all audio frequencies by around 4.2%.
Did anyone notice?
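The size of that 24-to-25 fps speed-up is easy to check with some quick Python arithmetic:

```python
import math

speedup = 25 / 24                      # PAL speed-up of 24 fps film
pct = (speedup - 1) * 100              # percent frequency shift
semitones = 12 * math.log2(speedup)    # shift in musical semitones

print(f"{pct:.2f}% faster, {semitones:.2f} semitones sharp")
# -> 4.17% faster, 0.71 semitones sharp
print(f"A440 -> {440 * speedup:.1f} Hz")
# -> A440 -> 458.3 Hz
```

A shift of about 0.7 semitone is well inside the range a musically trained listener can detect on familiar material, though casual viewers rarely noticed.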
Paul A.  -   SE Texas
And if you look REAL close at an analog signal,
You will find that it has discrete steps.
 

Online bdunham7

  • Super Contributor
  • ***
  • Posts: 7856
  • Country: us
Re: Frequncy standard use in audio equipment
« Reply #36 on: July 16, 2021, 06:58:07 pm »
TV movies played at 25 fps? Really?

Yes, in markets where the screen vertical scan rate was 50Hz.....

b/t/w I've actually seen video that has been processed from 24FPS to 25FPS by simply repeating every 24th frame, a 24:25 pull-up if you will.  It looks hilarious.  But the soundtrack was OK.

"Film shot for TV"--yeah, OK.  But what if it wasn't? 
« Last Edit: July 16, 2021, 06:59:42 pm by bdunham7 »
A 3.5 digit 4.5 digit 5 digit 5.5 digit 6.5 digit 7.5 digit DMM is good enough for most people.
 

Offline vk6zgo

  • Super Contributor
  • ***
  • Posts: 7588
  • Country: au
Re: Frequncy standard use in audio equipment
« Reply #37 on: July 17, 2021, 03:18:45 am »
TV movies played at 25 fps? Really?

I spent 45+ years in TV engineering. I designed and built TV stations. [...]

The film projectors used were always run at the correct film fps rate. ALWAYS. [...] But a 50 minute film did run for a full 50 minutes, not 48 minutes which a 25 fps speed would have produced. [...]

For the most part, no one noticed a shift in the audio frequencies because there was NONE. And when there was a frame rate difference, I am sure that some, probably most people did notice. 4% is not a trivial difference when sound is involved.

Human hearing is remarkably forgiving.
For years, movie soundtracks recorded at 24 f.p.s. film rates were replayed on TV at 25 f.p.s., shifting all audio frequencies by around 4.2%.
Did anyone notice?

That may have been your experience, but it was not mine, having spent a similar time working in the TV industry.

In 50Hz countries, it was common to run ordinary film at 25fps rate.
When we were taught about telecine equipment back in the 1960s, a big thing was made about how North America, with their 30fps system, had to use complicated "pull down" systems to present film run at 24fps.

It was pointed out how fortunate 50 Hz countries were in that regard, due to the quite small difference between the standard film rate & the fps in 50 Hz countries.

Certainly in Oz, most of the telecine chains in use from the 1960s to the early 1980s were simple "projector looking at a vidicon/plumbicon camera" assemblies.
There were a few legacy "flying spot scanners", but they were relatively rare.

During the early to mid '80s, "line stores" came into use in telecines.
This allowed line readouts at a different rate to that which the projector was operating.
These were in time, supplanted by frame stores, allowing the film to be played at any rate, & seamlessly converted.

Your comment about "run times" is a bit of a "red herring".

Films shown on Oz Commercial TV back in the pre Videotape days were pretty savagely "hacked about" to fit the number of ads required, so any run time difference could have easily been absorbed.
Each film reel was, after the insertion of ads, marked with its run time, so the operators knew exactly how long each ran.

Despite that, films did run short or long----- they still do, even with all the modern stuff, as do theatre movie presentations from time to time.


The ABC (non- commercial) filled spaces between programmes with short features or musical interludes, which could be, & were cut short as required.

P.S. this is a quote from your linked site:-

"However, during the analogue broadcasting period, the 24 frame per second film was shown at a slightly faster 25 frames per second rate, to match the PAL video signal. This resulted in a fractionally higher audio soundtrack, and resulted in feature films having a slightly shorter duration, by being shown 1 frame per second faster."





« Last Edit: July 17, 2021, 03:30:04 am by vk6zgo »
 

Offline Ultrapurple

  • Super Contributor
  • ***
  • Posts: 1027
  • Country: gb
  • Just zis guy, you know?
    • Therm-App Users on Flickr
Re: Frequncy standard use in audio equipment
« Reply #38 on: July 21, 2021, 09:07:17 am »
"I'll Be Back.
(A little sooner in 50Hz countries.)"
Rubber bands bridge the gap between WD40 and duct tape.
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 8646
  • Country: gb
Re: Frequncy standard use in audio equipment
« Reply #39 on: August 02, 2021, 06:19:34 pm »
I recently got into reading up on and playing with some "Chi-Fi" stuff (for free).

One of the manufacturers from China is a firm called "SINGXER"; the equipment is the SU-2 and SU-6. Part of the marketing tagline is "femtosecond clock" (mmmm?).

The actual clocks in use inside have jitter under 1 picosecond. Already it is creating a circle of hype where people claim to hear enhancements in the sound.

However, the SU-2 claims it is able to accept an external 10 MHz GPSDO clock to "do something"... I don't quite understand it. There is very sparse information about what improvement it can bring, or by what means it stabilizes the internal crystals for 44.1/48 kHz.

The two crystals in use are not OCXOs, but many think they are, misled by discussion on some other forum: the CCHD-957 for the SU-6 and the AS381B for the SU-2. I browsed many similar datasheets, but they are TCXOs. Did I read wrong?

I think this is a case of a "USB memory stick with no chips inside" for SINGXER.
The use of GPS can be practical, or it can be BS. The legitimate goal is for mixing from multiple sources. If the sources are all locked to a super accurate clock they can be mixed freely without sample slips. The BS use is to claim a super accurate sample rate matters for listening purposes.
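The scale of the mixing problem is easy to estimate as accumulated drift, in samples, between an unlocked source and the house reference. A Python sketch (the ppm figures are illustrative: ~50 ppm for an ordinary crystal, effectively zero for a GPS-disciplined reference):

```python
def drift_samples(fs_hz: float, ppm: float, duration_s: float) -> float:
    """Samples of accumulated drift, over duration_s seconds, between a
    source whose clock is off by `ppm` and a perfect reference."""
    return fs_hz * ppm * 1e-6 * duration_s

hour = 3600
print(f"50 ppm XO over 1 h:      {drift_samples(44_100, 50, hour):9.0f} samples")
print(f"GPSDO (~1e-5 ppm), 1 h:  {drift_samples(44_100, 1e-5, hour):9.3f} samples")
```

A free-running crystal accumulates thousands of samples of drift per hour, forcing sample slips or rate conversion when mixing; sources locked to a common GPSDO stay sample-aligned indefinitely. For playback alone, none of this matters.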
 

