Author Topic: Measuring audio and video latency of a monitor given an HDMI signal  (Read 762 times)


Offline allenwpTopic starter

  • Contributor
  • Posts: 19
  • Country: ca
Hi there,

I am primarily a software developer, but I have a particular interest in programming hardware to do things it wasn't quite designed for. Recently I made a 3D video game engine that renders to vector displays like CRT oscilloscopes via an audio stream from a DC-coupled audio interface. ( youtube.com/watch?v=t2i9rZ1DvGo ) I feel like I understand some of the high-level concepts, but I know almost nothing about the tools of the trade, so I'm coming here to look for advice on tools (hardware and software) and equipment. Although I've used an oscilloscope as a vector display in XY mode, I've never used one for any traditional purpose, and I really don't know what oscilloscopes are capable of or best suited for. The same could be said for any other type of test equipment.

I have a budget of about $1500 CAD, but don't want to throw my money away unnecessarily.

My project is to develop a method of testing the audio latency of a device like a TV, soundbar, or receiver when given an HDMI signal. I believe I should be able to develop a method that could work with an average Windows 10 computer that has an HDMI output and a typical sound card. But I would like to verify the precision of my test method using proper equipment. While I'm at it, I would also like to measure a "source of truth" video latency to verify the precision of the Leo Bodnar Lag Testers.

This is where my question comes in: What type of equipment should I use to measure a "source of truth audio and video latency"?

I expect my process for measuring audio and video latency to be similar to the method used in this article (probably faster to Google Translate the original German article), where the author did something very similar to what I'm doing: he was developing a new test method for measuring video latency and wanted to verify his test method's precision using industry-grade tools and equipment.

I expect that it will be cost prohibitive to perform this analysis on an HDMI 2.1 packetized signal (4K120Hz or equivalent bandwidth/frequency). This means I will be looking to analyze only a TMDS signal. I expect I only need to look at one of the TMDS channels to get an idea of when pixel or audio packet information has changed from previous frames. For a 1080p signal, the pixel clock is apparently 148.5 MHz; for an HDMI 2.0 signal, it's apparently 594 MHz. Because TMDS is DC balanced, I expect I need fairly high-resolution sampled data to tell when frame data has changed, but I don't need to see clearly what the actual bits are, since I'm not debugging the HDMI signal; I'm just looking for "when did this sound packet or pixel get sent" by comparing to previous blank/black frames. I should note that the frequencies I mentioned are the pixel clocks, not the bit clocks. I understand that there are 10 bits sent per tick of the pixel clock, so this is more like a 1.5 GHz signal for 1080p.
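To sanity-check those numbers, here's a back-of-envelope sketch in Python (assuming the usual 10 TMDS bits per pixel clock tick; the function name is just mine, for illustration):

```python
# TMDS sends 10 bits on each data channel per tick of the pixel clock,
# so the serial bit rate per channel is simply 10x the pixel clock.
TMDS_BITS_PER_PIXEL_CLOCK = 10

def tmds_bit_rate_hz(pixel_clock_hz):
    """Serial bit rate on each of the three TMDS data channels, in Hz."""
    return pixel_clock_hz * TMDS_BITS_PER_PIXEL_CLOCK

# 1080p60 (148.5 MHz pixel clock) -> roughly 1.485 Gbit/s per channel
rate_1080p = tmds_bit_rate_hz(148.5e6)
# HDMI 2.0 / 4K60 (594 MHz pixel clock) -> roughly 5.94 Gbit/s per channel
rate_4k60 = tmds_bit_rate_hz(594e6)
```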

Here are some thoughts I have on how I might do this:
  • Turn the audio or video output from the monitor I am testing into a simple voltage. I could use a photodiode for video, and a microphone or the monitor's line out for audio. This part is easy.
  • Analyze one of the HDMI's TMDS channels alongside the voltage from the previous point.
  • The moment the voltage from the first step goes above a certain value, indicating that a pixel has lit up or sound is being produced, save a snapshot of the last few hundred thousand samples or so. Maybe I could do this using an oscilloscope or some sort of data logger? If I could get a 1.5 GHz DC-coupled audio interface, I'd just use that to record to my computer, but those are usually limited to 192 kHz.
  • Review the waveforms of the two channels I sampled to determine the time difference between when the pixel was sent over HDMI and when the voltage of the photodiode/mic changed. Again, this is trivial at 192 kHz with a DC-coupled audio interface, because analyzing a waveform in something like Audacity is easy. But I need to be operating at much higher frequencies, of course.
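Once both channels are captured, the last step reduces to comparing the first threshold crossings. A minimal Python sketch of what I mean (function names, thresholds, and the toy data are made up purely for illustration):

```python
def first_crossing(samples, threshold):
    """Index of the first sample whose magnitude exceeds the threshold, else None."""
    for i, value in enumerate(samples):
        if abs(value) > threshold:
            return i
    return None

def latency_seconds(hdmi_channel, sensor_channel, sample_rate_hz,
                    hdmi_threshold, sensor_threshold):
    """Seconds between HDMI activity and the photodiode/mic responding.

    Assumes both channels were captured simultaneously at sample_rate_hz.
    """
    i_hdmi = first_crossing(hdmi_channel, hdmi_threshold)
    i_sensor = first_crossing(sensor_channel, sensor_threshold)
    if i_hdmi is None or i_sensor is None:
        raise ValueError("no threshold crossing found on one of the channels")
    return (i_sensor - i_hdmi) / sample_rate_hz

# Toy data: HDMI activity starts at sample 2, the sensor responds at
# sample 4, sampled at 1 kHz -> 2 ms of latency.
example = latency_seconds([0, 0, 1, 1, 1, 1], [0, 0, 0, 0, 1, 1],
                          sample_rate_hz=1000,
                          hdmi_threshold=0.5, sensor_threshold=0.5)
```

The same logic applies at scope sample rates; only the array sizes and the sample rate change.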

Thanks for taking the time to read through this. I really appreciate you taking the time to help out people like myself who have no formal education on any of these sorts of things :)

Allen

(Edit: I understand that this might simply not be possible on a $1500 CAD budget. It seems the article I mentioned, where a similar test was done, used a 12.5 GHz Tek DSA71254 to analyze a 1080p TMDS signal. Their research also suggests that TMDS encoding is such that a new frame identical to the last may have different bits, so vsync patterns needed to be used... This might mean that audio packet analysis is impossible without a TMDS decoder...)
« Last Edit: May 31, 2021, 08:38:52 pm by allenwp »
 

Offline allenwpTopic starter

  • Contributor
  • Posts: 19
  • Country: ca
Re: Measuring audio and video latency of a monitor given an HDMI signal
« Reply #1 on: June 01, 2021, 08:03:53 pm »
I'm going to take a stab at answering my own question here:

This could be done with one of these tools:

  • Digital Storage Oscilloscope
  • Mixed Signal Oscilloscope
  • Logic Analyzer that also has analog inputs (Is this just a Mixed Signal Oscilloscope?)
  • A PC/USB version of one of the above three tools
  • Build my own PC based logger with an ADC board

Is there another type of test device that I'm overlooking?

If I were to use a logic analyzer, I would need to make sure it had an analog input to measure the monitor output (light, sound, or audio output voltage) alongside the HDMI digital signal. Maybe a logic analyzer with analog inputs isn't always the same as a "mixed signal oscilloscope", because it doesn't have other oscilloscope functions? An example of the type of logic analyzer I'm talking about is the Saleae Logic Pro, which seems to be advertised as a logic analyzer even though it also has 10/12-bit analog inputs.

It seems like measuring a digital signal requires a bandwidth of about 5 times the signal frequency, which would mean I'd need something like a 7.5 GHz device for a 1.5 GHz HDMI signal, which I couldn't get on my budget. Because of this, I'm considering doing the test with a 640x480 HDMI signal, which might be within a bandwidth range I can afford. I am also considering renting time at a university lab or something, if that's a thing I can do in my area once the pandemic is over; this would allow me to use a high-frequency device for a one-off test.
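Running the same arithmetic for 640x480 (the 25.175 MHz pixel clock is the standard timing for that mode; applying the 5x rule of thumb to the serial bit rate is my own assumption about how to apply it):

```python
# Back-of-envelope scope bandwidth estimate for a 640x480@60 TMDS signal.
pixel_clock_hz = 25.175e6           # standard VGA/DMT 640x480@60 pixel clock
bit_rate_hz = pixel_clock_hz * 10   # 10 TMDS bits per pixel clock
needed_bw_hz = bit_rate_hz * 5      # rough "5x the signal" rule of thumb
# needed_bw_hz works out to roughly 1.26 GHz -- still a lot, but far
# more attainable than the ~7.5 GHz a 1080p signal would call for.
```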

Let me know if you have any thoughts :)

 

Offline Manul

  • Super Contributor
  • ***
  • Posts: 1158
  • Country: lt
Re: Measuring audio and video latency of a monitor given an HDMI signal
« Reply #2 on: June 01, 2021, 11:44:49 pm »
I have zero knowledge about HDMI, but my guess is that it can be done without a 10 GHz scope. I think you should be able to detect some kind of signal envelope at much lower frequencies corresponding to specific picture patterns. I don't think you need to go bit-level scoping for this task. But the problem is that a typical scope probe will put too much load on the signal lines because of high capacitance, so probing might need improvising. You might be able to detect something with an EMC probe too. There should be something observable when the signal changes from black to white, or maybe with various sizes of checkerboard pattern grids.
 

Online Berni

  • Super Contributor
  • ***
  • Posts: 5024
  • Country: si
Re: Measuring audio and video latency of a monitor given an HDMI signal
« Reply #3 on: June 02, 2021, 05:40:31 am »
You can get HDMI receiver chips that take HDMI in on one side and output some other format on the other side. If you take one that has a DPI RGB output, you get nice separate hsync/vsync lines and the bits of each pixel on a parallel bus. So if you just count the appropriate number of clock cycles from the vsync pulse, you can easily check whether a certain pixel is white or black. Some might even extract the audio data onto a separate serial bus. It might be best to just buy an eval board for one of these chips.
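As a rough sketch of the counting, using the standard 640x480@60 (VESA DMT) timing numbers purely as an example (the exact vsync/hsync alignment depends on the mode and the receiver chip, and real hardware would do this in an FPGA or MCU, not Python):

```python
# Count pixel clocks from the vsync pulse to reach a target pixel (x, y).
# Timing constants are the standard 640x480@60 values; this assumes vsync
# asserts coincident with the start of a line.
H_TOTAL = 800           # pixel clocks per line, including blanking
V_SYNC_TO_ACTIVE = 35   # lines from vsync start to first active line (2 sync + 33 back porch)
H_SYNC_TO_ACTIVE = 144  # clocks from line start to first active pixel (96 sync + 48 back porch)

def clocks_until_pixel(x, y):
    """Pixel clocks to count after vsync before sampling pixel (x, y)."""
    return (V_SYNC_TO_ACTIVE + y) * H_TOTAL + H_SYNC_TO_ACTIVE + x
```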

The encoding on HDMI makes it look like a mess on the scope, and if HDCP is enabled the data is even encrypted. Older video formats such as VGA or old-school analog video (composite, component, S-Video) are much easier to see on a scope.

 

Offline allenwpTopic starter

  • Contributor
  • Posts: 19
  • Country: ca
Re: Measuring audio and video latency of a monitor given an HDMI signal
« Reply #4 on: June 04, 2021, 07:49:30 pm »
Thanks for the thoughts, that's very helpful! I actually have a 150MHz analog Tektronix 2445 bench scope already. I'm thinking a good start would be to try and get a feel for what the signal looks like using this. This should help me make a more informed decision on the next steps, along with the points and recommendations you've made :)
 

