Author Topic: Glitch capture  (Read 4824 times)


Offline desowin (Topic starter)

  • Contributor
  • Posts: 21
Glitch capture
« on: December 22, 2011, 12:54:26 pm »
Assume there is software running on a microcontroller which outputs a logic clock signal with nominally X µs between rising/falling and falling/rising edges. In rare moments the delay between transitions can be longer - that's perfectly fine. Define the problem as the time between rising/falling (or falling/rising) edges being lower than X µs. If the problem doesn't appear in a trillion cycles, it's assumed to be good.

What is the best-suited tool to empirically check whether such a problem exists? Can a digital oscilloscope be used for that (and if so, what does it need besides appropriate bandwidth, and how should the trigger be set)?
 

Offline Psi

  • Super Contributor
  • ***
  • Posts: 10416
  • Country: nz
Re: Glitch capture
« Reply #1 on: December 22, 2011, 01:14:54 pm »
You can set the scope to trigger on pulses shorter than X µs, then check back every so often to see how long it takes before it captures something.
But you can run into issues with some scopes not having the right trigger options.
I think the Rigol only has "pulse width greater than" and not "less than", but I could be wrong.

There is probably some piece of test gear better suited to this, though - something that automates the process.
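If the scope can't trigger on "pulse width less than", the same check can be done offline in software on exported edge timestamps. A minimal sketch in Python - the threshold and the sample timestamps are made-up values for illustration:

```python
# Offline version of a "pulse width less than X" trigger: scan a list of
# edge timestamps (in microseconds) for gaps shorter than the threshold.
# X_US and the sample data below are made-up values for illustration.

X_US = 10.0  # nominal minimum time between transitions, in µs

def find_glitches(edge_times_us, x_us=X_US):
    """Return (index, width) pairs where consecutive edges are closer than x_us."""
    glitches = []
    for i in range(1, len(edge_times_us)):
        width = edge_times_us[i] - edge_times_us[i - 1]
        if width < x_us:
            glitches.append((i, width))
    return glitches

# The 20.2 -> 27.9 gap is only 7.7 µs, so it should be flagged.
edges = [0.0, 10.1, 20.2, 27.9, 38.0]
print(find_glitches(edges))
```

Whether this is practical depends on the scope being able to export enough edge timestamps; the hardware trigger approach avoids that limitation.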




« Last Edit: December 22, 2011, 01:20:43 pm by Psi »
Greek letter 'Psi' (not Pounds per Square Inch)
 

Offline Balaur

  • Supporter
  • ****
  • Posts: 525
  • Country: fr
Re: Glitch capture
« Reply #2 on: December 22, 2011, 01:41:22 pm »
I would guess that something like a universal counter or a data-acquisition system could fulfill your task with some programming.

However, the approach I would use in this case is to design a quick test rig with the following algorithm:

I. Use a fast & stable clock (say 100 MHz, with <1 ps jitter and a few ppm stability) as a timebase
II. Record how many cycles of this clock you count between each pair of consecutive edges of your signal
III. Put that data into a spreadsheet, analyze the results, plot a histogram, etc.

My preference would be to implement the counting/recording in an FPGA connected to a PC for data logging and analysis.

Cheers,
Dan
 

Offline baljemmett

  • Supporter
  • ****
  • Posts: 665
  • Country: gb
Re: Glitch capture
« Reply #3 on: December 22, 2011, 02:34:06 pm »
Quote from: desowin on December 22, 2011, 12:54:26 pm
Assume there is software running on a microcontroller which outputs a logic clock signal with nominally X µs between rising/falling and falling/rising edges. In rare moments the delay between transitions can be longer - that's perfectly fine. Define the problem as the time between rising/falling (or falling/rising) edges being lower than X µs.

I'm pretty sure my 80s-era HP logic analyser can do this sort of thing -- but it wouldn't be ideal for this situation because I think the time it takes to rearm after a trigger will leave a blind spot and slow proceedings right down.  Although thinking about it, by defining the trigger as 'steady for less than X µs before edge' it should capture at full speed and only trigger/re-arm on the failing events, which might do the job.  Then just leave it in continuous acquisition and see how many triggers it counts...

However, regardless of whether it can do this precise job, knowing that it can do something similar suggests that maybe more modern LAs will have similar features that could be useful?
 

Offline bfritz

  • Regular Contributor
  • *
  • Posts: 134
  • Country: us
Re: Glitch capture
« Reply #4 on: December 22, 2011, 05:28:24 pm »
Quote from: baljemmett on December 22, 2011, 02:34:06 pm
Quote from: desowin on December 22, 2011, 12:54:26 pm
Assume there is software running on a microcontroller which outputs a logic clock signal with nominally X µs between rising/falling and falling/rising edges. In rare moments the delay between transitions can be longer - that's perfectly fine. Define the problem as the time between rising/falling (or falling/rising) edges being lower than X µs.

I'm pretty sure my 80s-era HP logic analyser can do this sort of thing -- but it wouldn't be ideal for this situation because I think the time it takes to rearm after a trigger will leave a blind spot and slow proceedings right down.  Although thinking about it, by defining the trigger as 'steady for less than X µs before edge' it should capture at full speed and only trigger/re-arm on the failing events, which might do the job.  Then just leave it in continuous acquisition and see how many triggers it counts...

However, regardless of whether it can do this precise job, knowing that it can do something similar suggests that maybe more modern LAs will have similar features that could be useful?

The above is absolutely correct.  The '80s-era HP logic analyser can do this with no problem.  It will capture at the full digitizing rate without having to re-arm and re-trigger.  Some of the better Tek, Agilent, and LeCroy scopes have this option too, but I'm not sure exactly which models.  You'd just have to wade through the manuals.

I'd probably look for something like "Conditional Triggering" in the scope manual.
 

Offline Rufus

  • Super Contributor
  • ***
  • Posts: 2095
Re: Glitch capture
« Reply #5 on: December 22, 2011, 09:48:17 pm »
Quote from: desowin on December 22, 2011, 12:54:26 pm
Assume there is software running on a microcontroller which outputs a logic clock signal with nominally X µs between rising/falling and falling/rising edges. In rare moments the delay between transitions can be longer - that's perfectly fine. Define the problem as the time between rising/falling (or falling/rising) edges being lower than X µs. If the problem doesn't appear in a trillion cycles, it's assumed to be good.

A digital scope with infinite persistence (a very common feature) can be set to trigger on one edge and collect and display the positions of multiple instances of the following edge.  The attached trace shows the spread in delay between the edges on two signals for example.

You didn't mention any clock frequency. Observing a trillion cycles of a 100 kHz clock will take nearly four months, so you might want to reconsider your assumptions.
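Working the arithmetic, with 100 kHz taken purely as the example frequency:

```python
# Back-of-the-envelope observation time for a trillion cycles at 100 kHz.
cycles = 1_000_000_000_000   # one trillion
f_hz = 100_000               # 100 kHz example clock
seconds = cycles / f_hz      # 1e7 seconds
days = seconds / 86_400
print(f"{seconds:.0f} s is about {days:.0f} days")
```

A slower clock stretches this proportionally - at 10 kHz the same trillion cycles already takes over three years.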
 

Offline desowin (Topic starter)

  • Contributor
  • Posts: 21
Re: Glitch capture
« Reply #6 on: December 23, 2011, 07:01:51 pm »
I will check for these features on the lab equipment I can get access to - thanks, guys.

Quote from: Rufus on December 22, 2011, 09:48:17 pm
You didn't mention any clock frequency. Observing a trillion cycles of a 100 kHz clock will take nearly four months, so you might want to reconsider your assumptions.

The frequency basically depends on X, hence I didn't provide one.
Speaking of time - well, that's where "empirical" comes in: just collect enough data and hope it'll behave the same later.

This question was more about how to do such a thing and what the right tools are, but I probably should have used "infinite" instead of "trillion" to make it more generic.
 

