
Can't understand an algorithm (Pan-Tompkins QRS detection)


Karel:
Is it me or is there an error in the article?

In the section "Adjusting the Average RR Interval and Rate Limits" it is written that
the RR LOW LIMIT, RR HIGH LIMIT and RR MISSED LIMIT are derived from RR AVERAGE2.

But RR AVERAGE2 is updated only with RR intervals that are within the RR LOW LIMIT and RR HIGH LIMIT.
So, if suddenly the RR interval changes to a value outside of the limits, the limits will never be adjusted.

My impression is that these limits should be updated from RR AVERAGE1 instead.

What do you think?

mrflibble:

--- Quote from: Karel on March 30, 2020, 03:33:55 pm ---Is it me or is there an error in the article?

In the section "Adjusting the Average RR Interval and Rate Limits" it is written that
the RR LOW LIMIT, RR HIGH LIMIT and RR MISSED LIMIT are derived from RR AVERAGE2.

But RR AVERAGE2 is updated only with RR intervals that are within the RR LOW LIMIT and RR HIGH LIMIT.
So, if suddenly the RR interval changes to a value outside of the limits, the limits will never be adjusted.

My impression is that these limits should be updated from RR AVERAGE1 instead.

What do you think?

--- End quote ---

Just read it, and I see no inconsistencies. Or more precisely, ASSUMING a valid start condition, I see no problems.

"So, if suddenly the RR interval changes to a value outside of the limits, the limits will never be adjusted."

And? The definition is the definition is the definition. Take your example ("suddenly the RR interval changes to a value outside of the limits, the limits will never be adjusted") and suppose the signal subsequently behaves as in my example here ("after the sudden excursion the RR interval returns to a boring median value"): then RR AVERAGE2 will have fulfilled its purpose, because the excursion was rejected instead of dragging the average around.

An alternative: get two other papers/books/whatever on the topic at hand. Waaaay back I learned that having just one source of information as your default learning plan is a bad plan. You are bound to run into errors, author pet opinions, learning-style mismatches, etcetera in the long run. When learning something new I always try to get at least three "decent" sources, because, well, humans. Spending a large amount of time just to decode one person's exposition is something for an era when the average person was lucky to own a book at all (i.e., the past). Besides, this paper is from 1985, so there should be plenty of similar material out there. With a bit of luck a different paper fills in some of the blanks...

SiliconWizard:
As I see it, the limits get adjusted from RR_AVERAGE2 for a purpose: they are refined at each iteration. What I haven't fully understood (didn't read/think about it enough) is how the whole thing gets "reset". Otherwise you might expect the limits to stay put or shrink at every iteration, so the possible range would get narrower and narrower. It needs a way to move the other way, I guess. Again, I didn't study it enough; the answer may be obvious.

mrflibble:

--- Quote from: SiliconWizard on April 01, 2020, 06:01:27 pm ---What I haven't fully understood (didn't read/think about it enough) is how the whole thing gets "reset". Otherwise you might expect the limits to stay put or shrink at every iteration, so the possible range would get narrower and narrower.

--- End quote ---

If I understood it correctly (big IF  ;D ) then the selection window for RR_AVERAGE2 can grow just fine. It looks like the relative window size is chosen such that: 1) it can grow in response to a slowly changing RR_AVERAGE2, and 2) it remains unchanged in the face of short-term perturbations, i.e. arrhythmias.

The selection criterion is 0.92*RR_AVERAGE2 <= RR <= 1.16*RR_AVERAGE2. So if an interval is within -8% / +16% of the current RR_AVERAGE2, it gets selected and included in the update of RR_AVERAGE2.
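For concreteness, here is the bookkeeping as I read it, as a Python sketch. The 0.92/1.16/1.66 factors and the two averages are from the paper; the class name, the fixed-length-8 buffers, and the seeding with the initial interval are my own assumptions:

```python
from collections import deque

class RRTracker:
    """Sketch of the Pan-Tompkins RR-interval bookkeeping.

    RR_AVERAGE1 averages the 8 most recent RR intervals;
    RR_AVERAGE2 averages the 8 most recent intervals that
    fell within the current low/high limits.
    """

    def __init__(self, initial_rr):
        self.recent = deque([initial_rr] * 8, maxlen=8)    # feeds RR_AVERAGE1
        self.selected = deque([initial_rr] * 8, maxlen=8)  # feeds RR_AVERAGE2

    @property
    def rr_average1(self):
        return sum(self.recent) / len(self.recent)

    @property
    def rr_average2(self):
        return sum(self.selected) / len(self.selected)

    @property
    def limits(self):
        # RR LOW / HIGH / MISSED limits, all derived from RR_AVERAGE2
        avg2 = self.rr_average2
        return 0.92 * avg2, 1.16 * avg2, 1.66 * avg2

    def update(self, rr):
        low, high, _missed = self.limits
        self.recent.append(rr)        # RR_AVERAGE1 takes every interval
        if low <= rr <= high:         # RR_AVERAGE2 only "normal" intervals
            self.selected.append(rr)
```

An out-of-window interval updates RR_AVERAGE1 but leaves RR_AVERAGE2 (and hence the limits) untouched, which is exactly the behavior Karel is asking about.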

Besides, in the healthy situation there is a defined "reset", as per equation (29):
if the 8 most recent intervals are all within the defined limits, the heart rate is labeled as stable,
and in that (stable) case the RR_AVERAGE2 value is set to RR_AVERAGE1.

So for a normal heart, with a skipped beat every now and then, the model works as intended. And continuing that example: a healthy heart with a slowly increasing RR_AVERAGE1 gives a slowly increasing RR_AVERAGE2, which gives a slowly widening selection window.

But it's not exactly foolproof. I can think of a bimodal oscillator that will neatly fsck things up. No idea whether that artificially constructed signal maps to any human physiology though. :-// For the exact case, probably not. But for the "close enough" case, I would not be surprised at all if nature came up with an intriguing failure mode...
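For the curious, a self-contained toy (my own construction, not from the paper) showing one such bimodal signal freezing the RR_AVERAGE2 update, because both modes fall outside the selection window:

```python
from collections import deque

# Toy bimodal RR sequence: intervals alternate between 0.70 s and 1.00 s.
# With RR_AVERAGE2 stuck near the midpoint, the selection window
# [0.92, 1.16] * RR_AVERAGE2 = [0.782, 0.986] s rejects both modes forever.
selected = deque([0.85] * 8, maxlen=8)   # assumed start: RR_AVERAGE2 = 0.85 s

for i in range(100):
    rr = 0.70 if i % 2 == 0 else 1.00
    avg2 = sum(selected) / len(selected)
    if 0.92 * avg2 <= rr <= 1.16 * avg2:
        selected.append(rr)

print(sum(selected) / len(selected))     # still ~0.85: nothing was ever accepted
```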

Karel:
Thank you for trying to help. I guess I wasn't completely clear. What I meant was:

"So, if suddenly the RR interval changes to a value outside of the limits and stays there, the limits will never be adjusted."

An example:

Let's say we have an ECG recording that starts with a stable heart rate of 60 bpm, i.e. RR = 1.0 s. The RR limits are then
0.92 s and 1.16 s, which correspond to roughly 51.7 bpm and 65.2 bpm.
The heart rate slowly moves up to 75 bpm (RR = 0.8 s) and stays there for some time. Because it all happens slowly, the limits
are adjusted along the way, ending up at 0.736 s and 0.928 s, i.e. roughly 64.7 bpm to 81.5 bpm.
Now, suddenly, the heart rate drops quickly back to 60 bpm (these things happen!). RR = 1.0 s is outside the limits, so the
high and low limits are never updated as long as the heart rate doesn't go back up to at least about 65 bpm for some time.
