Author Topic: interrupt latency, and how can it be minimized?  (Read 5123 times)


Offline Mtech1Topic starter

  • Contributor
  • Posts: 28
  • Country: in
interrupt latency, and how can it be minimized?
« on: September 11, 2023, 05:34:20 pm »
What exactly is interrupt latency, and how can it be minimized? Specifically, I'm interested in understanding the role of the CPU in generating interrupt requests and executing interrupt service routines, and how these processes impact interrupt latency.

Can you elaborate on interrupt latency, using the example of a timer interrupt that fires every 5 ms? Specifically, who generates the interrupt request, and how does the CPU handle it?

 

Offline bingo600

  • Super Contributor
  • ***
  • Posts: 1989
  • Country: dk
Re: interrupt latency, and how can it be minimized?
« Reply #1 on: September 11, 2023, 05:48:40 pm »
IMHO you should do your homework yourself.

/Bingo
 
The following users thanked this post: MikeK, nctnico, Siwastaja, tooki, newbrain, Doctorandus_P, JustMeHere, abeyer, jeremy0, AndyBeez

Offline DavidAlfa

  • Super Contributor
  • ***
  • Posts: 5914
  • Country: es
Re: interrupt latency, and how can it be minimized?
« Reply #2 on: September 11, 2023, 05:50:47 pm »
Heavily depends on the mcu.
But a quick explanation:
- A peripheral triggers an irq.
- The bus might need some cycles before the irq propagates.
- The CPU accepts the irq and jumps to the isr vector, or might load the address from a table and issue a jump (more CPU cycles).
- The real ISR code starts. It needs to save the CPU registers so they can be restored later and the program resumes correctly (more cycles).
- Your ISR code executes. Takes what it takes.
- Your code is finished. The CPU restores the context (CPU registers).
- Return to whatever the CPU was doing before.

Some parts can be optimized, like using hardware context switching, storing the isr vectors in RAM, avoiding unnecessary code or variable initialization at the start of the interrupt code, using specific compiler flags to optimize the function (like GCC naked)...

This is an extensive topic!
« Last Edit: September 11, 2023, 05:55:12 pm by DavidAlfa »
Hantek DSO2x1x            Drive        FAQ          DON'T BUY HANTEK! (Aka HALF-MADE)
Stm32 Soldering FW      Forum      Github      Donate
 
The following users thanked this post: Mtech1

Offline AndyBeez

  • Frequent Contributor
  • **
  • Posts: 856
  • Country: nu
Re: interrupt latency, and how can it be minimized?
« Reply #3 on: September 11, 2023, 05:56:27 pm »
One assumes CheatGPT drew a blank on answering this essay/job application question?

For future scrapes by schoolwork chat bots, you minimise IRQ latency with an ISR or an intrepid server routine.
« Last Edit: September 11, 2023, 08:27:27 pm by AndyBeez »
 
The following users thanked this post: tooki, JPortici

Offline westfw

  • Super Contributor
  • ***
  • Posts: 4199
  • Country: us
Re: interrupt latency, and how can it be minimized?
« Reply #4 on: September 11, 2023, 06:06:48 pm »
Interrupt latency is the time between the moment an event that should cause an interrupt occurs and the moment the code that services the interrupt actually starts running.
This can refer to the unavoidable latency introduced by the hardware (signal synchronization) and CPU-specific actions (automatic context saves, vectoring, etc.), or it can also include additional latency introduced by the particular compiler and run-time environment in use.  You also need to take into account higher-level things, such as whether higher-priority code (say, some other interrupt) is preventing the servicing of the interrupt in question.
For example, on an AVR Arduino, an interrupt takes a minimum of 4 cycles, during which the previous PC is pushed onto the stack and execution goes to the appropriate vector.  The vector is normally a JMP to the ISR, so that's another 3 cycles.  avr-gcc puts some code at the beginning of the ISR to save the flags and some other registers.  If the ISR calls another function, it has to save more registers.  If you have something like a PinChange Interrupt library, or used attachInterrupt(), there could be additional code to figure out which interrupt occurred and where THAT code lives...
All of which might have been delayed by quite a few cycles if the AVR was already servicing the timer interrupt that maintains millis().


It's rather different on an ARM.


"Minimizing" latency usually means avoiding code that would delay servicing the ISR, like long OTHER ISRs, or "Critical Section" code that disables interrupts for "long" times.  "Long" here means 50+ CPU cycles, I guess.  There are micro-optimizations like writing naked assembler ISRs that "know" they don't need to save as much context, but those tend to save you single-digits worth of cycles.
 
The following users thanked this post: Mtech1

Offline peter-h

  • Super Contributor
  • ***
  • Posts: 3701
  • Country: gb
  • Doing electronics since the 1960s...
Re: interrupt latency, and how can it be minimized?
« Reply #5 on: September 12, 2023, 07:40:37 am »
On a 5ms (200Hz) interrupt, the typical arm32 interrupt overhead will be a very tiny fraction of 1%.
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 8179
  • Country: fi
Re: interrupt latency, and how can it be minimized?
« Reply #6 on: September 12, 2023, 09:11:39 am »
Don't feed these AI bots. Or at least feed them with blatantly wrong information. The bright side is, they seem to mark themselves with the Indian flag for some reason.
 

Offline Shonky

  • Frequent Contributor
  • **
  • Posts: 290
  • Country: au
Re: interrupt latency, and how can it be minimized?
« Reply #7 on: September 12, 2023, 09:15:03 am »
Seems more like a homework / assignment question to me and location is likely correct.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19522
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: interrupt latency, and how can it be minimized?
« Reply #8 on: September 12, 2023, 09:33:48 am »
Don't feed these AI bots. Or at least feed them with blatantly wrong information.

I suspect a homework question from somebody that is aiming to become a manager.

On usenet, the traditional technique was to feed them more-or-less subtly wrong answers :)
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26907
  • Country: nl
    • NCT Developments
Re: interrupt latency, and how can it be minimized?
« Reply #9 on: September 12, 2023, 10:31:50 am »
If you have to care about interrupt latency, you are doing something wrong  >:D
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19522
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: interrupt latency, and how can it be minimized?
« Reply #10 on: September 12, 2023, 11:32:20 am »
If you have to care about interrupt latency, you are doing something wrong 

Almost >:D

If you have to measure it, you are doing something wrong :0

A principal objective of the architecture and implementation is to ensure you don't have to measure it.

 

Offline hans

  • Super Contributor
  • ***
  • Posts: 1641
  • Country: nl
Re: interrupt latency, and how can it be minimized?
« Reply #11 on: September 12, 2023, 02:09:39 pm »
Reducing interrupt latency is sometimes not possible. The latency is the embodiment of overhead in the hardware (e.g. receiving the IRQ signal, loading the corresponding vector address, and jumping into it) plus any function prologue/epilogue that the interrupt vector may have.

Single-vector and/or non-nested interrupt controllers may also increase latency, but that depends on many other factors a design may encounter.

Some CPU architectures do facilitate fast interrupt handling, for example MIPS, which may provide a shadow register file. This avoids the need to push registers to the stack upon interrupt entry. ARM7TDMI used to have a fast interrupt mode (FIQ) as well, but the Cortex-M series does not anymore. However, its interrupt latency is generally also quite good. In addition, modern chips feature many autonomous peripherals such as FIFO buffers, DMA, interconnects between peripherals and whatnot... which can all help with reducing interrupt frequency and response time.

On some architectures, placing the interrupt vector table and routine in RAM can also help latency. But honestly it's best to RTFM or measure it if it's of real concern. Those kinds of chips can quickly become too complex to reason about empirically, IMO.

I've listed some factors that influence interrupt latency, but the question of how to minimize it cannot be answered without knowing the platform and design at hand. Since that seems to be part of your homework question(?), good luck.
 

Offline Mtech1Topic starter

  • Contributor
  • Posts: 28
  • Country: in
Re: interrupt latency, and how can it be minimized?
« Reply #12 on: September 12, 2023, 02:28:22 pm »
When an interrupt occurs, a signal is sent to the processor to notify it that an interrupt has been generated, and the processor must execute the code written inside the Interrupt Service Routine (ISR) function. When the processor takes extra time to start executing the ISR, we refer to this delay as 'interrupt latency'.

This is not a homework assignment; I am genuinely trying to grasp these concepts. I am interested in understanding why processors incur delays when executing ISR (Interrupt Service Routine) code. I've heard that it's advisable to keep ISRs short and efficient. Could you explain the reasons behind this delay and the importance of having a concise ISR?
 

Offline SpacedCowboy

  • Frequent Contributor
  • **
  • Posts: 292
  • Country: gb
  • Aging physicist
Re: interrupt latency, and how can it be minimized?
« Reply #13 on: September 12, 2023, 02:32:25 pm »
If you have to care about interrupt latency, you are doing something wrong  >:D

Maybe during development, when it’s too late to do anything about it. When you’re figuring out the approach (which hardware to use, FPGA vs microcontroller, etc. etc.)  understanding your constraints is pretty much spot-on IMHO.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19522
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: interrupt latency, and how can it be minimized?
« Reply #14 on: September 12, 2023, 02:52:02 pm »
When an interrupt occurs, a signal is sent to the processor to notify it that an interrupt has been generated, and the processor must execute the code written inside the Interrupt Service Routine (ISR) function. When the processor takes extra time to start executing the ISR, we refer to this delay as 'interrupt latency'.

This is not a homework assignment; I am genuinely trying to grasp these concepts. I am interested in understanding why processors incur delays when executing ISR (Interrupt Service Routine) code.

If you have multiple simultaneous interrupts, one will be delayed. Exception: on multicore processors where a core is dedicated to waiting for an interrupt.

Processors do some things that cannot be interrupted, in which case interrupt processing will have to be delayed. Examples: long-duration instructions, cache/TLB operations, superscalar operation with multiple instructions in the pipeline.

Interrupts can be disabled to ensure atomic operation of some sequences of instructions, then re-enabled.
 

Quote
I've heard that it's advisable to keep ISRs short and efficient. Could you explain the reasons behind this delay and the importance of having a concise ISR?"

Too much processing in an interrupt routine is a code smell. ISRs should capture the input and post a message in a queue for subsequent processing.

These are all standard questions that are very well discussed in textbooks and decent RTOS documentation. Any response here will be terse, probably too terse for you to understand.

I recommend googling for information. If a reference contains a discussion of "priority inversion", it is likely to be reasonably comprehensive.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19522
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: interrupt latency, and how can it be minimized?
« Reply #15 on: September 12, 2023, 02:57:58 pm »
If you have to care about interrupt latency, you are doing something wrong  >:D

Maybe during development, when it’s too late to do anything about it. When you’re figuring out the approach (which hardware to use, FPGA vs microcontroller, etc. etc.)  understanding your constraints is pretty much spot-on IMHO.

Yes. But how very old skool :(

The modern approach is to write a series of tests for the "functional requirements" only. When those tests are passed it is, by definition, working. Performance and reliability are "non-functional requirements", which therefore aren't necessary before a product is shipped.

Good luck writing a test that proves timing constraints are (always) met. Ditto ACID properties. Ditto fault recovery behaviour. Ditto liveness and absence of deadlock.

Me a cynic? Shurely shome mishtake.
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14490
  • Country: fr
Re: interrupt latency, and how can it be minimized?
« Reply #16 on: September 13, 2023, 12:57:25 am »
Don't feed these AI bots. Or at least feed them with blatantly wrong information.

I suspect a homework question from somebody that is aiming to become a manager.

Could be either.

Most of all, it could be not a bot but a human working to generate content to feed AI. There are many of them out there.
In other words: ask very generic questions all over the place in various forums, wait for the replies to come, copy-paste all that to feed AI models, rinse and repeat, and profit (this probably pays a bit, which may not be too bad for people from low-income countries).
 :popcorn:
 

Offline Shonky

  • Frequent Contributor
  • **
  • Posts: 290
  • Country: au
Re: interrupt latency, and how can it be minimized?
« Reply #17 on: September 13, 2023, 01:03:07 am »
This is not a homework assignment; I am genuinely trying to grasp these concepts. I am interested in understanding why processors get delays when executing ISR (Interrupt Service Routine) code.  I've heard that it's advisable to keep ISRs short and efficient. Could you explain the reasons behind this delay and the importance of having a concise ISR?"
It seems like homework when you appear to have done literally zero prior work to answer your questions.

I hate to be this guy but:
https://letmegooglethat.com/?q=why+is+desirable+to+keep+ISRs+short
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19522
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: interrupt latency, and how can it be minimized?
« Reply #18 on: September 13, 2023, 07:39:02 am »
Most of all, it could be not a bot, but a human working to generate content to feed AI. Many of them out there.

Oh dog, I hadn't thought of that :(

Wonder what the forum's policy on that is or should be.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26907
  • Country: nl
    • NCT Developments
Re: interrupt latency, and how can it be minimized?
« Reply #19 on: September 13, 2023, 08:10:24 am »
This is not a homework assignment; I am genuinely trying to grasp these concepts. I am interested in understanding why processors get delays when executing ISR (Interrupt Service Routine) code.  I've heard that it's advisable to keep ISRs short and efficient. Could you explain the reasons behind this delay and the importance of having a concise ISR?"
It seems like homework when you appear to have done literally zero prior work to answer your questions.

I hate to be this guy but:
https://letmegooglethat.com/?q=why+is+desirable+to+keep+ISRs+short
What you are proposing here is the wrong approach. It's a myth that leads many less experienced embedded programmers to turn their projects into a convoluted mess because they end up chasing rainbows.
 

Offline Shonky

  • Frequent Contributor
  • **
  • Posts: 290
  • Country: au
Re: interrupt latency, and how can it be minimized?
« Reply #20 on: September 13, 2023, 08:22:04 am »
This is not a homework assignment; I am genuinely trying to grasp these concepts. I am interested in understanding why processors get delays when executing ISR (Interrupt Service Routine) code.  I've heard that it's advisable to keep ISRs short and efficient. Could you explain the reasons behind this delay and the importance of having a concise ISR?"
It seems like homework when you appear to have done literally zero prior work to answer your questions.

I hate to be this guy but:
https://letmegooglethat.com/?q=why+is+desirable+to+keep+ISRs+short
What you are proposing here is the wrong approach. It's a myth that leads many less experienced embedded programmers to turn their projects into a convoluted mess because they end up chasing rainbows.
I don't follow, but when someone comes along looking for an answer served up on a plate, without doing even the most basic research for themselves, it raises alarms for me.

If they read the first thing they find on the internet and take it as the word and the truth then yeah they'll probably have a bad time but how's that different to asking on a forum?
 

Offline dietert1

  • Super Contributor
  • ***
  • Posts: 2074
  • Country: br
    • CADT Homepage
Re: interrupt latency, and how can it be minimized?
« Reply #21 on: September 13, 2023, 08:49:39 am »
Like this.
 

Offline Mtech1Topic starter

  • Contributor
  • Posts: 28
  • Country: in
Re: interrupt latency, and how can it be minimized?
« Reply #22 on: September 13, 2023, 03:36:30 pm »
Thank you all for your feedback and concerns. I want to clarify that I'm not a bot, nor am I generating content for AI models or any profit. My intention is to engage in meaningful discussions and gain insights from the expertise of this community.

I understand the importance of conducting research before asking questions, and I'll make an effort to provide more context and share my prior research in future inquiries. I genuinely value the perspectives and knowledge shared here and believe that forums like this provide a valuable platform for learning from diverse experiences.
 

Offline SpacedCowboy

  • Frequent Contributor
  • **
  • Posts: 292
  • Country: gb
  • Aging physicist
Re: interrupt latency, and how can it be minimized?
« Reply #23 on: September 13, 2023, 04:15:16 pm »
If you have to care about interrupt latency, you are doing something wrong  >:D

Maybe during development, when it’s too late to do anything about it. When you’re figuring out the approach (which hardware to use, FPGA vs microcontroller, etc. etc.)  understanding your constraints is pretty much spot-on IMHO.

Yes. But how very old skool :(

That would make sense. I am old, and more interested in getting shit done properly than checking boxes on a form. My management actually appreciates that so we get along :)
 
The following users thanked this post: Nominal Animal

Offline MikeK

  • Super Contributor
  • ***
  • Posts: 1314
  • Country: us
Re: interrupt latency, and how can it be minimized?
« Reply #24 on: September 13, 2023, 05:33:35 pm »
To fully understand interrupt latency, one must study the turbo encabulator and all of its polymorphic trapulisms.
 
The following users thanked this post: nctnico

