Author Topic: Using Interrupt ... Where ? When ? Pro & Con ?  (Read 14613 times)


Offline BravoVTopic starter

  • Super Contributor
  • ***
  • Posts: 7547
  • Country: 00
  • +++ ATH1
Using Interrupt ... Where ? When ? Pro & Con ?
« on: October 24, 2015, 07:45:58 pm »
I've been following the discussion on using interrupts with great interest in this thread -> FPGA Vs µC

A bit of my background: as a hobbyist, I came from Atmel MCUs (no, not Arduino)  ^-^ , and used quite a bit of C and a tiny bit of assembly, just for fun & learning.  :P

Recently .. arrhhmm, actually it's almost a year now, I jumped ship to ARM, a TI M4F core to be exact. Progress is still slow, but during learning I was in awe of ARM's power, like when I understood the NVIC for the first time and thought: why not just do everything with interrupts, since they're so powerful (compared to 8-bit MCUs)? A noob's journey, I guess.  :-//

Now, having read the thread I pointed to: experienced fellows, please enlighten noobies like me on using interrupts on an MCU. Please discuss the pros, cons, when, where, etc. .. or never ?  ???

... preparing pop corn ...  :popcorn:

Offline Matje

  • Regular Contributor
  • *
  • Posts: 135
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #1 on: October 24, 2015, 11:19:37 pm »
Recently .. arrhhmm, actually it's almost a year now, I jumped ship to ARM, a TI M4F core to be exact. Progress is still slow, but during learning I was in awe of ARM's power, like when I understood the NVIC for the first time and thought: why not just do everything with interrupts, since they're so powerful (compared to 8-bit MCUs)? A noob's journey, I guess.  :-//

Now, having read the thread I pointed to: experienced fellows, please enlighten noobies like me on using interrupts on an MCU. Please discuss the pros, cons, when, where, etc. .. or never ?  ???

Where/when: if you need to react to some event immediately.

Pro: you can react immediately (well, maybe not, that depends).

Con: using too many interrupts enormously complicates things.

An interrupt can occur at any time. If, e.g., your program is reading a variable larger than the processor's word size (i.e. it needs more than one machine instruction to read it), then the value of that variable may change *while it is being read* if an ISR is changing it. Some hardware fiddling may be sensitive to timing, so you need to remember to disable interrupts for it.
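
As an illustration of that torn-read hazard, here's a minimal sketch assuming an AVR-style 8-bit target, so cli()/sei() from <avr/interrupt.h> are available; the tick counter and its timer ISR are hypothetical:

Code: [Select]
#include <stdint.h>
#include <avr/interrupt.h>

volatile uint32_t g_ms_ticks;            /* incremented by a timer ISR elsewhere */

/* Unsafe: on an 8-bit CPU this 32-bit read takes several instructions,
   so the timer ISR can fire halfway through and the caller sees a torn value. */
uint32_t get_ticks_unsafe(void)
{
    return g_ms_ticks;
}

/* Safe: briefly mask interrupts while taking a copy. */
uint32_t get_ticks_safe(void)
{
    uint32_t copy;
    cli();                               /* global interrupt disable */
    copy = g_ms_ticks;
    sei();                               /* re-enable (assumes interrupts were enabled before) */
    return copy;
}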

ISRs may nest, with priorities, or they may not. Either way, predicting the worst-case timing becomes "interesting", and with non-nesting ISRs or priorities the "react immediately" thing may practically disappear.

Depending on the actual workings of the program, nasty things like deadlocks and race conditions (the "variable changes while being read" issue is an example of that) can occur.

So I say use interrupts when you really need to for hard technical reasons. Don't overuse them, that turns nasty, fast. The human mind just isn't built for dealing with multiple things (potentially) happening in parallel.
 

Offline rx8pilot

  • Super Contributor
  • ***
  • Posts: 3634
  • Country: us
  • If you want more money, be more valuable.
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #2 on: October 24, 2015, 11:28:24 pm »

Where/when: if you need to react to some event immediately.

Pro: you can react immediately (well, maybe not, that depends).

Con: using too many interrupts enormously complicates things.

An interrupt can occur at any time. If, e.g., your program is reading a variable larger than the processor's word size (i.e. it needs more than one machine instruction to read it), then the value of that variable may change *while it is being read* if an ISR is changing it. Some hardware fiddling may be sensitive to timing, so you need to remember to disable interrupts for it.

ISRs may nest, with priorities, or they may not. Either way, predicting the worst-case timing becomes "interesting", and with non-nesting ISRs or priorities the "react immediately" thing may practically disappear.

Depending on the actual workings of the program, nasty things like deadlocks and race conditions (the "variable changes while being read" issue is an example of that) can occur.

So I say use interrupts when you really need to for hard technical reasons. Don't overuse them, that turns nasty, fast. The human mind just isn't built for dealing with multiple things (potentially) happening in parallel.

+1

I use them for timer overflows and 'emergency' inputs. My projects have a lot of hardware monitoring for fault conditions like over-current and over-voltage that rarely occur but need immediate attention when they do. Those sections trigger an interrupt where the ISR shuts down the offending area of the circuit gracefully and then goes on trying to figure out what just went wrong. For buttons, I generally poll during the TIMER ISR and simply set a flag if a button press is detected. I also have hardware de-bounce so I don't have any debounce code slowing down the ISR.
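
A rough sketch of that "poll in the timer ISR, latch a flag" pattern; the READ_BUTTON() macro and the ISR name are placeholders, and hardware debounce is assumed so the ISR only has to edge-detect:

Code: [Select]
#include <stdbool.h>

/* Placeholder for the real GPIO read of the (hardware-debounced) button. */
#define READ_BUTTON()  (false)

volatile bool g_button_event;            /* set by the ISR, cleared by the main loop */

void timer_tick_isr(void)                /* called from the periodic timer interrupt */
{
    static bool last_state;
    bool now = READ_BUTTON();

    if (now && !last_state)              /* rising edge = new press */
        g_button_event = true;
    last_state = now;
}

void main_loop(void)
{
    for (;;) {
        if (g_button_event) {            /* consume the event outside interrupt context */
            g_button_event = false;
            /* handle the press here */
        }
        /* ... rest of the main loop ... */
    }
}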

Factory400 - the worlds smallest factory. https://www.youtube.com/c/Factory400
 

Offline KL27x

  • Super Contributor
  • ***
  • Posts: 4099
  • Country: us
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #3 on: October 25, 2015, 12:40:10 am »
Quote
I also have hardware de-bounce so I don't have any debounce code slowing down the ISR.
It takes very little bandwidth to debounce buttons. You could debounce 50 button presses in the same total time it takes to do, say, an ADC reading or a simple mathematical calculation. At any reasonable clock speed and with good code, software debouncing is essentially free, aside from the code space. I don't see this slowing down the ISR. External debouncing can realistically free up internal resources if you are out of memory, I suppose. But in any higher-volume application, it will usually be cheaper to use more microcontroller and save on the external components.
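
For a sense of scale, here's a counter-based software debounce that could run from a periodic timer ISR — a sketch, with READ_BUTTON_RAW() standing in for the actual pin read and a 1 ms tick assumed; per tick it is only a compare and an increment:

Code: [Select]
#include <stdbool.h>
#include <stdint.h>

#define DEBOUNCE_TICKS 20u               /* ~20 ms at a 1 ms tick rate */

/* Placeholder for the raw (bouncy) pin read. */
#define READ_BUTTON_RAW()  (false)

volatile bool g_button_pressed;          /* debounced press, latched for the main loop */

void debounce_tick(void)                 /* call from the periodic timer ISR */
{
    static uint8_t counter;
    static bool    stable;

    bool raw = READ_BUTTON_RAW();

    if (raw == stable) {
        counter = 0;                     /* input agrees with the accepted state */
    } else if (++counter >= DEBOUNCE_TICKS) {
        stable  = raw;                   /* held long enough: accept the new state */
        counter = 0;
        if (stable)
            g_button_pressed = true;     /* latch a press event */
    }
}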
« Last Edit: October 25, 2015, 12:49:27 am by KL27x »
 

Offline TerminalJack505

  • Super Contributor
  • ***
  • Posts: 1310
  • Country: 00
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #4 on: October 25, 2015, 01:01:42 am »
My opinion regarding the use of interrupts is that you can/will make more complex use of them on less complex systems (i.e. lower-end hardware). The complexity will be easier to manage on a smaller system. You also won't likely have an RTOS to leverage.

Once your system gets complex enough that you employ an RTOS, you will likely use tasks to do much of the processing that you once did in an interrupt handler. There may still be a related ISR, but it will likely just save some data (an ADC value, for example) and signal a synchronization object (a semaphore, for example) that causes a task to do the bulk of the processing. Doing it this way makes the system more manageable and flexible. It also makes it easier to port to different hardware, since you'll have a cleaner delineation between hardware-related code (handled by the ISR) and hardware-neutral code (handled by the task).
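
As a sketch of that split, assuming a FreeRTOS-style API and a hypothetical ADC interrupt: the ISR only saves the data and gives a binary semaphore, and a task does the real processing:

Code: [Select]
#include <stdint.h>
#include "FreeRTOS.h"
#include "task.h"
#include "semphr.h"

static SemaphoreHandle_t adc_sem;
static volatile uint16_t adc_value;

extern uint16_t adc_read_result(void);       /* hypothetical low-level register read */
extern void     process_sample(uint16_t v);  /* hypothetical hardware-neutral processing */

void ADC_IRQHandler(void)                    /* hardware-related: save data, signal, leave */
{
    BaseType_t woken = pdFALSE;

    adc_value = adc_read_result();
    xSemaphoreGiveFromISR(adc_sem, &woken);
    portYIELD_FROM_ISR(woken);               /* switch to the task now if it has higher priority */
}

static void adc_task(void *arg)              /* hardware-neutral: bulk of the work happens here */
{
    (void)arg;
    for (;;) {
        if (xSemaphoreTake(adc_sem, portMAX_DELAY) == pdTRUE)
            process_sample(adc_value);
    }
}

void adc_task_init(void)
{
    adc_sem = xSemaphoreCreateBinary();
    xTaskCreate(adc_task, "adc", configMINIMAL_STACK_SIZE, NULL, tskIDLE_PRIORITY + 2, NULL);
}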

Edit: Said "mutex" but meant "semaphore."
« Last Edit: October 25, 2015, 01:28:43 am by TerminalJack505 »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26755
  • Country: nl
    • NCT Developments
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #5 on: October 25, 2015, 01:14:55 am »
Even with an OS you'll need to thoroughly check whether the system can meet its performance specs. You can't do without proper analysis and testing! Not so long ago I built an embedded video player using Linux. I spent some time profiling the software to make sure it didn't skip frames and didn't have to wait unnecessarily for things to happen (order of execution).
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Karel

  • Super Contributor
  • ***
  • Posts: 2214
  • Country: 00
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #6 on: October 25, 2015, 08:07:24 am »
Interrupts are very important, and many times there's no alternative but to use them.

An obvious example is reading incoming data from a peripheral like SPI/UART.
Even with a hardware FIFO of 8 or 16 bytes, it's practically impossible to do this with polling when data arrives at high speed.

The same goes when you connect an ADC to an MCU. If the ADC has a built-in clock and a "data ready" output,
the MCU needs to act very fast to prevent the data in the ADC from being overwritten by the next sample.
If the ADC does not have an internal clock but expects a "convert" pulse from the MCU,
you need to use a timer interrupt to avoid jitter in the output sample frequency.

So, although interrupts can be nasty and hard to debug, they are necessarily used a lot.
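
For the UART case, the usual shape is an RX interrupt that only stuffs bytes into a ring buffer, with the main loop draining it later. A sketch, with UART_READ_BYTE() and the vector name as placeholders:

Code: [Select]
#include <stdint.h>
#include <stdbool.h>

#define RX_BUF_SIZE 64u                   /* power of two for cheap index wrapping */

static volatile uint8_t rx_buf[RX_BUF_SIZE];
static volatile uint8_t rx_head;          /* written only by the ISR */
static volatile uint8_t rx_tail;          /* written only by the main loop */

/* Placeholder for the target's UART data register read. */
extern uint8_t UART_READ_BYTE(void);

void UART_RX_IRQHandler(void)             /* RX interrupt: just stash the byte, no processing */
{
    uint8_t next = (uint8_t)((rx_head + 1u) & (RX_BUF_SIZE - 1u));
    uint8_t byte = UART_READ_BYTE();      /* reading the data register also clears the IRQ on many parts */

    if (next != rx_tail) {                /* drop the byte if the buffer is full */
        rx_buf[rx_head] = byte;
        rx_head = next;
    }
}

bool uart_get_byte(uint8_t *out)          /* main loop: pull bytes out at leisure */
{
    if (rx_tail == rx_head)
        return false;                     /* empty */
    *out = rx_buf[rx_tail];
    rx_tail = (uint8_t)((rx_tail + 1u) & (RX_BUF_SIZE - 1u));
    return true;
}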

 

Offline legacy

  • Super Contributor
  • ***
  • !
  • Posts: 4415
  • Country: ch
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #7 on: October 25, 2015, 08:27:25 am »
Interrupts are hard to debug if you do not have a true ICE.
 

Offline hamdi.tn

  • Frequent Contributor
  • **
  • Posts: 623
  • Country: tn
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #8 on: October 25, 2015, 08:55:26 am »
I think 8-, 16- and 32-bit variables are written in one operation on a 32-bit MCU, so the write can't be interrupted, and when other code tries to read the variable it will end up with a correct value, not half old, half new. I'm using STM32 a lot and have been asked to prove that, but I can't find any information in ST's documents.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19281
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #9 on: October 25, 2015, 09:06:31 am »
Interrupts are very important, and many times there's no alternative but to use them.

They can always be avoided, but your hardware might make it impractical.

Don't believe it? Understand how the XMOS processors achieve it and still maintain hard real-time performance, with instruction timing guaranteed by the compiler and IDE.

Short answer: up to 32 small cores, each spinlooping on an input, no caches, and interprocessor communication based around Hoare's CSP, i.e. something that has a good theoretical, analysable behaviour. See Digikey for prices, which are remarkably low (£2 - £23, one off).
« Last Edit: October 25, 2015, 10:50:03 am by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline legacy

  • Super Contributor
  • ***
  • !
  • Posts: 4415
  • Country: ch
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #10 on: October 25, 2015, 10:14:46 am »
MIPS is also safe in that regard, even if the MIPS R3K is a bit more involved than MIPS32 on that point.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26755
  • Country: nl
    • NCT Developments
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #11 on: October 25, 2015, 11:08:10 am »
Interrupts are hard to debug if you do not have a true ICE.
Utter nonsense. Just set a GPIO on entry and clear on exit. edit: and check with an oscilloscope / logic analyser.
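
For illustration, the whole trick is two lines in the ISR; DEBUG_PIN_HIGH()/DEBUG_PIN_LOW() below are placeholders for the target's fastest GPIO write, and the vector name is hypothetical:

Code: [Select]
/* Placeholders for a fast GPIO write, e.g. a bit set/reset register. */
#define DEBUG_PIN_HIGH()   /* e.g. GPIOA->BSRR = (1u << 5) on an STM32 */
#define DEBUG_PIN_LOW()    /* e.g. GPIOA->BSRR = (1u << (5 + 16))      */

void TIM2_IRQHandler(void)
{
    DEBUG_PIN_HIGH();      /* pulse width on the scope = ISR execution time,
                              pulse spacing = interrupt rate and jitter      */

    /* ... the real interrupt work ... */

    DEBUG_PIN_LOW();
}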
« Last Edit: October 25, 2015, 11:50:53 am by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline dmills

  • Super Contributor
  • ***
  • Posts: 2093
  • Country: gb
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #12 on: October 25, 2015, 11:18:17 am »
However, you do need to be careful about read-modify-write: something like x += y; can turn into a load, an arithmetic operation and a store, so an update made by an interrupt that fires in the middle can be lost. Obvious, but surprisingly easy to miss when writing in C (and the resulting race can easily be a once-in-six-months thing).
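
A sketch of that lost-update race and the blunt fix, assuming a Cortex-M target so the CMSIS PRIMASK intrinsics are available (the device header name is hypothetical):

Code: [Select]
#include <stdint.h>
#include "stm32f4xx.h"                    /* hypothetical device header; provides the CMSIS intrinsics */

volatile uint32_t error_count;            /* also incremented from an ISR */

/* Racy: compiles to load / add / store, so an ISR increment that lands
   between the load and the store is silently lost. */
void count_error_racy(void)
{
    error_count += 1;
}

/* Safe: mask interrupts around the read-modify-write,
   restoring the previous mask state afterwards. */
void count_error_safe(void)
{
    uint32_t primask = __get_PRIMASK();
    __disable_irq();
    error_count += 1;
    __set_PRIMASK(primask);
}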

Interrupts are very convenient in some ways (Sometimes to the point of being the only sane way to do things), but  NEVER forget that they can mess with your program flow almost anywhere.....

Regards, Dan.
 

Offline SL4P

  • Super Contributor
  • ***
  • Posts: 2318
  • Country: au
  • There's more value if you figure it out yourself!
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #13 on: October 25, 2015, 12:01:45 pm »
The best thing about understanding and using interrupts is - event driven programming...
(look it up)
Once you've mastered when to use them, your code will make a lot more sense, and your functions end up being called (usually) according to a more logical strategy.
I guess it also prompts you to think more carefully about variable scope, factoring and use of functions in general...
Don't ask a question if you aren't willing to listen to the answer.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19281
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #14 on: October 25, 2015, 12:53:20 pm »
The best thing about understanding and using interrupts is - event driven programming...
(look it up)
Once you've mastered when to use them, your code will make a lot more sense, and your functions end up being called (usually) according to a more logical strategy.
I guess it also prompts you to think more carefully about variable scope, factoring and use of functions in general...

Yes indeed.

Make sure whatever you read discusses the merits of finite state machines (FSMs) and their formal representation. Harel Statecharts are a well-known version of that, but by no means the only one.

For inter-processor communication, use messages to convey events and/or Hoare's CSP.

Note - all of that has been proven to work well and reliably over the last 30+ years. Don't reinvent wheels, especially if you make them elliptical!
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline hamdi.tn

  • Frequent Contributor
  • **
  • Posts: 623
  • Country: tn
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #15 on: October 25, 2015, 10:27:49 pm »
I think 8-, 16- and 32-bit variables are written in one operation on a 32-bit MCU, so the write can't be interrupted, and when other code tries to read the variable it will end up with a correct value, not half old, half new. I'm using STM32 a lot and have been asked to prove that, but I can't find any information in ST's documents.

You may find the answer in an ARM architecture reference manual. If the processor saves the return (from interrupt) address as the next instruction to be executed, then that indicates the instruction that was executing when the interrupt occurred will be completed before branching to the interrupt service routine.

If it were possible to get into a situation where a variable is only partially updated by an unfinished instruction execution then the architecture would have to provide a means to detect it. Otherwise it would be chaos. That doesn't happen.

Yep, I did find the answer ... but I don't quite get what they are trying to say  :-DD

http://www.st.com/st-web-ui/static/active/en/resource/technical/document/programming_manual/CD00228163.pdf

- First, I now know that a higher-priority interrupt will always be executed even if the processor is handling another, lower-priority interrupt (that explains some good unexpected behaviour and some weird unexpected behaviour I've seen).
- I understand that there is a mechanism to prevent multiple accesses to memory, so the processor will not execute two operations on the same memory location simultaneously, but this requires executing some instructions before the memory operation. The question is: does the compiler add those instructions?
- A memory error exception does exist ... so  :scared:
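
For what it's worth, those exclusive-access instructions (LDREX/STREX, on Cortex-M3 and up) are not inserted by the compiler for plain C assignments (C11 atomics are a different story); you normally invoke them yourself, e.g. through the CMSIS intrinsics. A sketch of an atomic add built that way, with a hypothetical device header name:

Code: [Select]
#include <stdint.h>
#include "stm32f1xx.h"                    /* hypothetical device header; provides __LDREXW/__STREXW */

/* Atomically add 'delta' to *addr. The store-exclusive fails (returns non-zero)
   if anything else touched the location between the LDREX and the STREX,
   in which case we simply retry. */
static inline void atomic_add_u32(volatile uint32_t *addr, uint32_t delta)
{
    uint32_t value;
    do {
        value = __LDREXW(addr) + delta;
    } while (__STREXW(value, addr) != 0u);
}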
 

Offline IanB

  • Super Contributor
  • ***
  • Posts: 11790
  • Country: us
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #16 on: October 25, 2015, 10:41:20 pm »
The short answer is that all computers are interrupt driven, all the time. The more complicated answer is that on a desktop OS the existence of interrupts is largely concealed and you might never know they are happening. But even then, the hardware interrupts are often translated by the operating system (via device drivers) into "software interrupts", or events, and in your program code you write event handlers to deal with them. For example, in a graphical program when someone clicks on a bit of the screen with the mouse button, a "mouse button 1 down" event is generated and some code somewhere reacts to that in an event handler.

So with this in mind, the way you deal with interrupts depends on how close you are to the hardware. If you are programming a raw microcontroller and you don't have an operating system to manage it for you, then most of your code (all of it?) will be interrupt handlers. You will react to timer interrupts for regular events, and I/O interrupts for irregular events. Your code will look like a collection of finite state machines, and each interrupt handler will change the state of one or more of these machines.
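
A sketch of that shape — a tiny state machine advanced by flags that the interrupt handlers set; the event flags and driver calls below are hypothetical:

Code: [Select]
#include <stdbool.h>

typedef enum { ST_IDLE, ST_MEASURING, ST_REPORTING } app_state_t;

static volatile bool ev_tick;             /* set by a timer ISR */
static volatile bool ev_adc_done;         /* set by the ADC ISR */

extern void start_adc_conversion(void);   /* hypothetical driver call */
extern void send_result(void);            /* hypothetical driver call */

static app_state_t state = ST_IDLE;

void app_poll(void)                       /* run repeatedly from the main loop */
{
    switch (state) {
    case ST_IDLE:
        if (ev_tick) {                    /* periodic tick kicks off a measurement */
            ev_tick = false;
            start_adc_conversion();
            state = ST_MEASURING;
        }
        break;

    case ST_MEASURING:
        if (ev_adc_done) {
            ev_adc_done = false;
            state = ST_REPORTING;
        }
        break;

    case ST_REPORTING:
        send_result();
        state = ST_IDLE;
        break;
    }
}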
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19281
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #17 on: October 25, 2015, 10:55:58 pm »
The short answer is that all computers are interrupt driven, all the time.

False, both historically and now.

High reliability computers, very simple computers, and those with tight or predictable latency requirements tend to be based around polling.

The results of polling may be turned into software events, especially in the form of messages, which are consumed by FSMs.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19281
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #18 on: October 25, 2015, 10:59:03 pm »
- First, I now know that a higher-priority interrupt will always be executed even if the processor is handling another, lower-priority interrupt (that explains some good unexpected behaviour and some weird unexpected behaviour I've seen).

Beginners often don't understand the causes (and ways to avoid) "priority inversion".
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline hamdi.tn

  • Frequent Contributor
  • **
  • Posts: 623
  • Country: tn
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #19 on: October 25, 2015, 11:11:58 pm »
- First, I now know that a higher-priority interrupt will always be executed even if the processor is handling another, lower-priority interrupt (that explains some good unexpected behaviour and some weird unexpected behaviour I've seen).

Beginners often don't understand the causes (and ways to avoid) "priority inversion".

they do now   ;D
 

Offline BravoVTopic starter

  • Super Contributor
  • ***
  • Posts: 7547
  • Country: 00
  • +++ ATH1
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #20 on: October 26, 2015, 04:42:37 am »
High reliability computers, very simple computers, and those with tight or predictable latency requirements tend to be based around polling.

Noted, never realized this important matter.  :-+

Offline BravoVTopic starter

  • Super Contributor
  • ***
  • Posts: 7547
  • Country: 00
  • +++ ATH1
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #21 on: October 26, 2015, 04:43:29 am »
Are there good examples or scenarios from your previous projects where you just could NOT escape using an interrupt? Please share.

Of course, I understand the different scales of complexity; let's narrow the scope of this topic to microcontrollers, from 8-bit up to the ARM "micro"-controller class, not the big-processor class like the beastly Intel CPU in a desktop PC or a high-performance multi-core ARM.
« Last Edit: October 26, 2015, 05:28:04 am by BravoV »
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19281
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #22 on: October 26, 2015, 07:50:16 am »
Are there good examples or scenarios from your previous projects where you just could NOT escape using an interrupt?

In theory there are no such examples. Of course, the difference between theory and practice is that in theory there is no difference in practice, but in practice...

To make that more explicit, look at the XMOS processors which - to all intents and purposes - don't use interrupts. The technique they use is to have many small, simple cores, with each core dealing with "one" input source and spinlooping until there is a relevant input. That gives a minimum latency, a predictable latency and, since all other aspects of the cores are predictable (e.g. no caches), predictable response times. They also have decent inter-core comms mechanisms. (And they are cheap, and have cheap devkits - see DigiKey.)

Of course, if you don't have enough processors then you may need to multiplex multiple inputs "into" the core, and the traditional mechanism for that is interrupts. But interrupts come with significant disadvantages w.r.t. design predictability and, on big iron, implementation reliability problems. For the latter just look at the Itanic, but the HPC mob find problems in other machines as well.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Brutte

  • Frequent Contributor
  • **
  • Posts: 614
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #23 on: October 26, 2015, 09:18:13 am »
There are events that can wait and there are those that cannot, with a continuum of constraints spanning in between.
As it is impossible to service n events with p < n cores instantly, the whole idea is to meet the events' deadlines with the limited available resources. The known solution is to pick some sequence at run time and process accordingly. You can do the sorting with the NVIC, you can do it with bubblesort(), or you can even poll() or roundrobin() if all of the events meet their deadlines. The sequence is arbitrary and there are no generic commandments like an „Ultimate Sequence” or „Thou shalt not poll on Sundays”.

Mind you, in a typical real-world scenario it is the event's deadline that has to be met, so it is the return (ret) from servicing the event that matters, not the entry. However, the NVIC (Cortex-M) is not able to sort the events/IRQs based on their completion deadlines, as that would be harder to implement. Sorting based on IRQ trigger time plus a priority scheme only approximates "deadline sorting", but in most cases it does the job (do not ask me for a formal proof). Of course
Code: [Select]
assert(now < deadline);
won't hurt at ret.
 

Online mikerj

  • Super Contributor
  • ***
  • Posts: 3233
  • Country: gb
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #24 on: October 26, 2015, 01:28:34 pm »
Interrupts are very important, and many times there's no alternative but to use them.

They can always be avoided, but your hardware might make it impractical.

They could be avoided given suitably designed hardware, but given a single-core micro and high-bandwidth asynchronous peripherals, impractical can easily become impossible.
 

