Author Topic: Using Interrupt ... Where ? When ? Pro & Con ?  (Read 14704 times)


Offline BravoVTopic starter

  • Super Contributor
  • ***
  • Posts: 7547
  • Country: 00
  • +++ ATH1
Using Interrupt ... Where ? When ? Pro & Con ?
« on: October 24, 2015, 07:45:58 pm »
I've been following the discussion on using interrupts in this thread with great interest -> FPGA Vs µC

A bit of my background: as a hobbyist I came from Atmel MCUs (no, not Arduino)  ^-^ , and used quite a bit of C and a very tiny bit of assembly, just for fun & learning.  :P

Recently .. arrhhmm, actually it's almost a year now, I jumped ship to ARM, a TI M4F core to be exact. Progress is still slow, but while learning I was in awe of ARM's power. When I understood the NVIC for the first time I wondered why not just do everything with interrupts, since they're so powerful (compared to 8-bit MCUs). A noob's journey, I guess.  :-//

Now, having read the thread I pointed to: experienced fellows, please enlighten us noobies on using interrupts on an MCU. Discuss the pros, cons, when, where, etc. .. or never ?  ???

... preparing pop corn ...  :popcorn:

Offline Matje

  • Regular Contributor
  • *
  • Posts: 135
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #1 on: October 24, 2015, 11:19:37 pm »
Recently .. arrhhmm, actually it's almost a year now, I jumped ship to ARM, a TI M4F core to be exact. Progress is still slow, but while learning I was in awe of ARM's power. When I understood the NVIC for the first time I wondered why not just do everything with interrupts, since they're so powerful (compared to 8-bit MCUs). A noob's journey, I guess.  :-//

Now, having read the thread I pointed to: experienced fellows, please enlighten us noobies on using interrupts on an MCU. Discuss the pros, cons, when, where, etc. .. or never ?  ???

Where/when: if you need to react to some event immediately.

Pro: you can react immediately (well, maybe not, that depends).

Con: using too many interrupts enormously complicates things.

An interrupt can occur at any time. If e.g. your program is reading a variable larger than the processor's word size (i.e. it needs more than one machine instruction to read), then the value of that variable may change *while it is being read* if the ISR is changing it. Some hardware fiddling may be sensitive to timing, so you need to remember to disable interrupts around it.
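A minimal sketch of that hazard, with made-up names; assume a 32-bit counter updated by a timer ISR on an 8-bit MCU, so reading it takes several instructions:
Code: [Select]
#include <stdint.h>

extern void disable_irq(void);   /* stand-ins for your toolchain's interrupt-mask */
extern void enable_irq(void);    /* intrinsics, e.g. cli()/sei() on AVR           */

volatile uint32_t tick_count;    /* incremented from a timer ISR */

/* Risky on an 8-bit MCU: this read takes several instructions, so the ISR
 * can fire in the middle and return half-old / half-new bytes. */
uint32_t get_ticks_racy(void)
{
    return tick_count;
}

/* Better: mask interrupts briefly so the multi-byte read can't be torn. */
uint32_t get_ticks_safe(void)
{
    uint32_t copy;

    disable_irq();
    copy = tick_count;
    enable_irq();
    return copy;
}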

ISRs may nest, with priorities, or they may not. Either way, predicting the worst-case timing becomes "interesting", and with non-nesting ISRs or priorities the "react immediately" property may practically disappear.

Depending on the actual workings of the program, nasty things like deadlocks and race conditions (the "variable changes while being read" case is an example) can occur.

So I say: use interrupts when you really need them for hard technical reasons. Don't overuse them; that turns nasty, fast. The human mind just isn't built for dealing with multiple things (potentially) happening in parallel.
 

Offline rx8pilot

  • Super Contributor
  • ***
  • Posts: 3634
  • Country: us
  • If you want more money, be more valuable.
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #2 on: October 24, 2015, 11:28:24 pm »

Where/when: if you need to react to some event immediately.

Pro: you can react immediately (well, maybe not, that depends).

Con: using too many interrupts enormously complicates things.

An interrupt can occur at any time. If e.g. your program is reading a variable larger than the processor's word size (i.e. it needs more than one machine instruction to read), then the value of that variable may change *while it is being read* if the ISR is changing it. Some hardware fiddling may be sensitive to timing, so you need to remember to disable interrupts around it.

ISRs may nest, with priorities, or they may not. Either way, predicting the worst-case timing becomes "interesting", and with non-nesting ISRs or priorities the "react immediately" property may practically disappear.

Depending on the actual workings of the program, nasty things like deadlocks and race conditions (the "variable changes while being read" case is an example) can occur.

So I say: use interrupts when you really need them for hard technical reasons. Don't overuse them; that turns nasty, fast. The human mind just isn't built for dealing with multiple things (potentially) happening in parallel.

+1

I use them for timer overflows and 'emergency' inputs. My projects have a lot of hardware monitoring for fault conditions like over-current and over-voltage that rarely occur but need immediate attention when they do. Those circuits trigger an interrupt whose ISR shuts down the offending area of the circuit gracefully and then goes on trying to figure out what just went wrong. For buttons, I generally poll during the timer ISR and simply set a flag if a button press is detected. I also have hardware debounce so I don't have any debounce code slowing down the ISR.
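Roughly like this (names are just placeholders; the button is already hardware-debounced, so the timer ISR only samples it and raises a flag):
Code: [Select]
#include <stdbool.h>

extern bool read_button_pin(void);   /* stand-in for reading the (debounced) GPIO */

volatile bool button_pressed;        /* flag consumed by the main loop */

/* Called from the periodic timer interrupt, e.g. every 1 ms. */
void timer_tick_isr(void)
{
    static bool last_state;
    bool now = read_button_pin();

    if (now && !last_state)          /* rising edge = a new press */
        button_pressed = true;

    last_state = now;
}

/* Somewhere in the main loop: */
void poll_button_flag(void)
{
    if (button_pressed) {
        button_pressed = false;
        /* act on the key press here, outside the ISR */
    }
}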

Factory400 - the worlds smallest factory. https://www.youtube.com/c/Factory400
 

Offline KL27x

  • Super Contributor
  • ***
  • Posts: 4099
  • Country: us
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #3 on: October 25, 2015, 12:40:10 am »
Quote
I also have hardware de-bounce so I don't have any debounce code slowing down the ISR.
It takes very little bandwidth to debounce buttons. You could debounce 50 button presses in the same total time it takes to do, say, an ADC reading or a simple mathematical calculation. At any reasonable clock speed and with good code, software debouncing is essentially free, aside from the code space. I don't see this slowing down the ISR. External debouncing can realistically free up internal resources if you are out of memory, I suppose. But in any higher-volume application, it will usually be cheaper to use more microcontroller and save on the external components.
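For example, a counter-based debounce along these lines costs only a handful of instructions per tick (a sketch with made-up names, assuming a 1 ms timer ISR):
Code: [Select]
#include <stdbool.h>
#include <stdint.h>

extern bool read_raw_button(void);   /* stand-in for the raw GPIO read */

volatile bool button_stable;         /* debounced state seen by the rest of the code */

/* Call from a 1 ms timer ISR: a read, a compare, an increment - that's it. */
void debounce_tick(void)
{
    static uint8_t count;
    bool raw = read_raw_button();

    if (raw == button_stable) {
        count = 0;                   /* input agrees with the accepted state */
    } else if (++count >= 20u) {     /* ~20 ms of consistent disagreement */
        button_stable = raw;
        count = 0;
    }
}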
« Last Edit: October 25, 2015, 12:49:27 am by KL27x »
 

Offline TerminalJack505

  • Super Contributor
  • ***
  • Posts: 1310
  • Country: 00
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #4 on: October 25, 2015, 01:01:42 am »
My opinion regarding the use of interrupts is that you can/will make more complex use of them on less complex systems (=lower-end hardware.)  The complexity will be easier to manage on a smaller system.  You also won't likely have an RTOS to leverage.

Once your system gets complex enough that you employ an RTOS, you will likely use tasks to do much of the processing that you once did in an interrupt handler.  There may still be a related ISR but it will likely just save some data (ADC value, for example) and flag a synchronization object (semaphore, for example) that causes a task to do the bulk of the processing.  Doing it this way makes the system more manageable and flexible.  It also makes it easier to port to different hardware since you'll have a cleaner delineation between hardware-related code (handled by the ISR) and hardware-neutral code (handled by the task.)
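A minimal sketch of that ISR-plus-task split, using FreeRTOS-style calls as the example (the names and the adc_read_result() helper are placeholders, not any particular vendor's API):
Code: [Select]
#include "FreeRTOS.h"
#include "semphr.h"
#include "task.h"
#include <stdint.h>

extern uint16_t adc_read_result(void);      /* stand-in for the hardware-specific read */

static SemaphoreHandle_t adc_sem;
static volatile uint16_t adc_value;

/* ISR: grab the data, signal the task, get out quickly. */
void adc_isr(void)
{
    BaseType_t woken = pdFALSE;

    adc_value = adc_read_result();
    xSemaphoreGiveFromISR(adc_sem, &woken);
    portYIELD_FROM_ISR(woken);
}

/* Task: does the bulk of the (hardware-neutral) processing. */
static void adc_task(void *arg)
{
    (void)arg;
    for (;;) {
        if (xSemaphoreTake(adc_sem, portMAX_DELAY) == pdTRUE) {
            uint16_t sample = adc_value;
            /* filter, scale, log, update the display, ... */
            (void)sample;
        }
    }
}

void adc_tasking_init(void)
{
    adc_sem = xSemaphoreCreateBinary();
    xTaskCreate(adc_task, "adc", 256, NULL, tskIDLE_PRIORITY + 2, NULL);
}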

Edit: Said "mutex" but meant "semaphore."
« Last Edit: October 25, 2015, 01:28:43 am by TerminalJack505 »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26891
  • Country: nl
    • NCT Developments
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #5 on: October 25, 2015, 01:14:55 am »
Even with an OS you'll need to thoroughly check whether the system can meet its performance specs. You can't do without proper analysis and testing! Not so long ago I built an embedded video player using Linux. I spent some time profiling the software to make sure the software/hardware didn't skip frames and didn't have to wait unnecessarily for things to happen (order of execution).
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Karel

  • Super Contributor
  • ***
  • Posts: 2217
  • Country: 00
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #6 on: October 25, 2015, 08:07:24 am »
Interrupts are very important, and many times there's no alternative to using them.

An obvious example is reading incoming data from a peripheral like an SPI or UART.
Even with a hardware FIFO of 8 or 16 bytes, it's practically impossible to do this with polling when data arrives at high speed.

The same goes when you connect an ADC to an MCU. If the ADC has a built-in clock and a "data ready" output,
the MCU needs to act very fast to prevent the data in the ADC from being overwritten by the next sample.
If the ADC does not have an internal clock but expects a "convert" pulse from the MCU,
you need to use a timer interrupt to avoid jitter in the output sample frequency.

So, although interrupts can be nasty and hard to debug, they are used a lot out of necessity.
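For the UART case, the usual interrupt-driven receive is roughly this (a sketch; the register access is a placeholder for whatever your vendor header provides):
Code: [Select]
#include <stdbool.h>
#include <stdint.h>

extern uint8_t uart_read_data_register(void);   /* stand-in for reading the RX register */

#define RX_BUF_SIZE 64u                          /* power of two keeps the wrap cheap */

static volatile uint8_t rx_buf[RX_BUF_SIZE];
static volatile uint8_t rx_head;                 /* written only by the ISR */
static volatile uint8_t rx_tail;                 /* written only by the main code */

/* UART receive interrupt: move the byte into the buffer and leave. */
void uart_rx_isr(void)
{
    uint8_t byte = uart_read_data_register();
    uint8_t next = (uint8_t)((rx_head + 1u) % RX_BUF_SIZE);

    if (next != rx_tail) {                       /* drop the byte if the buffer is full */
        rx_buf[rx_head] = byte;
        rx_head = next;
    }
}

/* The main code pulls bytes out at its leisure. */
bool uart_get_byte(uint8_t *out)
{
    if (rx_tail == rx_head)
        return false;                            /* buffer empty */
    *out = rx_buf[rx_tail];
    rx_tail = (uint8_t)((rx_tail + 1u) % RX_BUF_SIZE);
    return true;
}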

 

Offline legacy

  • Super Contributor
  • ***
  • !
  • Posts: 4415
  • Country: ch
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #7 on: October 25, 2015, 08:27:25 am »
Interrupts are hard to debug if you do not have a true ICE.
 

Offline hamdi.tn

  • Frequent Contributor
  • **
  • Posts: 623
  • Country: tn
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #8 on: October 25, 2015, 08:55:26 am »
I think 8-, 16- and 32-bit variables are written in one operation on a 32-bit MCU, so the write can't be interrupted, and when other code tries to read the variable it will end up with a correct value, not half old / half new. I'm using the STM32 a lot and have been asked to prove that, but I can't find any information in the ST documents.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #9 on: October 25, 2015, 09:06:31 am »
Interrupts are very important, and many times there's no alternative to using them.

They can always be avoided, but your hardware might make it impractical.

Don't believe it? Understand how the XMOS processors achieve it and still maintain hard real-time performance, with instruction timing guaranteed by the compiler and IDE.

Short answer: up to 32 small cores, each spin-looping on an input, no caches, and inter-processor communication based on Hoare's CSP, i.e. something that has a good theoretical, analysable behaviour. See Digikey for prices, which are remarkably low (£2 - £23, one-off).
« Last Edit: October 25, 2015, 10:50:03 am by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline legacy

  • Super Contributor
  • ***
  • !
  • Posts: 4415
  • Country: ch
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #10 on: October 25, 2015, 10:14:46 am »
MIPS is also safe in that respect, even if the MIPS R3K is a bit more complex than MIPS32 in that regard.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26891
  • Country: nl
    • NCT Developments
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #11 on: October 25, 2015, 11:08:10 am »
Interrupts are hard to debug if you do not have a true ICE.
Utter nonsense. Just set a GPIO on entry and clear it on exit. Edit: and check with an oscilloscope / logic analyser.
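Something like this (gpio_set()/gpio_clear() stand in for whatever direct pin write your part has):
Code: [Select]
extern void gpio_set(void);      /* stand-ins for a direct pin write, */
extern void gpio_clear(void);    /* e.g. a port register or BSRR access */

void some_isr(void)
{
    gpio_set();                  /* pin high: the scope shows when the ISR was entered */

    /* ... the actual interrupt work ... */

    gpio_clear();                /* pin low: pulse width = ISR execution time,
                                    spacing between pulses = interrupt rate */
}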
« Last Edit: October 25, 2015, 11:50:53 am by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline dmills

  • Super Contributor
  • ***
  • Posts: 2093
  • Country: gb
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #12 on: October 25, 2015, 11:18:17 am »
However, you do need to be careful about read-modify-write: something like x += y; can turn into a load, an arithmetic operation and a store, so an interrupt in the middle that changes x can have its update lost. Obvious, but surprisingly easy to miss when writing in C (and the resulting race can easily be a once-in-6-months thing).
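A sketch of that lost-update race and the usual fix (disable_irq()/enable_irq() stand in for your toolchain's intrinsics, e.g. cli()/sei() on AVR or __disable_irq()/__enable_irq() on a Cortex-M):
Code: [Select]
#include <stdint.h>

extern void disable_irq(void);   /* stand-ins for the real interrupt-mask intrinsics */
extern void enable_irq(void);

volatile uint32_t x;             /* also written from an ISR */

void racy_add(uint32_t y)
{
    x += y;                      /* load x, add y, store x: an ISR that writes x
                                    between the load and the store gets overwritten */
}

void safe_add(uint32_t y)
{
    disable_irq();               /* make the read-modify-write atomic */
    x += y;
    enable_irq();
}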

Interrupts are very convenient in some ways (sometimes to the point of being the only sane way to do things), but NEVER forget that they can mess with your program flow almost anywhere.....

Regards, Dan.
 

Offline SL4P

  • Super Contributor
  • ***
  • Posts: 2318
  • Country: au
  • There's more value if you figure it out yourself!
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #13 on: October 25, 2015, 12:01:45 pm »
The best thing about understanding and using interrupts is - event driven programming...
(look it up)
Once you've mastered when to use them, your code will make a lot more sense, and functions end up being called (usually) in a more logical order.
I guess it also prompts you to think more carefully about variable scope, factoring and use of functions in general...
Don't ask a question if you aren't willing to listen to the answer.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #14 on: October 25, 2015, 12:53:20 pm »
The best thing about understanding and using interrupts is - event driven programming...
(look it up)
Once you've mastered when to use them, your code will make a lot more sense, and functions end up being called (usually) in a more logical order.
I guess it also prompts you to think more carefully about variable scope, factoring and use of functions in general...

Yes indeed.

Make sure whatever you read discusses the merits of finite state machines (FSMs) and their formal representation. Harel Statecharts are a well-known version of that, but by no means the only one.

For inter-processor communication, use messages to convey events and/or Hoare's CSP.

Note - all of that has been proven to work well and reliably over the last 30+ years. Don't reinvent wheels, especially if you make them elliptical!
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline hamdi.tn

  • Frequent Contributor
  • **
  • Posts: 623
  • Country: tn
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #15 on: October 25, 2015, 10:27:49 pm »
I think 8-, 16- and 32-bit variables are written in one operation on a 32-bit MCU, so the write can't be interrupted, and when other code tries to read the variable it will end up with a correct value, not half old / half new. I'm using the STM32 a lot and have been asked to prove that, but I can't find any information in the ST documents.

You may find the answer in an ARM architecture reference manual. If the processor saves the return (from interrupt) address as the next instruction to be executed, then that would indicate that the instruction executing when the interrupt occurred will be completed before branching to execute the interrupt service routine.

If it were possible to get into a situation where a variable is only partially updated by an unfinished instruction execution then the architecture would have to provide a means to detect it. Otherwise it would be chaos. That doesn't happen.

Yep, I did find the answer ... but I don't quite get what they are trying to say  :-DD

http://www.st.com/st-web-ui/static/active/en/resource/technical/document/programming_manual/CD00228163.pdf

- First, I now know that a higher-priority interrupt will always be executed even if the processor is handling another lower-priority interrupt (that explains for me some good unexpected behaviour and some weird unexpected behaviour).
- I understand that there is a mechanism to prevent multiple accesses to memory, so the processor will not execute two operations on the same memory location simultaneously, but this requires some instructions to be executed before the memory operation. The question is: does the compiler add those instructions?
- A memory error exception does exist ... so  :scared:
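My guess (and it is only a guess) is that the mechanism they describe is the LDREX/STREX exclusive-access instruction pair, and that the compiler only adds those instructions when you explicitly ask for atomic operations, for example with C11 atomics, which GCC/Clang typically lower to an LDREX/STREX retry loop on ARMv7-M:
Code: [Select]
#include <stdatomic.h>
#include <stdint.h>

/* Shared between an ISR and the main code. */
static _Atomic uint32_t shared_counter;

/* Read-modify-write done atomically; on ARMv7-M the compiler
 * emits an LDREX/STREX retry loop for this. */
void counter_isr(void)
{
    atomic_fetch_add(&shared_counter, 1u);
}

/* A plain aligned 32-bit load is already a single instruction on
 * Cortex-M, but the atomic type documents the intent. */
uint32_t read_counter(void)
{
    return atomic_load(&shared_counter);
}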
 

Online IanB

  • Super Contributor
  • ***
  • Posts: 11859
  • Country: us
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #16 on: October 25, 2015, 10:41:20 pm »
The short answer is that all computers are interrupt driven, all the time. The more complicated answer is that on a desktop OS the existence of interrupts is largely concealed and you might never know they are happening. But even then, the hardware interrupts are often translated by the operating system (via device drivers) into "software interrupts", or events, and in your program code you write event handlers to deal with them. For example, in a graphical program when someone clicks on a bit of the screen with the mouse button, a "mouse button 1 down" event is generated and some code somewhere reacts to that in an event handler.

So with this in mind, the way you deal with interrupts depends on how close you are to the hardware. If you are programming a raw microcontroller and you don't have an operating system to manage it for you, then most of your code (all of it?) will be interrupt handlers. You will react to timer interrupts for regular events, and I/O interrupts for irregular events. Your code will look like a collection of finite state machines, and each interrupt handler will change the state of one or more of these machines.
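As a rough sketch of that shape of program (made-up names): the interrupt handlers just record events, and the main loop feeds them to the state machines:
Code: [Select]
#include <stdbool.h>

typedef enum { EV_TICK, EV_BUTTON } event_t;

/* One-slot "mailboxes" set by ISRs, consumed by the main loop. */
static volatile bool tick_pending;
static volatile bool button_pending;

void timer_isr(void)  { tick_pending = true; }
void button_isr(void) { button_pending = true; }

/* A tiny state machine advanced purely by events. */
typedef enum { ST_IDLE, ST_RUNNING } state_t;

static void blinker_fsm(event_t ev)
{
    static state_t state = ST_IDLE;

    switch (state) {
    case ST_IDLE:
        if (ev == EV_BUTTON) state = ST_RUNNING;
        break;
    case ST_RUNNING:
        if (ev == EV_TICK)   { /* toggle an LED, count time, ... */ }
        if (ev == EV_BUTTON) state = ST_IDLE;
        break;
    }
}

int main(void)
{
    for (;;) {
        if (tick_pending)   { tick_pending = false;   blinker_fsm(EV_TICK); }
        if (button_pending) { button_pending = false; blinker_fsm(EV_BUTTON); }
        /* a sleep-until-interrupt instruction could go here */
    }
}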
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #17 on: October 25, 2015, 10:55:58 pm »
The short answer is that all computers are interrupt driven, all the time.

False, both historically and now.

High reliability computers, very simple computers, and those with tight or predictable latency requirements tend to be based around polling.

The results of polling may be turned into software events, especially in the form of messages, which are consumed by FSMs.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #18 on: October 25, 2015, 10:59:03 pm »
- First, I now know that a higher-priority interrupt will always be executed even if the processor is handling another lower-priority interrupt (that explains for me some good unexpected behaviour and some weird unexpected behaviour).

Beginners often don't understand the causes of (and ways to avoid) "priority inversion".
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline hamdi.tn

  • Frequent Contributor
  • **
  • Posts: 623
  • Country: tn
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #19 on: October 25, 2015, 11:11:58 pm »
- First, I now know that a higher-priority interrupt will always be executed even if the processor is handling another lower-priority interrupt (that explains for me some good unexpected behaviour and some weird unexpected behaviour).

Beginners often don't understand the causes of (and ways to avoid) "priority inversion".

they do now   ;D
 

Offline BravoVTopic starter

  • Super Contributor
  • ***
  • Posts: 7547
  • Country: 00
  • +++ ATH1
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #20 on: October 26, 2015, 04:42:37 am »
High reliability computers, very simple computers, and those with tight or predictable latency requirements tend to be based around polling.

Noted, never realized this important matter.  :-+

Offline BravoVTopic starter

  • Super Contributor
  • ***
  • Posts: 7547
  • Country: 00
  • +++ ATH1
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #21 on: October 26, 2015, 04:43:29 am »
Are there good examples or scenarios where you just could NOT escape using interrupts in your previous projects? Please share.

Of course, I understand the different scales of complexity, so let's narrow the scope of this topic to microcontrollers, from 8-bit parts up to the ARM "micro"-controller class, not big powerful processors like the beastly Intel CPUs in desktop PCs or high-performance multi-core ARMs.
« Last Edit: October 26, 2015, 05:28:04 am by BravoV »
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #22 on: October 26, 2015, 07:50:16 am »
Are there good examples or scenarios where you just could NOT escape using interrupts in your previous projects?

In theory there are no such examples. Of course, the difference between theory and practice is that in theory there is no difference in practice, but in practice...

To make that more explicit, look at the XMOS processors which - to all intents and purposes - don't use interrupts. The technique they use is to have many small simple cores, with each core dealing with "one" input source and spin-looping until there is a relevant input. That gives minimum latency, predictable latency and, since all other aspects of the cores are predictable (e.g. no caches), predictable response times. They also have decent inter-core comms mechanisms. (And they are cheap, and have cheap devkits - see DigiKey.)

Of course, if you don't have enough processors then you may need to multiplex multiple inputs "into" a core, and the traditional mechanism for that is interrupts. But interrupts come with significant disadvantages w.r.t. design predictability and, on big iron, implementation reliability problems. For the latter just look at the Itanic, though the HPC mob find problems in other machines too.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Brutte

  • Frequent Contributor
  • **
  • Posts: 614
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #23 on: October 26, 2015, 09:18:13 am »
There are events that can wait and events that cannot, with a continuum of constraints spanning in between.
As it is impossible to service n events instantly with p<n cores, the whole idea is to meet the events' deadlines with the limited available resources. The known solution is to pick some sequence at run time and process accordingly. You can do the sorting with the NVIC, you can do the sorting with bubblesort(), or you can even poll() or roundrobin() if all of the events meet their deadlines. The sequence is arbitrary and there are no generic commandments on an „Ultimate Sequence” or „Thou shalt not poll on Sundays”.

Mind you, in a typical real-world scenario it is the event's deadline that has to be met, so it is the ret (return) of the event handler that matters, not the entry. However, the NVIC (Cortex-M) is not able to sort the events/IRQs based on the ret of events, as that would be harder to implement. Sorting based on IRQ triggering time + a priority scheme only approximates the "deadline sorting", but in most cases it does the job (do not ask me about a formal proof). Of course
Code: [Select]
assert(now < deadline);
won't hurt at the ret.
 

Offline mikerj

  • Super Contributor
  • ***
  • Posts: 3237
  • Country: gb
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #24 on: October 26, 2015, 01:28:34 pm »
Interrupts are very important, and many times there's no alternative to using them.

They can always be avoided, but your hardware might make it impractical.

They could be avoided given suitably designed hardware, but given a single core micro and high bandwidth asynchronous peripherals then impractical can easily become impossible. 
 

Offline boriz

  • Newbie
  • Posts: 3
  • Country: dk
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #25 on: October 26, 2015, 03:59:55 pm »
Are there good examples or scenarios where you just could NOT escape using interrupts in your previous projects? Please share.

It probably depends on how heavily the MCU is occupied in your main loop from having
to poll all sorts of inputs.

Personally I use interrupts wherever I can, as I consider polling dirty business, taxing the
MCU when it isn't necessary.

You might be fine not using interrupts at all on your MCU, but then again you might be
able to get more responsiveness and performance out of your device by using them, or even
be able to use a lower-priced MCU to do the same job for less cost.

For high-speed UART you definitely need interrupt-based reception, otherwise you won't be able to
get all the data between the point your polling loop detects reception and the point you read the
data out of the MCU's RX buffer. (I have experienced that myself.)



 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #26 on: October 26, 2015, 04:26:38 pm »
Interrupts are very important, and many times there's no alternative to using them.
They can always be avoided, but your hardware might make it impractical.
They could be avoided given suitably designed hardware, but given a single core micro and high bandwidth asynchronous peripherals then impractical can easily become impossible.

Well, if you choose inadequate hardware then of course it is impractical! But I already stated that.

If the bandwidth is "high" for whatever processor has been chosen, then I question whether interrupts are sufficient. Polling in a tight loop would avoid the interrupt handling overhead, which can be very significant. Alternatively it is sometimes appropriate to have an FPGA deal with the high bandwidth i/o, e.g. see Xilinx Zynq devices.
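To make "polling in a tight loop" concrete, a sketch with made-up names; the core does nothing but watch a status flag, so the response is a few instructions every time, with no context to save or restore:
Code: [Select]
#include <stdbool.h>
#include <stdint.h>

extern bool    data_ready(void);   /* stand-in for reading a peripheral status bit */
extern uint8_t read_data(void);    /* stand-in for reading the data register */
extern void    handle(uint8_t b);

/* Dedicate the core (or one core of many) to this loop. */
void rx_poll_loop(void)
{
    for (;;) {
        while (!data_ready())
            ;                      /* spin until the peripheral has data */
        handle(read_data());
    }
}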

If you want medium latency i/o handling while the processor is also occupied with background tasks, then interrupts are helpful - provided their well-known dysfunctional attributes are dealt with correctly.


There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline KL27x

  • Super Contributor
  • ***
  • Posts: 4099
  • Country: us
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #27 on: October 26, 2015, 09:37:24 pm »
^ It sure sounds like you are suggesting that resorting to the use of interrupts is indicative of using inadequate hardware from the start. Devices that include hardware interrupts enjoy a pretty huge market.

Polling in a tight loop is efficient if you have a tight and predictable loop. It becomes entirely more efficient to poll with a timer interrupt if your code has significant branches and potential delays in those detours. Your code loop may also be too fast. For instance, you may want a lower sampling frequency for, say, an ADC, since each reading draws current.

With interrupts you can decide on your frequency up front, balancing current draw and latency, and adjust it independently of the main code loop, rather than implementing delays or loop counts in the code loop, which then need to be tweaked every time you alter, edit or add any code. I'm not sure what this interrupt overhead is of which you speak. A few extra instructions for the jump and the servicing is all.
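For instance, pacing an ADC from a timer interrupt looks roughly like this (made-up names); the sample rate lives in one place, the timer reload value, independent of whatever the main loop is doing:
Code: [Select]
#include <stdbool.h>
#include <stdint.h>

extern void     adc_start_conversion(void);   /* stand-ins for the ADC driver */
extern uint16_t adc_read_result(void);

volatile uint16_t latest_sample;
volatile bool     sample_ready;

/* Timer ISR fires at the chosen sample rate, e.g. 100 Hz; changing the
 * rate means changing the timer reload, not retuning loop delays. */
void sample_timer_isr(void)
{
    adc_start_conversion();
}

/* ADC end-of-conversion ISR stores the result and raises a flag. */
void adc_done_isr(void)
{
    latest_sample = adc_read_result();
    sample_ready  = true;
}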

I cannot even comprehend NOT using timer interrupts if they are available. It would seem downright stupid. It would be like choosing a multicore device and only using one core, or using a device with a hardware math processor and doing the calculations in code.
« Last Edit: October 26, 2015, 10:13:26 pm by KL27x »
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #28 on: October 26, 2015, 10:11:42 pm »
^ It sure sounds like you are suggesting that resorting to the use of interrupts is indicative of using inadequate hardware from the start. Devices that include hardware interrupts enjoy a pretty huge market.

I have neither suggested nor implied that, any more than the statement "all crows are black birds" implies "all black birds are crows".

The word "inadequate" is used in the context of the statment I made, not in any other sense.

Quote
Polling in a tight loop is efficient if you have a tight and predictable loop. It becomes entirely more efficient to poll with a timer interrupt if your code has significant branches and potential delays in those detours. Your code loop may also be too fast. For instance, you may want a lower sampling frequency for, say, an ADC, since each reading draws current.

"Efficiency" is only one metric. Correctness is another.

Who cares about efficiency if it gives the wrong answer? Or - more corrosively - if you aren't sure whether it will always give a correct answer.

Quote
With interrupts you can decide on your frequency up front, rather than implementing delays in the code loop, which then need to be tweaked every time you alter, edit or add any code.

With a little imagination it can be seen that doesn't need to be the case.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline KL27x

  • Super Contributor
  • ***
  • Posts: 4099
  • Country: us
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #29 on: October 26, 2015, 10:16:50 pm »
Quote
With a little imagination it can be seen that doesn't need to be the case.
I'll bow out. I don't have enough competency in C to go here. I can't even imagine how C manages to work with any degree of code/resource efficiency without using timer interrupts, itself. Take PWMing an output as an example. I can imagine that a compiler could calculate and spit out perfectly calculated and balanced code loops for doing god knows how many things at once, where all the branches are either the same number of instructions or the code loop counter keeps tabs on the branches and adds everything up. But it seems like that would be somewhat complex and code-inefficient, compared to using timers (or, of course, a hardware PWM module).

Quote
"Efficiency" is only one metric. Correctness is another.

Who cares about efficiency if it gives the wrong answer? Or - more corrosively - if you aren't sure whether it will always give a correct answer.
OTOH, coding in assembly, I have absolutely no problem with using interrupts and getting correctness.

It seems like all the talk is interrupts vs polling. And in my mind that doesn't even make sense. Timer interrupts are perfect for polling.
« Last Edit: October 26, 2015, 10:39:07 pm by KL27x »
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #30 on: October 26, 2015, 10:31:18 pm »
Quote
With a little imagination it can be seen that doesn't need to be the case.
I'll bow out. I don't have enough competency in C to go here. I can't even imagine how C manages to work with any degree of code/resource efficiency without using timer interrupts, itself.
My point is language agnostic. It is true whatever language is used, including assembler.

Quote
Quote
"Efficiency" is only one metric. Correctness is another.
Who cares about efficiency if it gives the wrong answer? Or - more corrosively - if you aren't sure whether it will always give a correct answer.
OTOH, coding in assembly, I have absolutely no problem with using interrupts and getting correctness.

Prove it. Seriously. You will find it to be extraordinarily difficult. (Note: you cannot prove very much by testing something - all you can say is that a test didn't reveal a fault)

I suspect you are under the illusion that complex processors (and compilers for poorly specified languages) behave as you expect based on your reading their data sheets. Unfortunately life isn't that simple.

I suggest you have a look at the errata for a modern x86 processor or, if you want a good laugh, at the Itanium. Or some PICs. Consider the effect of having an interrupt start in the middle of an instruction that causes a page fault that causes a TLB miss. Usually it will work as expected.
« Last Edit: October 26, 2015, 10:33:56 pm by tggzzz »
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline KL27x

  • Super Contributor
  • ***
  • Posts: 4099
  • Country: us
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #31 on: October 26, 2015, 10:46:26 pm »
I can totally see how a compiled language can be tricky with interrupts, since interrupts (and the specific problem you mentioned) are often device specific. I imagine compilers screw that pooch, quite frequently.

Quote
Prove it. Seriously. You will find it to be extraordinarily difficult.
Oh, I agree on this. Even if my code works as expected, I know there are bugs in there, somewhere. I have maintained complex code for years. But in assembly, the problems you mention are just another potential source of bugs out of a million. And interrupts make many tasks so much simpler to code, the coding (and the debugging) is much easier by employing them. I am more confident in my code when I'm using the tools that make it way, way simpler, shorter, more efficient, and easier to follow.

FWIW, the datasheet (hopefully, not the errata!) should specify which instructions must not be interrupted, and you should temporarily disable interrupts where needed. Things like this would be the least of my worries when coding a project.

I think TerminalJack has an insightful observation:
Quote
My opinion regarding the use of interrupts is that you can/will make more complex use of them on less complex systems (=lower-end hardware.)  The complexity will be easier to manage on a smaller system.  You also won't likely have an RTOS to leverage.

Maybe the question is loaded, depending on the application/device/language.

On another note, I'm still having a real hard time understanding how an RTOS operates without timers. It seems to me that the device must be using timers in the background, perhaps just hidden from the user.
Yes, you can easily have as many counters as you want in the code loop, but calculating the total delay across the variety of code branches would be ludicrous. The clock is most certainly incrementing counters and prescalers, and storing and acting on that information, very much the way timers are implemented directly by the user on lower-level devices.
« Last Edit: October 26, 2015, 11:35:53 pm by KL27x »
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #32 on: October 26, 2015, 11:34:24 pm »
Thereby demonstrating that you (KL27x) haven't understood what I wrote.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline KL27x

  • Super Contributor
  • ***
  • Posts: 4099
  • Country: us
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #33 on: October 26, 2015, 11:38:42 pm »
Eli5 me.  :popcorn:

Silicon errata happens so interrupts suck?

But your IDE, compiler, libraries are 100% bug-free by default.

I didn't know we were arguing, let alone insulting each other. But if that's the case:
Quote
They can always be avoided, but your hardware might make it impractical.
Quote
Well, if you choose inadequate hardware then of course it is impractical! But I already stated that.
You basically said that choosing hardware with interrupts and having to use them is choosing inadequate hardware. You said this in response to an example of using a UART. Why does the UART generate an interrupt in the first place? Just in case you chose inadequate hardware? And what if power consumption is a priority and you do not want to run at umpteen MHz? Or maybe you want the device to be asleep and still receive communication?

Quote
avoid the interrupt handling overhead
And when you talk about the overhead of an interrupt, I wonder if even you know what you're talking about. The resource overhead for, say, independently PWMing some output pins using a timer interrupt with concurrent, asynchronous processing is almost nothing. Do you imagine that your compiled code loop is going to be anywhere near as efficient as that? It won't be in the same universe. "It's not about efficiency." OK, then. The interrupt code will also have more bandwidth left over for concurrent processing. "But it's not about speed, either. It's about correctness." OK, I missed that memo, and I don't think of "overhead" as correlating with "correctness", nor do I understand why that is now the discussion. But since we're here, which solution do you think will be more "correct" regarding PWM timing accuracy while also dealing with asynchronous events (hence the main code loop potentially taking many different branches)?

I didn't receive any notice that this thread was about life-critical provable code. If I responded to that question, it was inadvertent. The OP asked when/where to use interrupts, and my answer is anywhere it makes my life easier. There are a lot of good examples in this thread.
« Last Edit: October 27, 2015, 01:33:36 am by KL27x »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26891
  • Country: nl
    • NCT Developments
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #34 on: October 27, 2015, 01:33:01 am »
IMHO the most important thing when using interrupts is that their maximum occurrence frequency has to be predictable. From there you can calculate the maximum response times and the maximum stack usage.
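As a toy example of that kind of budgeting (all the numbers are invented for illustration, and the response-time figure ignores things like the longest interrupt-disabled region):
Code: [Select]
#include <stdio.h>

int main(void)
{
    const double f_cpu       = 48e6;     /* CPU clock, Hz                      */
    const double uart_rate   = 11520.0;  /* IRQs/s: 115200 baud, ~10 bits/byte */
    const double uart_cycles = 200.0;    /* worst-case cycles per UART ISR     */
    const double tick_rate   = 1000.0;   /* 1 kHz system tick                  */
    const double tick_cycles = 400.0;    /* worst-case cycles per tick ISR     */

    double load = (uart_rate * uart_cycles + tick_rate * tick_cycles) / f_cpu;
    double worst_start_us = (uart_cycles + tick_cycles) / f_cpu * 1e6;

    printf("CPU load from interrupts: %.1f %%\n", load * 100.0);        /* ~5.6 %   */
    printf("Worst-case start delay for the lowest-priority ISR: %.1f us\n",
           worst_start_us);                                             /* ~12.5 us */
    return 0;
}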
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #35 on: October 27, 2015, 07:34:25 am »
IMHO the most important thing when using interrupts is that their maximum occurrence frequency has to be predictable. From there you can calculate the maximum response times and the maximum stack usage.
Calculating those maxima is impractical on modern embedded processors - and is steadily becoming more impractical.

Consider an ARM A9 such as is found in phones and Zynq FPGAs, which is dual core, out-of-order, superscalar, dynamic-length pipeline (8-12 stages), L1 and L2 caches, shared memory. Add in that one core is probably running a linux and the other may be running an RTOS, and I challenge you to calculate (not measure) the maximum stack depth and interrupt latency time!

Or you can look at Intel's embedded x86 processors, and you will find similar problems.

Of course if the discussion is limited to a subset of embedded systems without those features, then it is easier to calculate the maxima.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #36 on: October 27, 2015, 07:44:26 am »
Eli5 me.  :popcorn:

Silicon errata happens so interrupts suck?

But your IDE, compiler, libraries are 100% bug-free by default.

I didn't know we were arguing, let alone insulting each other. But if that's the case:
Quote
They can always be avoided, but your hardware might make it impractical.
Quote
Well, if you choose inadequate hardware then of course it is impractical! But I already stated that.
You basically said that choosing hardware with interrupts and having to use them is choosing inadequate hardware. You said this in response to an example of using a UART. Why does the UART generate an interrupt in the first place? Just in case you chose inadequate hardware? And what if power consumption is a priority and you do not want to run at umpteen MHz? Or maybe you want the device to be asleep and still receive communication?

Quote
avoid the interrupt handling overhead
And when you talk about the overhead of an interrupt, I wonder if even you know what you're talking about. The resource overhead for, say, independently PWMing some output pins using a timer interrupt with concurrent, asynchronous processing is almost nothing. Do you imagine that your compiled code loop is going to be anywhere near as efficient as that? It won't be in the same universe. "It's not about efficiency." OK, then. The interrupt code will also have more bandwidth left over for concurrent processing. "But it's not about speed, either. It's about correctness." OK, I missed that memo, and I don't think of "overhead" as correlating with "correctness", nor do I understand why that is now the discussion. But since we're here, which solution do you think will be more "correct" regarding PWM timing accuracy while also dealing with asynchronous events (hence the main code loop potentially taking many different branches)?

I didn't receive any notice that this thread was about life-critical provable code. If I responded to that question, it was inadvertent. The OP asked when/where to use interrupts, and my answer is anywhere it makes my life easier. There are a lot of good examples in this thread.

Sigh. If you snip and combine partial quotes then you can "prove" anything - but nobody is impressed with strawman arguments. Especially when you zoom off in directions more-or-less unrelated to the quotes.

I think I have a pretty good idea of interrupt latency in a very wide range of systems; I've been creating embedded systems since the 1970s, using everything from 6800s/Z80s to "big iron" PA-RISC processors. Indeed I successfully persuaded companies to avoid the Itanium/Itanic because of its interrupt latency (how long does it take to save 4000(!) hidden registers?) and gross complexity of the hardware and compilers.

Finally, you made points without important caveats limiting their applicability. Don't complain if I show where your points aren't valid.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline KL27x

  • Super Contributor
  • ***
  • Posts: 4099
  • Country: us
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #37 on: October 27, 2015, 07:59:46 am »
Quote
Sigh. If you snip and combine partial quotes then you can "prove" anything
I quoted you, because I was responding to those statements. You're the one that said these things. I quoted, and provided context. It's all in the thread, if you forgot.
 
Quote
- but nobody is impressed with strawman arguments. Especially when you zoom off in directions more-or-less unrelated to the quotes.
That's the pot calling the kettle black. You did not respond to my posts at all. Well, I take that back. I'm not familiar with the Itanic. If that's the chip you base your interrupt overhead on, well, I'm not sure it is representative in general. And I thought I responded very directly to whatever I quoted.

Quote
Calculating those maxima is impractical on modern embedded processors - and is steadily becoming more impractical.

Consider an ARM A9 such as is found in phones and Zynq FPGAs, which is dual core, out-of-order, superscalar, dynamic-length pipeline (8-12 stages), L1 and L2 caches, shared memory. Add in that one core is probably running a linux and the other may be running an RTOS, and I challenge you to calculate (not measure) the maximum stack depth and interrupt latency time!

Or you can look at Intel's embedded x86 processors, and you will find similar problems.

Of course if the discussion is limited to a subset of embedded systems without those features, then it is easier to calculate the maxima.
Isn't this one of those strawman arguments you were speaking of? No one here ever tried to do such a thing. The post started out discussing AVRs, as well. I don't see any mention of this in your explanations. Do they fit with your overall argument? Or are they, indeed, different?
Quote
I think TerminalJack has an insightful observation:


Quote

My opinion regarding the use of interrupts is that you can/will make more complex use of them on less complex systems (=lower-end hardware.)  The complexity will be easier to manage on a smaller system.  You also won't likely have an RTOS to leverage.


I quoted Terminal Jack, here, even. Thoughts? Maybe you agree with him? You are undoubtedly knowledgeable. I know this because you posted your resume. Is anyone else on the right track?



« Last Edit: October 27, 2015, 08:23:01 am by KL27x »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26891
  • Country: nl
    • NCT Developments
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #38 on: October 27, 2015, 11:17:35 am »
IMHO the most important thing when using interrupts is that their maximum occurrence frequency has to be predictable. From there you can calculate the maximum response times and the maximum stack usage.
Calculating those maxima is impractical on modern embedded processors - and is steadily becoming more impractical.

Consider an ARM A9 such as is found in phones and Zynq FPGAs, which is dual core, out-of-order, superscalar, dynamic-length pipeline (8-12 stages), L1 and L2 caches, shared memory. Add in that one core is probably running a linux and the other may be running an RTOS, and I challenge you to calculate (not measure) the maximum stack depth and interrupt latency time!
Latency is not worst case response time. Worst case response time is the worst case scenario where other interrupts are handled first. In a good ISR the latency is dwarfed by the amount of code which needs to be executed anyway. For a proof that a system keeps working you are after the worst case scenario, so assume all caches and pipelines are empty and need to be filled first. Stack depth is easy as well: use a separate stack for interrupts.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 19468
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #39 on: October 27, 2015, 12:09:34 pm »
IMHO the most important thing when using interrupts is that their maximum occurrence frequency has to be predictable. From there you can calculate the maximum response times and the maximum stack usage.
Calculating those maxima is impractical on modern embedded processors - and is steadily becoming more impractical.

Consider an ARM A9 such as is found in phones and Zynq FPGAs, which is dual core, out-of-order, superscalar, dynamic-length pipeline (8-12 stages), L1 and L2 caches, shared memory. Add in that one core is probably running a linux and the other may be running an RTOS, and I challenge you to calculate (not measure) the maximum stack depth and interrupt latency time!
Latency is not worst case response time.

Que? You have mean latency, mode latency, 95th percentile (etc.) latency, worst case latency. And the "latency" is whatever you define it to be.

Try to calculate any of those with the h/w & s/w I mentioned, and see how far you get!

Quote
Worst case response time is the worst case scenario where other interrupts are handled first. In a good ISR the latency is dwarfed by the amount of code which needs to be executed anyway.

That's entirely processor and application dependent. Note that all the hidden processor state has to be saved, and in Itaniums that is ~4000 64 bit registers plus who knows what else!

Quote
For a proof that a system keeps working you are after the worst case scenario, so assume all caches and pipelines are empty and need to be filled first.

No, that's a test, not a proof, and only a single test at that. Good luck setting up the entire system to ensure your test preconditions - and demonstrating that you have done so.

Quote
Stack depth is easy as well: use a separate stack for interrupts.

That is processor dependent and not changeable by the programmer.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline KL27x

  • Super Contributor
  • ***
  • Posts: 4099
  • Country: us
Re: Using Interrupt ... Where ? When ? Pro & Con ?
« Reply #40 on: October 27, 2015, 08:50:31 pm »
Continually pushing an argument towards the way an RTOS, which is built to handle interrupts in a pre-specified way, or on Itanium and x86 microprocessors primarily used with an OS is... what it is.

An RTOS is another additional layer that is already built to handle interrupts in a certain, limited way. Of course the use of interrupts will be more limited. And being built to handle parallel tasks to begin with, of course using the RTOS tools to handle events is more appealing.

If you have direct control over interrupt handling on your device, interrupts can be very broadly useful. You can implement interrupts for a specific application/task, rather than set up a handling algorithm that works for a general-purpose OS. TerminalJack pretty much covered it, and yet the hand keeps trying to talk to the face.
Quote
Of course if the discussion is limited to a subset of embedded systems...

is necessary context to make some of the posts on this thread sensible. Mostly yours.

OP: can we keep this limited to 8 bit microcontrollers through single core ARM, for instance?
- Well, sure, you don't need interrupts if you use a 32 core XMOS processor. And if you need to use interrupts you chose insufficient hardware.

The truth is that you're comparing the XMOS architecture to a typical RTOS. (Even among RTOSes, there are different ways of managing multithreading and interrupts.) The XMOS architecture and extended language can result in a shorter and more predictable latency than a given RTOS running on a single- or multi-core processor. This is because an RTOS on a multicore processor is doing the same thing, only it's fixed in one method of doing so. The XMOS extended language gives users access to spin-lock/loop and handle interrupts as they see fit for a given task (or tasks). So the interrupt latency can be shorter and more predictable in this case because you're not fixed to a specific RTOS, which may be written with different priorities. Even (especially) with a single core, use of interrupts can be done with extremely short latency and high predictability... depending on the application and/or use (or non-use) of an OS. I imagine that in many specific applications, a single-core micro using interrupts can be faster and more predictable even than an XMOS, given the same processor speed. If your app doesn't make good use of multiple cores, maybe you are choosing over-adequate hardware. Is XMOS better at multithreading than any given RTOS? More flexible, it sure seems. But some apps don't benefit from multithreading. For some hardware applications, interrupts are ideal.

I'll put it this way. For certain applications, using a certain, adequate single-core device, one MIGHT be able to accomplish said task satisfactorily using code loop polling. And to achieve this, you MIGHT need to be a genius or a masochist. Whereas if you cannot do said task better (lower and more predictable latency) with an interrupt, you would have to be a moron.
« Last Edit: October 29, 2015, 05:34:16 am by KL27x »
 


Share me

Digg  Facebook  SlashDot  Delicious  Technorati  Twitter  Google  Yahoo
Smf