Author Topic: FPGA Vs µC  (Read 20583 times)


Offline hamdi.tnTopic starter

  • Frequent Contributor
  • **
  • Posts: 623
  • Country: tn
FPGA Vs µC
« on: October 21, 2015, 08:58:15 am »
Hi, a basic question just popped into my mind, it happens sometimes  :-DD

When system safety matters, I mean for the kind of application where a failure is life-threatening, which technology is better for easier software implementation and safer execution?

-My knowledge of FPGAs is pretty basic, but what I know is that an FPGA is much more like hardwired logic configured to execute something, while a µC executes instructions in an order defined by the software. Second point: an FPGA is capable of multiple independent tasks, which offers the possibility of keeping an eye on every crucial piece of data while doing something else.
So would it be true to think that FPGAs are much easier for such problems?

-Facing an EMP, which technology is more reliable, assuming proper PCB design and shielding?

-Since an FPGA design is basically a hardware implementation, how easy would it be to turn that same hardware into an ASIC, and economically speaking, at what point do ASICs become interesting?

 

Offline Gall

  • Frequent Contributor
  • **
  • Posts: 310
  • Country: ru
Re: FPGA Vs µC
« Reply #1 on: October 21, 2015, 09:20:10 am »
Both - or none.

It depends upon how well the design is verified, not how it is implemented. A µC programmed in Ada by a trained person is very reliable (military-grade). A µC programmed in assembler is not reliable. C99 or even C++ is better, provided that you have a skilled developer (it's virtually impossible to find a skilled C++ one), but still not as good as Ada.

A Verilog or VHDL FPGA design is somewhere between C99 and Ada. It is quite easy to verify. On the other hand, it will probably be larger, which makes it harder to maintain (and increases the probability of verifying it incorrectly).

Do not trust tests. Only a formal proof guarantees that your product works well. A formal proof is easily doable in Ada, VHDL or Verilog, a bit harder in C99 and good C++11 code, and hardly possible in poorly-written C++ or C89 code.
The difficult we do today; the impossible takes a little longer.
 

Offline Gall

  • Frequent Contributor
  • **
  • Posts: 310
  • Country: ru
Re: FPGA Vs µC
« Reply #2 on: October 21, 2015, 09:27:28 am »
Keep in mind that multitasking is a complex problem in its own right. When multiple processes are executed in parallel (it does not matter how), there is a BIG problem of communication. Formal verification of such communication in the general case is proven to be impossible (the halting problem in computability theory) but is still doable in many practical cases. It is however so complex that virtually nobody gets it right (and nobody cares). Both MCUs and FPGAs are prone to this. Before introducing any multitasking, invent a way to make a formal proof of your algorithm.
The difficult we do today; the impossible takes a little longer.
 

Offline asgard20032

  • Regular Contributor
  • *
  • Posts: 184
Re: FPGA Vs µC
« Reply #3 on: October 21, 2015, 09:56:56 am »
Quote
Both - or none.

It depends upon how well the design is verified, not how it is implemented. A µC programmed in Ada by a trained person is very reliable (military-grade). A µC programmed in assembler is not reliable. C99 or even C++ is better, provided that you have a skilled developer (it's virtually impossible to find a skilled C++ one), but still not as good as Ada.

A Verilog or VHDL FPGA design is somewhere between C99 and Ada. It is quite easy to verify. On the other hand, it will probably be larger, which makes it harder to maintain (and increases the probability of verifying it incorrectly).

Do not trust tests. Only a formal proof guarantees that your product works well. A formal proof is easily doable in Ada, VHDL or Verilog, a bit harder in C99 and good C++11 code, and hardly possible in poorly-written C++ or C89 code.

May we get an example of such a proof in those languages? Formal proof isn't something people talk about often, so most of us are not familiar with it.

Also, if we want to program in Ada, which MCU development environments allow Ada programming? (I am not talking about compiling my own gcc toolchain with Ada enabled.)
« Last Edit: October 21, 2015, 10:02:09 am by asgard20032 »
 

Offline Kalvin

  • Super Contributor
  • ***
  • Posts: 2145
  • Country: fi
  • Embedded SW/HW.
Re: FPGA Vs µC
« Reply #4 on: October 21, 2015, 09:59:15 am »
Comments on using an MCU: Don't use interrupts or multitasking. Instead use polling, simple state machines and a "run to completion" tasker or something similar which is predictable. For example, the book "Patterns for Time-Triggered Embedded Systems: Building Reliable Applications with the 8051 Family of Microcontrollers" is a good starting point, and the concepts are easily portable to other microcontrollers. And use the watchdog properly.
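A minimal sketch of that pattern in C99: a table-driven, run-to-completion tasker where every task is a short non-blocking function, and the watchdog is kicked only after a complete pass. The task names and `wdt_kick()` are placeholders, not any real vendor API:

```c
#include <stddef.h>
#include <stdint.h>

/* Placeholder for the platform watchdog refresh (assumption:
 * your MCU vendor provides an equivalent). */
static void wdt_kick(void) { /* e.g. refresh the hardware watchdog */ }

typedef void (*task_fn)(void);

/* Each task is a short, non-blocking function that runs to
 * completion; period is in tick units. */
typedef struct {
    task_fn  run;
    uint32_t period;
    uint32_t countdown;
} task_t;

/* Illustrative task bodies (empty stubs here). */
static void safety_check(void) { /* the mission-critical check */ }
static void poll_adc(void)     { /* read ADC, feed state machine */ }
static void poll_uart(void)    { /* service the PC link, no blocking */ }

static task_t tasks[] = {
    { safety_check, 1,  1  },   /* every tick      */
    { poll_adc,     10, 10 },   /* every 10th tick */
    { poll_uart,    50, 50 },   /* every 50th tick */
};

/* Called once per timer tick from the main loop (polled flag, not
 * from an ISR).  Each due task runs to completion, so there is no
 * preemption to reason about, and the watchdog is refreshed only
 * after every task in the pass has finished. */
void scheduler_tick(void)
{
    for (size_t i = 0; i < sizeof tasks / sizeof tasks[0]; ++i) {
        if (--tasks[i].countdown == 0) {
            tasks[i].countdown = tasks[i].period;
            tasks[i].run();
        }
    }
    wdt_kick();
}
```

The worst-case pass time is simply the sum of the worst-case times of all tasks, which is what makes this style easy to analyze.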
 

Offline Kalvin

  • Super Contributor
  • ***
  • Posts: 2145
  • Country: fi
  • Embedded SW/HW.
Re: FPGA Vs µC
« Reply #5 on: October 21, 2015, 10:02:42 am »
Quote
Both - or none.

It depends upon how well the design is verified, not how it is implemented. A µC programmed in Ada by a trained person is very reliable (military-grade). A µC programmed in assembler is not reliable. C99 or even C++ is better, provided that you have a skilled developer (it's virtually impossible to find a skilled C++ one), but still not as good as Ada.

A Verilog or VHDL FPGA design is somewhere between C99 and Ada. It is quite easy to verify. On the other hand, it will probably be larger, which makes it harder to maintain (and increases the probability of verifying it incorrectly).

Do not trust tests. Only a formal proof guarantees that your product works well. A formal proof is easily doable in Ada, VHDL or Verilog, a bit harder in C99 and good C++11 code, and hardly possible in poorly-written C++ or C89 code.
Quote
May we get an example of such a proof in those languages? Formal proof isn't something people talk about often, so most of us are not familiar with it.

Also, if we want to program in Ada, which MCU development environments allow Ada programming?

Ada's subset SPARK is used for formal validation and verification. An Ada toolchain for bare-metal embedded programming is available for free at least for AVR and ARM. Of course there are others if you have the budget to pay for them. There might also be other freely available ports, but I haven't been looking at those.
« Last Edit: October 21, 2015, 10:08:09 am by Kalvin »
 

Offline hamdi.tnTopic starter

  • Frequent Contributor
  • **
  • Posts: 623
  • Country: tn
Re: FPGA Vs µC
« Reply #6 on: October 21, 2015, 10:23:33 am »
Quote
Comments on using an MCU: Don't use interrupts or multitasking. Instead use polling, simple state machines and a "run to completion" tasker or something similar which is predictable. For example, the book "Patterns for Time-Triggered Embedded Systems: Building Reliable Applications with the 8051 Family of Microcontrollers" is a good starting point, and the concepts are easily portable to other microcontrollers. And use the watchdog properly.

I think 'simple' is relative. Everyone I talked to about this basically says the same thing you just said: "polling + state machines".

Except when there are a lot of micro-tasks that should run at practically the same time (managing communication with a PC while communicating with a display MCU, while running the ADC, while running the main task of the whole thing), I guess a state machine is a bit restrictive for managing all that in a safe way.
 

Offline Kalvin

  • Super Contributor
  • ***
  • Posts: 2145
  • Country: fi
  • Embedded SW/HW.
Re: FPGA Vs µC
« Reply #7 on: October 21, 2015, 10:38:17 am »
Quote
Comments on using an MCU: Don't use interrupts or multitasking. Instead use polling, simple state machines and a "run to completion" tasker or something similar which is predictable. For example, the book "Patterns for Time-Triggered Embedded Systems: Building Reliable Applications with the 8051 Family of Microcontrollers" is a good starting point, and the concepts are easily portable to other microcontrollers. And use the watchdog properly.

I think 'simple' is relative. Everyone I talked to about this basically says the same thing you just said: "polling + state machines".

Except when there are a lot of micro-tasks that should run at practically the same time (managing communication with a PC while communicating with a display MCU, while running the ADC, while running the main task of the whole thing), I guess a state machine is a bit restrictive for managing all that in a safe way.

Split the design in half: a) the mission-critical part running on MCU A and b) the non-critical part running on another MCU B. The two dedicated processors will use a simple interprocessor communication protocol between them.
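An illustration of how simple such an interprocessor protocol can be kept: a fixed-length frame with a start byte and an XOR checksum, so the critical side can validate or silently drop each message in a single non-blocking poll. The frame layout and names here are invented for the sketch, not a standard:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

#define FRAME_SOF 0xA5  /* start-of-frame marker (arbitrary) */

typedef struct {
    uint8_t  id;     /* message type     */
    uint16_t value;  /* one payload word */
} msg_t;

/* XOR checksum over n bytes. */
static uint8_t checksum(const uint8_t *p, size_t n)
{
    uint8_t sum = 0;
    while (n--)
        sum ^= *p++;
    return sum;
}

/* Encode into a fixed 5-byte frame: SOF, id, value hi, value lo,
 * checksum.  No blocking, no dynamic memory. */
void msg_encode(const msg_t *m, uint8_t out[5])
{
    out[0] = FRAME_SOF;
    out[1] = m->id;
    out[2] = (uint8_t)(m->value >> 8);
    out[3] = (uint8_t)(m->value & 0xFF);
    out[4] = checksum(out, 4);
}

/* Decode; returns false for any malformed frame.  The critical
 * side just drops bad frames and carries on autonomously. */
bool msg_decode(const uint8_t in[5], msg_t *m)
{
    if (in[0] != FRAME_SOF)
        return false;
    if (checksum(in, 4) != in[4])
        return false;
    m->id    = in[1];
    m->value = (uint16_t)(((uint16_t)in[2] << 8) | in[3]);
    return true;
}
```

Because every frame stands alone, a dropped or corrupted message never leaves the link in a stuck state; the receiver simply resynchronizes on the next SOF byte.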
 

Offline daqq

  • Super Contributor
  • ***
  • Posts: 2302
  • Country: sk
    • My site
Re: FPGA Vs µC
« Reply #8 on: October 21, 2015, 10:38:26 am »
Quote
-Facing an EMP, which technology is more reliable, assuming proper PCB design and shielding?
Both are just as reliable in this, I'd say. If your question is how they will react when their inputs are nonsensical (such as a sensor malfunction), that depends on how you design your software/logic. You can screw up an FPGA-based design just as efficiently as a microcontroller-based design. For critical stuff follow the simple rule: simpler is better.
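For the nonsensical-input case, the usual defensive pattern is a plausibility check on every reading, so the control logic can fall back to a safe state instead of acting on garbage. A sketch in C99, with made-up limits for an imaginary temperature sensor:

```c
#include <stdbool.h>
#include <stdlib.h>

/* Illustrative limits: tune per sensor; these numbers are invented. */
#define T_MIN_C      (-40)
#define T_MAX_C      125
#define T_MAX_STEP_C 5      /* max credible change per sample */

typedef struct {
    int  last_valid;  /* last accepted reading    */
    bool have_last;   /* false until first accept */
} sensor_filter_t;

/* Accepts a reading only if it is in range and does not jump
 * faster than the physics allows; rejected readings leave the
 * filter state untouched so the caller can count failures and
 * enter a safe state after too many in a row. */
bool sensor_accept(sensor_filter_t *f, int reading)
{
    if (reading < T_MIN_C || reading > T_MAX_C)
        return false;
    if (f->have_last && abs(reading - f->last_valid) > T_MAX_STEP_C)
        return false;
    f->last_valid = reading;
    f->have_last  = true;
    return true;
}
```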

Quote
-Since an FPGA design is basically a hardware implementation, how easy would it be to turn that same hardware into an ASIC, and economically speaking, at what point do ASICs become interesting?
Varies wildly - NRE (non-recurring engineering) costs for a full-custom ASIC, depending on the technology used, can range from several tens of thousands (simple stuff, well-understood big node sizes) to several millions (high-end, top-of-the-line, bleeding-edge tech/process). Add to that the price of software tools and labour (semiconductor design is not a trivial task).

There are several areas between a full-custom ASIC and a general-purpose FPGA though - you have ICs with a lot of simple blocks that you connect by means of just one mask set (structured ASICs, e.g. Altera HardCopy)... see https://www.altera.com/content/dam/altera-www/global/en_US/pdfs/literature/po/ss-hcasics.pdf . They're non-volatile and, above a certain quantity, cheaper than an FPGA.

It should be noted that FPGAs are not a particularly non-volatile hardware implementation - the internal structure is fixed and adjusted by loads of multiplexers and switches that are controlled by a configuration loaded on power-up. As such, if you get a really nasty power spike (you mean EMP?) you will get a reset state (if not worse), just as you would with a microcontroller.
Believe it or not, pointy haired people do exist!
+++Divide By Cucumber Error. Please Reinstall Universe And Reboot +++
 

Offline hamdi.tnTopic starter

  • Frequent Contributor
  • **
  • Posts: 623
  • Country: tn
Re: FPGA Vs µC
« Reply #9 on: October 21, 2015, 10:53:58 am »
Quote
Comments on using an MCU: Don't use interrupts or multitasking. Instead use polling, simple state machines and a "run to completion" tasker or something similar which is predictable. For example, the book "Patterns for Time-Triggered Embedded Systems: Building Reliable Applications with the 8051 Family of Microcontrollers" is a good starting point, and the concepts are easily portable to other microcontrollers. And use the watchdog properly.

I think 'simple' is relative. Everyone I talked to about this basically says the same thing you just said: "polling + state machines".

Except when there are a lot of micro-tasks that should run at practically the same time (managing communication with a PC while communicating with a display MCU, while running the ADC, while running the main task of the whole thing), I guess a state machine is a bit restrictive for managing all that in a safe way.

Split the design in half: a) the mission-critical part running on MCU A and b) the non-critical part running on another MCU B. The two dedicated processors will use a simple interprocessor communication protocol between them.

Good, but it comes back to what Gall said:
Quote
Keep in mind that multitasking is a complex problem in its own right. When multiple processes are executed in parallel (it does not matter how), there is a BIG problem of communication. Formal verification of such communication in the general case is proven to be impossible (the halting problem in computability theory) but is still doable in many practical cases. It is however so complex that virtually nobody gets it right (and nobody cares). Both MCUs and FPGAs are prone to this. Before introducing any multitasking, invent a way to make a formal proof of your algorithm.

I find it easier to split it in half on one micro, in such a way that the non-critical stuff is handled by the available hardware (DMA & DMA interrupts are helpful with that) and the critical stuff is done by polling.
Communication is a problem in itself that you must manage to make reliable and robust, in such a way that both micros manage to resync dropped comms and un-hang blocked com lines (like happens with I2C), and I think it's easier to just check that every process is giving valid data to the other processes while executing on the same chip.


It should be noted that FPGAs are not a particularly non-volatile hardware implementation - the internal structure is fixed and adjusted by loads of multiplexers and switches that are controlled by a configuration loaded on power-up. As such, if you get a really nasty power spike (you mean EMP?) you will get a reset state (if not worse), just as you would with a microcontroller.

True, I totally forgot that FPGAs depend on the configuration they load at boot.
 

Offline daqq

  • Super Contributor
  • ***
  • Posts: 2302
  • Country: sk
    • My site
Re: FPGA Vs µC
« Reply #10 on: October 21, 2015, 11:00:19 am »
Quote
True, I totally forgot that FPGAs depend on the configuration they load at boot.
If that bothers you, CPLDs are more fixed - they have non-volatile memory inside, but are far less complex. But at the end of the day a power spike will still reset your device (whatever it is) to an initial state - the connections might be there, but the internal state of the RAMs, flip-flops, etc. might get reset.
Believe it or not, pointy haired people do exist!
+++Divide By Cucumber Error. Please Reinstall Universe And Reboot +++
 

Offline Gall

  • Frequent Contributor
  • **
  • Posts: 310
  • Country: ru
Re: FPGA Vs µC
« Reply #11 on: October 21, 2015, 12:09:09 pm »
Quote
May we get an example of such a proof in those languages? Formal proof isn't something people talk about often, so most of us are not familiar with it.

I'll give it a try.

First, let's prove that our algorithm is correct. An example: how do we prove quicksort?

An array is sorted if and only if each element is not smaller than any of the preceding elements (and not larger than any of the following elements, which is an obvious consequence).

Let's prove that quicksort gives a sorted array on its output. It is obvious that an array of only one element is always sorted. On each step of quicksort we split the array so that the first part contains only elements not larger than the pivot and the second has only larger elements. That means any element of the second subarray is not smaller than any element of the first one. Since it is also not smaller than the pivot, this operation does not affect the "sorted" property: if both subarrays are sorted, so is the result. By mathematical induction, this means the algorithm produces a sorted array for any number of elements. Proven.

Now let's prove that our implementation is really the correct implementation of the proven algorithm. For illustrative purposes, I make a very inefficient C99 implementation:
Code: [Select]
void quicksort(int data[], size_t size)
{
    if (size < 2)
        return;
    const int pivot = data[0];

    int left[size];
    size_t left_size = 0;
    int right[size];
    size_t right_size = 0;
   
    for (size_t i = 1; i < size; ++i)
    {
        if (data[i] <= pivot)
        {
            left[left_size++] = data[i];
        }
        else
        {
            right[right_size++] = data[i];
        }
    }

    quicksort(left, left_size);
    quicksort(right, right_size);

    for (size_t i = 0; i < left_size; ++i)
        data[i] = left[i];
    data[left_size] = pivot;
    for (size_t i = 0; i < right_size; ++i)
        data[i + left_size + 1] = right[i];
}

I wrote it so that there is some room for errors, but we can still prove that it is correct (to some extent, see below).

If size < 2, the array has only one or zero elements, so it is already sorted. Do nothing. Ok.

We choose the very first element as the pivot. Ok.

We create two arrays and their sizes. This is the place where our program can fail if there is not enough stack memory. We have no guard against that here, and we have to prove elsewhere that our array size is small enough to fit our stack. In the worst case we'll need at least size*(size+1)*sizeof(int) of stack space plus a recursion depth of size. Let's assume here that we have a proof that this much stack space is available at the point of the call. Ok.

Then we loop over the rest of the elements, starting at the second one. This loop guarantees that each element goes to the left or to the right (but not both), and all elements not larger than the pivot go to the left. This means left and right will have the property required by our theoretical proof, and the sum of their sizes will be exactly size-1. Ok.

Call quicksort twice. Ok.

Copy elements from the left. Such copying takes each element exactly once and preserves their number and order. Ok.

Copy the pivot. The pivot should go right after the left array. Ok.

Copy the right side. Here we should check the array indices carefully. We can see that if i = 0, the element goes right after the pivot, and the index of the last element is (right_size - 1) + left_size + 1 = size - 1, which is correct too. Ok.

Proven, with the exception of the stack size problem.

A general rule here is: "I do not write what I can't prove." Not every piece of code can be proven in such a way. This is a matter of human discipline. No technology is completely fool-proof.

Functional languages like Lisp, and languages like Ada, are designed to make such formal proofs as easy as possible. C, C++ and Java are not. Languages like Java and (sometimes) C++ offer too many possibilities, so the proof is hardly possible in many cases. Limited language functionality is good for this purpose, as it limits the number of possibilities one has to consider during the proof.
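Even in C one can at least make the proof obligations explicit, Hoare-style, as executable assertions. This is only a hand-made imitation of what SPARK/Ada contracts express natively, but it forces the same discipline of stating what must hold:

```c
#include <assert.h>
#include <stddef.h>

/* Maximum of a non-empty array.  The comments state the proof
 * obligations; the asserts make the pre- and postcondition
 * executable (they vanish under -DNDEBUG). */
int array_max(const int *a, size_t n)
{
    assert(n > 0);                /* precondition: non-empty */
    int max = a[0];
    for (size_t i = 1; i < n; ++i) {
        /* invariant: max is the maximum of a[0..i-1] */
        if (a[i] > max)
            max = a[i];
    }
    /* postcondition: max >= every element */
    for (size_t j = 0; j < n; ++j)
        assert(max >= a[j]);
    return max;
}
```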


Quote
Also, if we want to program in Ada, which mcu development environment allow Ada programming? (I am not talking about compiling my own gcc tool-chain with ada enable)
Sorry, the only Ada toolchain I know of is exactly that gcc. Its Ada compiler, GNAT, is the one used by the Pentagon.
The difficult we do today; the impossible takes a little longer.
 

Offline Gall

  • Frequent Contributor
  • **
  • Posts: 310
  • Country: ru
Re: FPGA Vs µC
« Reply #12 on: October 21, 2015, 12:11:16 pm »
Quote
Comments on using an MCU: Don't use interrupts or multitasking. Instead use polling, simple state machines and a "run to completion" tasker or something similar which is predictable.
Right! This is the obvious way to make the code simple enough to be proven. Multitasking and interrupts can happen anywhere in the code, which makes the proof all but impossible.
The difficult we do today; the impossible takes a little longer.
 

Offline Ice-Tea

  • Super Contributor
  • ***
  • Posts: 3070
  • Country: be
    • Freelance Hardware Engineer
Re: FPGA Vs µC
« Reply #13 on: October 21, 2015, 01:14:48 pm »
Quote
Comments on using an MCU: Don't use interrupts or multitasking. Instead use polling, simple state machines and a "run to completion" tasker or something similar which is predictable. For example, the book "Patterns for Time-Triggered Embedded Systems: Building Reliable Applications with the 8051 Family of Microcontrollers" is a good starting point, and the concepts are easily portable to other microcontrollers. And use the watchdog properly.

Then, of course, some time may pass between an event and the reaction to that event. That time may be anywhere from almost nothing to as long as the entire loop takes to process. This is something an FPGA has no trouble with: it evaluates everything, all the time.

Offline Kalvin

  • Super Contributor
  • ***
  • Posts: 2145
  • Country: fi
  • Embedded SW/HW.
Re: FPGA Vs µC
« Reply #14 on: October 21, 2015, 01:21:15 pm »
Quote
Comments on using an MCU: Don't use interrupts or multitasking. Instead use polling, simple state machines and a "run to completion" tasker or something similar which is predictable. For example, the book "Patterns for Time-Triggered Embedded Systems: Building Reliable Applications with the 8051 Family of Microcontrollers" is a good starting point, and the concepts are easily portable to other microcontrollers. And use the watchdog properly.

Then, of course, some time may pass between an event and the reaction to that event. That time may be anywhere from almost nothing to as long as the entire loop takes to process. This is something an FPGA has no trouble with: it evaluates everything, all the time.

That is why one should always determine the real-time requirements (soft and hard) and also determine the maximum loop processing time analytically and/or using an instruction simulator. For non-critical systems even an oscilloscope is enough for determining typical loop processing time and event response time.
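One detail worth getting right when sampling a free-running counter around the loop (for example the DWT cycle counter on a Cortex-M): plain unsigned subtraction gives the correct elapsed count even across a counter wraparound. A small, host-testable sketch:

```c
#include <stdint.h>

/* Elapsed ticks between two samples of a free-running 32-bit
 * counter.  Modulo-2^32 unsigned arithmetic is correct even if
 * the counter wrapped once between the samples. */
uint32_t elapsed_ticks(uint32_t start, uint32_t end)
{
    return end - start;
}

/* On a Cortex-M part one would sample DWT->CYCCNT (CMSIS name)
 * around the main loop and keep the worst case, e.g.:
 *
 *   uint32_t t0 = DWT->CYCCNT;
 *   run_main_loop_once();
 *   uint32_t dt = elapsed_ticks(t0, DWT->CYCCNT);
 *   if (dt > worst) worst = dt;
 */
```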
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26906
  • Country: nl
    • NCT Developments
Re: FPGA Vs µC
« Reply #15 on: October 21, 2015, 01:47:57 pm »
Quote
Split the design in half: a) the mission-critical part running on MCU A and b) the non-critical part running on another MCU B. The two dedicated processors will use a simple interprocessor communication protocol between them.
That is a recipe for disaster! Instead of one microcontroller which can lock up, you suddenly have 2 microcontrollers which can lock up, not to mention the asynchronous communication between them (2 microcontrollers = running 2 parallel asynchronous tasks).
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Kalvin

  • Super Contributor
  • ***
  • Posts: 2145
  • Country: fi
  • Embedded SW/HW.
Re: FPGA Vs µC
« Reply #16 on: October 21, 2015, 02:01:14 pm »
Quote
Split the design in half: a) the mission-critical part running on MCU A and b) the non-critical part running on another MCU B. The two dedicated processors will use a simple interprocessor communication protocol between them.
Quote
That is a recipe for disaster! Instead of one microcontroller which can lock up, you suddenly have 2 microcontrollers which can lock up, not to mention the asynchronous communication between them (2 microcontrollers = running 2 parallel asynchronous tasks).
Really? Are you quite sure about what you are saying? The critical part is running on an MCU without any fancy OS stuff, performing its mission-critical or life-critical function autonomously. It will send data to the other MCU running the non-critical part of the application. It will receive (via a non-blocking poll) data from the non-critical part for settings and other configuration stuff. The system is designed so that the critical part of the application will perform its task even if the non-critical part is not up and running (i.e. it has crashed, it has missed its soft real-time deadline, etc.). The communication protocol is designed to have no interlocks. However, implemented on the same processor, you will be in trouble if the non-critical part of the application crashes, performs some goofy stuff, or hangs due to a programming error and the watchdog restarts the whole system... Think again.
 

Offline Gall

  • Frequent Contributor
  • **
  • Posts: 310
  • Country: ru
Re: FPGA Vs µC
« Reply #17 on: October 21, 2015, 02:05:40 pm »
Quote
Split the design in half: a) the mission-critical part running on MCU A and b) the non-critical part running on another MCU B. The two dedicated processors will use a simple interprocessor communication protocol between them.
Quote
That is a recipe for disaster! Instead of one microcontroller which can lock up, you suddenly have 2 microcontrollers which can lock up, not to mention the asynchronous communication between them (2 microcontrollers = running 2 parallel asynchronous tasks).
Exactly.

There is NO RECIPE that makes it "just work"; there is only one rule: keep it simple enough for formal verification. It does not matter how you do it, as long as you can verify it.

In most cases, multitasking, interrupt handling and multiple-MCU solutions are low-hanging fruit. They all give a false sense of simplicity while being in fact not simple at all. For example, using interrupts to achieve a "faster response" will probably lead to unpredictable response time: really fast in 99.9% of cases but unacceptably slow in 0.1% of cases. That is a 0.1% probability that your device will fail, which is unacceptable for a good device. It is better to respond more slowly but with 100% probability (Ok, algorithmically 100% probability, since there is always a non-zero probability that your device will be hit by an asteroid).

The goal is to achieve 100% reliability of the software source code itself, so that all failures are essentially hardware or compiler failures. Both modern compilers and hardware are very reliable.
The difficult we do today; the impossible takes a little longer.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26906
  • Country: nl
    • NCT Developments
Re: FPGA Vs µC
« Reply #18 on: October 21, 2015, 02:09:44 pm »
Quote
Split the design in half: a) the mission-critical part running on MCU A and b) the non-critical part running on another MCU B. The two dedicated processors will use a simple interprocessor communication protocol between them.
Quote
That is a recipe for disaster! Instead of one microcontroller which can lock up, you suddenly have 2 microcontrollers which can lock up, not to mention the asynchronous communication between them (2 microcontrollers = running 2 parallel asynchronous tasks).
Quote
Really? Are you quite sure about what you are saying? The critical part is running on an MCU without any fancy OS stuff, performing its mission-critical or life-critical function autonomously. It will send data to the other MCU running the non-critical part of the application.
Try to build such a system and you'll see why it is a bad idea. In the end the functions on both processors will be much more intertwined than you think/want at first glance. The start of the slippery slope: chances are both processors will need to do something special if one of them fails.
It can only work if you can make a very clean break between the two microcontrollers and put them in separate boxes with an RS232 or RS485 interface (cable) between them.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Gall

  • Frequent Contributor
  • **
  • Posts: 310
  • Country: ru
Re: FPGA Vs µC
« Reply #19 on: October 21, 2015, 02:14:53 pm »
Quote
Really? Are you quite sure about what you are saying? The critical part is running on an MCU without any fancy OS stuff, performing its mission-critical or life-critical function autonomously.
The problem with such an approach is that your fancy OS may send wrong commands to the critical MCU or display the wrong state on the screen. The whole chain is no stronger than its weakest link.

In a mission-critical application, the user interface is as critical as the core functionality. The communication channel between the device and the operator shall be reliable. If you can't make a reliable GUI, better use LEDs and hardware buttons instead.
The difficult we do today; the impossible takes a little longer.
 

Offline Kalvin

  • Super Contributor
  • ***
  • Posts: 2145
  • Country: fi
  • Embedded SW/HW.
Re: FPGA Vs µC
« Reply #20 on: October 21, 2015, 02:19:55 pm »
Quote
Really? Are you quite sure about what you are saying? The critical part is running on an MCU without any fancy OS stuff, performing its mission-critical or life-critical function autonomously.
Quote
The problem with such an approach is that your fancy OS may send wrong commands to the critical MCU or display the wrong state on the screen. The whole chain is no stronger than its weakest link.

In a mission-critical application, the user interface is as critical as the core functionality. The communication channel between the device and the operator shall be reliable. If you can't make a reliable GUI, better use LEDs and hardware buttons instead.

Take a look inside your car. There is a quite clear separation between the mission-critical MCU systems and the non-critical MCU systems. The same principle applies, for example, to an Airbus, which can be flown even if the instrument panel freezes, blanks or reboots.
« Last Edit: October 21, 2015, 02:57:13 pm by Kalvin »
 

Offline dmills

  • Super Contributor
  • ***
  • Posts: 2093
  • Country: gb
Re: FPGA Vs µC
« Reply #21 on: October 21, 2015, 03:44:58 pm »
The famous counterpoint was a cockup involving a certain radiotherapy machine (the UI could get fatally (literally) out of sync with the physical state of the hardware)...

The Therac-25 should be a cautionary tale for every embedded systems engineer.

High SIL level systems are just HARD.

It is telling that the railways over here will not allow any use of interrupts in the code running the signalling systems (and that the Victorian-era railway had an accident caused by a race condition of a few tens of milliseconds in a strictly mechanical points interlock system).

Regards, Dan.
 

Offline Gall

  • Frequent Contributor
  • **
  • Posts: 310
  • Country: ru
Re: FPGA Vs µC
« Reply #22 on: October 21, 2015, 03:59:47 pm »
Another example of communication failure via the human-machine interface was the crash of Tatarstan Airlines Flight 363 in Kazan. The actual cause of the crash was misinterpretation of the displays by the pilot. A software failure in the displays would have resulted in exactly the same thing.

And that's why cars have steering and brakes controlled mechanically/hydraulically, not electronically.
The difficult we do today; the impossible takes a little longer.
 

Offline hamdi.tnTopic starter

  • Frequent Contributor
  • **
  • Posts: 623
  • Country: tn
Re: FPGA Vs µC
« Reply #23 on: October 21, 2015, 05:16:12 pm »
Quote
Split the design in half: a) the mission-critical part running on MCU A and b) the non-critical part running on another MCU B. The two dedicated processors will use a simple interprocessor communication protocol between them.
Quote
That is a recipe for disaster! Instead of one microcontroller which can lock up, you suddenly have 2 microcontrollers which can lock up, not to mention the asynchronous communication between them (2 microcontrollers = running 2 parallel asynchronous tasks).
Quote
Really? Are you quite sure about what you are saying? The critical part is running on an MCU without any fancy OS stuff, performing its mission-critical or life-critical function autonomously. It will send data to the other MCU running the non-critical part of the application.
Quote
Try to build such a system and you'll see why it is a bad idea. In the end the functions on both processors will be much more intertwined than you think/want at first glance. The start of the slippery slope: chances are both processors will need to do something special if one of them fails.
It can only work if you can make a very clean break between the two microcontrollers and put them in separate boxes with an RS232 or RS485 interface (cable) between them.

It is a bad idea; I've been through that. On paper it seems fine and easy, but when programming it's a mess: both processors have control over the critical circuitry so that if one fails the other shuts it down, and above all the comms are a nightmarish experience. You just can't keep it simple.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26906
  • Country: nl
    • NCT Developments
Re: FPGA Vs µC
« Reply #24 on: October 21, 2015, 05:29:19 pm »
One day at work the whole building started to shake.  :wtf: It turned out 'a system' with a large, heavy mechanical structure had crashed into itself. Post-mortem analysis revealed that the programmers from the supplier had just mixed the emergency-stop code with the normal operating code. In other words: there was no layer providing any safety! Of course that went wrong during testing...  :palm: Fortunately nobody got hurt, thanks to other safety precautions, but the incident could easily have resulted in death. 'Our' software engineering guys related to that project then decided to rewrite the entire system themselves and do it right (they already had a lot of experience writing safety-critical software).
« Last Edit: October 21, 2015, 05:40:33 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

