Author Topic: Best MCU for the lowest input capture interrupt latency  (Read 18017 times)


Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26906
  • Country: nl
    • NCT Developments
Re: Best MCU for the lowest input capture interrupt latency
« Reply #125 on: April 10, 2022, 11:33:50 am »
I encourage everyone to run a forum search with search term "xcore" and look at the results. Almost 100% are by a single person, date back for years and years, and are written exactly like marketing people write ("silver bullet points" style). Most are on-topic, but you will be able to see some that stand out in a bizarre way, just like having to meet a certain quota of writing xcore-positive messages, and no one happened to ask about timing that month.
I have to agree with this. I think most people are not very interested in Xcore processors. First of all, it is much easier to deal with a single-threaded or single-CPU solution than with a whole bunch of cores running asynchronously. Secondly, being locked into a vendor-specific tool for software development in a very specific way is something to avoid as well. If you need more cores, you can nowadays buy multi-core microcontrollers that use industry-standard ARM cores + tools. So that only leaves very niche applications that need massively parallel processing to get a job done. The typical example from tggzzz, which implies having a core sitting idle to wait for input, makes no practical sense. Peripherals have buffers and/or DMA to relax the timing requirements. Using an entire processor core to deal with I/O is a waste of resources; Xcore doesn't bring anything to the table for these purposes. A solution looking for a problem.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Best MCU for the lowest input capture interrupt latency
« Reply #126 on: April 10, 2022, 11:58:14 am »
I encourage everyone to run a forum search with search term "xcore" and look at the results. Almost 100% are by a single person, date back for years and years, and are written exactly like marketing people write ("silver bullet points" style). Most are on-topic, but you will be able to see some that stand out in a bizarre way, just like having to meet a certain quota of writing xcore-positive messages, and no one happened to ask about timing that month.
I have to agree with this. I think most people are not very interested in Xcore processors. First of all, it is much easier to deal with a single-threaded or single-CPU solution than with a whole bunch of cores running asynchronously. Secondly, being locked into a vendor-specific tool for software development in a very specific way is something to avoid as well. If you need more cores, you can nowadays buy multi-core microcontrollers that use industry-standard ARM cores + tools. So that only leaves very niche applications that need massively parallel processing to get a job done. The typical example from tggzzz, which implies having a core sitting idle to wait for input, makes no practical sense. Peripherals have buffers and/or DMA to relax the timing requirements. Using an entire processor core to deal with I/O is a waste of resources; Xcore doesn't bring anything to the table for these purposes. A solution looking for a problem.

The vendor-specific point is very valid.

Dealing with multiple asynchronous cores is tricky, not because of the hardware but because of the software, and a software mindset that treats parallelism as an "advanced afterthought". Yes, C, I'm looking at you!

The XMOS ecosystem starts by presuming parallelism and therefore being based around concepts that simplify parallelism. Those concepts were developed in the 70s (Hoare's CSP), deployed in the 80s (Transputer/Occam), partially deployed in many processors in the 90s and 00s (e.g. some TMS320 DSPs) - and are becoming more mainstream (in Go and Rust) in the 10s and 20s. They deserve to be better understood and appreciated.

As for having an entire core doing nothing being wasteful, that's just wrong. While that is the conceptual model, it isn't implemented that way. The implementation is SMT taken to an extreme: cores are "multiplexed" onto a tile, the only duplication being a small (invisible) set of registers.

That's the same as Sun implemented in their Niagara T-series processors. Almost 20 years ago they had 8 "tiles" of 8 "cores" per chip, hence being able to "service" 64 threads in hardware. For the appropriate "embarrassingly parallel" server workload, that worked stunningly well.

Overall, what XMOS brings to the table is neatly integrated software and hardware that makes parallelism easy. Nothing else manages that; there have been many failed attempts by other companies. Since high levels of parallelism are the future, whether or not you realise it, XMOS is showing one way forward. There are others, and there need to be still more.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline free_electron

  • Super Contributor
  • ***
  • Posts: 8517
  • Country: us
    • SiliconValleyGarage
Re: Best MCU for the lowest input capture interrupt latency
« Reply #127 on: April 10, 2022, 02:37:35 pm »
TL;DR, but have you looked at Cypress PSoC? They have programmable hardware blocks. Put your time-critical functions in there.
Professional Electron Wrangler.
Any comments, or points of view expressed, are my own and not endorsed , induced or compensated by my employer(s).
 

Offline SpacedCowboy

  • Frequent Contributor
  • **
  • Posts: 292
  • Country: gb
  • Aging physicist
Re: Best MCU for the lowest input capture interrupt latency
« Reply #128 on: April 10, 2022, 03:23:32 pm »
I pretty much like the XMOS cores, apparently unlike some :)

I do think they're expensive for what you get though, and the point about the conceptual model not matching the implementation is double-edged. Yes, it means you can PAR several tasks onto one core, but you're bringing back the complexity that the conceptual model hid away. Similarly, the moveable pointer idea - it's clever, it's well-implemented, but it's different, and like anything wizardly, it is subtle and easy to anger…

OTOH I don’t think many people would implement an SDRAM controller in software on a microcontroller, but you can, fairly easily, on an XMOS chip, and then vend that memory to other cores/tasks. Few microcontrollers provide time-stamped, clocked input/output of multi-bit ports, with SERDES available in hardware. If that sort of thing solves your problem, XMOS is your friend.

I don’t think they excel as general-purpose chips (too slow, too expensive, not a lot of RAM), but within the realm of problems they *are* applicable to, they are pretty much spot on. I’ve seen them a lot in audio processing for example, where hard real-time processing is crucial. Data acquisition is another.

Some of this ‘jitter is a myth’ talk seems a bit misguided to me - sure you can show negligible jitter for a single incoming signal by using ITCM, high interrupt priority, and it all works nicely. Now show me 8 different signals, all being equally important, all being asynchronous to each other, and all needing that same latency response. A single-core CPU can perform one interrupt at a time, the benefit of the Xcore is that it can handle multiple signals/inputs at a guaranteed latency, without requiring a whole new MPU, clock, maybe external flash, and passives. Again, if you need that, it’s awesome.

Horses for courses, choose what works. Given the steady progression of low-end, low-cost FPGA parts, the XMOS advantage is being eroded IMHO, but all things pass - maybe XMOS will come out with something that can extend the line’s life, again, in its sphere of specialty.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Best MCU for the lowest input capture interrupt latency
« Reply #129 on: April 10, 2022, 03:28:54 pm »
TL;DR, but have you looked at Cypress PSoC? They have programmable hardware blocks. Put your time-critical functions in there.

There is a good argument that some classes of time critical functions should be put in hardware. Nobody argues against that! Sometimes the hardware can even be FPGAs with ARM A class processors, e.g. Zynq devices :)

The key point is to know what you will be able to guarantee when the fully functioning system has finally become operational. Finding infrequent timing failures at that stage is very expensive.

In non-trivial systems, microbenchmarks have limited predictive value.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Best MCU for the lowest input capture interrupt latency
« Reply #130 on: April 10, 2022, 03:48:08 pm »
Looks like our posts crossed in the æther!

I pretty much like the XMOS cores, apparently unlike some :)

I do think they're expensive for what you get though, and the point about the conceptual model not matching the implementation is double-edged. Yes, it means you can PAR several tasks onto one core, but you're bringing back the complexity that the conceptual model hid away. Similarly, the moveable pointer idea - it's clever, it's well-implemented, but it's different, and like anything wizardly, it is subtle and easy to anger…

I agree with all that.

My point about conceptual model and implementation wasn't intended to be about merging several tasks onto one core. I didn't make it very clear, but my point was about the lack of hardware duplication. In most multicore devices, each core has its own ALU, instruction sequencer, addressing mechanism etc. In SMT devices, such as xCORE and Sun's Niagara, there is only one sequencer, one ALU, one addressing mechanism and they are all timeshared between each core; only the registers are per-core. One of those timeshared computational blocks is, in xCORE terms, called a "tile", and each tile has 8 cores. A given device can have up to 4 tiles, making a total of 32 cores (but only 4 ALUs etc).

Quote
OTOH I don’t think many people would implement an SDRAM controller in software on a microcontroller, but you can, fairly easily, on an XMOS chip, and then vend that memory to other cores/tasks.

I agree.

I'll add that even though an xCORE device can process 100Mb/s ethernet bit streams, in most cases it probably makes more sense to do that in hardware. Indeed, some xCORE devices have such ethernet hardware.

Quote
Few microcontrollers provide time-stamped, clocked input/output of multi-bit ports, with SERDES available in hardware. If that sort of thing solves your problem, XMOS is your friend.

I don’t think they excel as general-purpose chips (too slow, too expensive, not a lot of RAM), but within the realm of problems they *are* applicable to, they are pretty much spot on. I’ve seen them a lot in audio processing for example, where hard real-time processing is crucial. Data acquisition is another.

And that's the point: "horses for courses".

There's no doubt that xCORE is hitting a niche market - but it is worth knowing what that niche is and how XMOS matches it. Some of the concepts they use are applicable to far more than the xCORE niche.

Quote
Some of this ‘jitter is a myth’ talk seems a bit misguided to me - sure you can show negligible jitter for a single incoming signal by using ITCM, high interrupt priority, and it all works nicely. Now show me 8 different signals, all being equally important, all being asynchronous to each other, and all needing that same latency response. A single-core CPU can perform one interrupt at a time, the benefit of the Xcore is that it can handle multiple signals/inputs at a guaranteed latency, without requiring a whole new MPU, clock, maybe external flash, and passives. Again, if you need that, it’s awesome.

Precisely. Spot on.

I like being able to predict performance, not measure it and hope. I appreciate that is unfashionable with the TDD/Agile "if it passes the (overly simplistic) tests, then it is working" brigade.

Quote
Horses for courses, choose what works. Given the steady progression of low-end, low-cost FPGA parts, the XMOS advantage is being eroded IMHO, but all things pass - maybe XMOS will come out with something that can extend the line’s life, again, in its sphere of specialty.

FPGA costs come in two forms: per chip and staffing/conceptual. It is much easier to find softies than FPGA engineers, let alone find those that can understand the other's technology!
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline NorthGuy

  • Super Contributor
  • ***
  • Posts: 3146
  • Country: ca
Re: Best MCU for the lowest input capture interrupt latency
« Reply #131 on: April 10, 2022, 05:03:50 pm »
To get minimal jitter - avoiding distortion of other functions - I made a CubeMX startup-file and added only one init-function and the ISR. Now jitter is reduced to less than 10 ns. The 100 kHz input signal was sampled for several minutes. This is for the H7xx.

This is more understandable. If the peripheral clock is 275 MHz, you would expect the jitter to be roughly 1/(275 MHz) ≈ 3.6 ns. That's about what you see. You can zoom in to measure the jitter more accurately.
 

Offline mino-fm

  • Regular Contributor
  • *
  • Posts: 143
  • Country: de
Re: Best MCU for the lowest input capture interrupt latency
« Reply #132 on: April 11, 2022, 07:50:34 am »
This is more understandable. If the peripheral clock is 275 MHz, you would expect the jitter to be roughly 1/(275 MHz) ≈ 3.6 ns. That's about what you see. You can zoom in to measure the jitter more accurately.

Here it is. If each ns really matters dedicated hardware is necessary.
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 8172
  • Country: fi
Re: Best MCU for the lowest input capture interrupt latency
« Reply #133 on: April 11, 2022, 10:05:01 am »
This is more understandable. If the peripheral clock is 275 MHz, you would expect the jitter to be roughly 1/(275 MHz) ≈ 3.6 ns. That's about what you see. You can zoom in to measure the jitter more accurately.

Here it is. If each ns really matters dedicated hardware is necessary.

Yeah, and asynchronous hardware, to be specific: a wait-for-event instruction on an xCORE core, for example, won't make a difference, because you are just seeing synchronization logic jitter.

Some MCUs include peripherals with asynchronous control paths. I have used said feature in STM32F334 HRTIM in a DC/DC converter with 2MHz BW for current sense (LT1999). Inductor current rises fast enough that even one clock cycle at 72MHz makes a difference. Although to be fair, at 200MHz, that would have been totally acceptable.

Obviously, even with asynchronous logic, there will be delay and jitter, just significantly smaller than what you can get with some 72MHz IO clock as in the F334.

With today's MCUs running IO happily at 200MHz and core at 400MHz, all this matters less and less, and you can just do the simple thing and use general purpose interrupts and code to do things which absolutely required special peripheral features earlier. This increases flexibility, and makes special solutions like xCORE less appealing than they were in the past.
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19497
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Best MCU for the lowest input capture interrupt latency
« Reply #134 on: April 11, 2022, 11:10:51 am »
With today's MCUs running IO happily at 200MHz and core at 400MHz, all this matters less and less, and you can just do the simple thing and use general purpose interrupts and code to do things which absolutely required special peripheral features earlier. This increases flexibility, and makes special solutions like xCORE less appealing than they were in the past.

You appear to have something against xCORE, or at least don't understand what it offers!

If you use a fast conventional single-core processor to do multiple tasks that can't reasonably be offloaded to hardware, what timing guarantees can you give?
For example, consider "multiple tasks" to be continuous i/o (e.g. every 1µs), plus front panel control, plus some computation of undefined complexity, plus USB comms to a PC (or ethernet comms if you prefer).

Please don't bother to tell us not many systems are like that; we know - but some are. Horses for courses.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline pandy

  • Contributor
  • Posts: 14
  • Country: 00
Re: Best MCU for the lowest input capture interrupt latency
« Reply #135 on: April 11, 2022, 09:38:46 pm »
Strangely no one mentioned the RP2040 (RPi Pico) with PIO on board. A small, interesting uC, available and cheap.
 

Offline SpacedCowboy

  • Frequent Contributor
  • **
  • Posts: 292
  • Country: gb
  • Aging physicist
Re: Best MCU for the lowest input capture interrupt latency
« Reply #136 on: April 11, 2022, 09:51:44 pm »
Strangely no one mentioned the RP2040 (RPi Pico) with PIO on board. A small, interesting uC, available and cheap.

Reply 35 :)
 

Offline pandy

  • Contributor
  • Posts: 14
  • Country: 00
Re: Best MCU for the lowest input capture interrupt latency
« Reply #137 on: April 11, 2022, 10:18:59 pm »
Indeed, sorry: I searched in the browser and didn't find RP2040... hence my proposal...
 
The following users thanked this post: hans

Offline mino-fm

  • Regular Contributor
  • *
  • Posts: 143
  • Country: de
Re: Best MCU for the lowest input capture interrupt latency
« Reply #138 on: April 12, 2022, 08:29:23 am »
Strangely no one mentioned the RP2040 (RPi Pico) with PIO on board. A small, interesting uC, available and cheap.

Don't talk about it. Do it and show us the results of a real application.
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 8172
  • Country: fi
Re: Best MCU for the lowest input capture interrupt latency
« Reply #139 on: April 12, 2022, 09:32:13 am »
People are cautious about new players with just one or two products. The Raspberry Pi "foundation" has a track record of uncertain/poor product availability and support, and of failing to cater to the professional/industrial market. It's quite possible they solved these problems when they went into the silicon/microcontroller market, but only time and experience will prove that; people don't start trusting a product line overnight.

And I hope the best for them, competition is only good.
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14470
  • Country: fr
Re: Best MCU for the lowest input capture interrupt latency
« Reply #140 on: April 12, 2022, 05:09:04 pm »
The RP2040 is honestly great for a cheap MCU, with some interesting features. And yes, the problem is that availability, even for mid-term, is unknown for any commercial application. (And as I mentioned on other occasions, its power consumption in low power modes is not that low, so it's not a good candidate for low-power, battery-operated devices IMO, but it is for many other purposes.)

That said, the whole idea that I already mentioned a couple times in this thread is to use hardware trigger features whenever available - the RP2040 PIO is great for this, but there are now many MCUs out there with such features. Use them!
 

Offline hans

  • Super Contributor
  • ***
  • Posts: 1638
  • Country: nl
Re: Best MCU for the lowest input capture interrupt latency
« Reply #141 on: April 12, 2022, 05:16:15 pm »
The RP2040 is an educational product, priced for educational markets, with educational/academically interesting peripherals and features (dual M0+, PIO, etc.)

The problem is that the MCU doesn't have any code protection features. So either roll your own scheme that executes from RAM (good luck), or buy something that isn't going to be instantly copy-pasted as soon as someone pulls the FLASH chip from the PCB..

(I've also seen people joke about the ESP32 being a bad choice for commercial applications.. but at least it has on-the-fly decryption for code from external FLASH. Something which also can't be said of all STM32s with QuadSPI or FMC, by the way.)
« Last Edit: April 12, 2022, 05:18:19 pm by hans »
 

Offline pandy

  • Contributor
  • Posts: 14
  • Country: 00
Re: Best MCU for the lowest input capture interrupt latency
« Reply #142 on: April 12, 2022, 07:58:30 pm »
Don't talk about it. Do it and show us the results of a real application.

Afraid you can't afford my services, so you are in no position to demand anything from me...
 

Offline jemangedeslolosTopic starter

  • Frequent Contributor
  • **
  • Posts: 386
  • Country: fr
Re: Best MCU for the lowest input capture interrupt latency
« Reply #143 on: April 13, 2022, 02:00:41 pm »
I feel like tension is growing....
Is it really necessary ?

I don't think that mino-fm is really asking you for some kind of service.
Maybe he just wanted everyone to stop coming up with architectures without giving real numbers.

Or maybe he already knows the RP2040 isn't good when it comes to latency ?

Me on the other hand, I gave up.
Either I am too stupid, or there is a silicon bug.

With a proper oscilloscope probe, I didn't see any coupling between the output capture pin and the input.
I'm using a dsPIC33EP512MC806; I will try with a dsPIC33EP256MU806, which is pin-to-pin compatible.
I also have another PCB on the way to evaluate the dsPIC33CK.
 
The following users thanked this post: hans

Offline Sal Ammoniac

  • Super Contributor
  • ***
  • Posts: 1670
  • Country: us
Re: Best MCU for the lowest input capture interrupt latency
« Reply #144 on: April 13, 2022, 05:47:39 pm »
The RP2040 is an educational product, priced for educational markets, with educational/academically interesting peripherals and features (dual M0+, PIO, etc.)

Is this thing documented, or is it like the Broadcom CPUs used on the other Raspberry Pi products where you can't get a reference manual?
Complexity is the number-one enemy of high-quality code.
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14470
  • Country: fr
Re: Best MCU for the lowest input capture interrupt latency
« Reply #145 on: April 13, 2022, 05:58:37 pm »
I feel like tension is growing....
Is it really necessary ?

It's not, but pretty common on forums and social media in general. I guess we have to live with it.

Or maybe he already knows the RP2040 isn't good when it comes to latency ?

The reason some people mentioned the RP2040, and explicitly its PIO, is precisely that you can implement this kind of thing with low latency without using interrupts. You can get hold of the datasheet and read about the PIO; it's interesting.

Me on the other hand, I gave up.
Either I am too stupid, or there is a silicon bug.

With a proper oscilloscope probe, I didn't see any coupling between the output capture pin and the input.
I'm using a dsPIC33EP512MC806; I will try with a dsPIC33EP256MU806, which is pin-to-pin compatible.
I also have another PCB on the way to evaluate the dsPIC33CK.

I would have gladly given the hardware trigger a try - and shared my findings - if I still had any board with a dsPIC33EP...
If there's a silicon bug, have you read the errata, just in case?
 

Offline pandy

  • Contributor
  • Posts: 14
  • Country: 00
Re: Best MCU for the lowest input capture interrupt latency
« Reply #146 on: April 14, 2022, 08:26:29 pm »
I feel like tension is growing....
Is it really necessary ?

I don't think that mino-fm is really asking you for some kind of service.
Maybe he just wanted everyone to stop coming up with architectures without giving real numbers.

Or maybe he already knows the RP2040 isn't good when it comes to latency ?

Sorry, but satisfying someone's overgrown ego is not my core business... Personally, I don't start a conversation by challenging anyone (with an unspoken "if you can't prove it then shut up and get lost"). As this is a public forum and I don't violate forum rules, I simply consider such an approach rude.

The RP2040 PIO is deterministic by definition, so jitter and latency are predictable, as already expressed by SiliconWizard; I can't add anything more.
 

Offline DavidAlfa

  • Super Contributor
  • ***
  • Posts: 5907
  • Country: es
Re: Best MCU for the lowest input capture interrupt latency
« Reply #147 on: April 14, 2022, 08:53:33 pm »
It seems I came in just in time!
Final Destination! PIC vs Xcore vs Stm32 vs Rpi !!
 :popcorn:
Hantek DSO2x1x            Drive        FAQ          DON'T BUY HANTEK! (Aka HALF-MADE)
Stm32 Soldering FW      Forum      Github      Donate
 

Offline SpacedCowboy

  • Frequent Contributor
  • **
  • Posts: 292
  • Country: gb
  • Aging physicist
Re: Best MCU for the lowest input capture interrupt latency
« Reply #148 on: April 14, 2022, 09:36:05 pm »
Is this thing documented, or is it like the Broadcom CPUs used on the other Raspberry Pi products where you can't get a reference manual?

There is reasonable documentation:

- datasheet
- hardware design
- SDK at GitHub

I wouldn't say it's perfect, but then I rarely consider documentation perfect, and that's subjective anyway. It took me about 20 minutes from deciding to try it, to getting the LED to blink on the Pico...

The SDK at least doesn't try to be all things to all (wo)men, providing a HAL that tries to cope with all possible code-contortions and therefore ends up being useless for high-performance code. It doesn't hold your hand that much, but one of the nicer touches is that you can configure the serial output to be sent over USB to a tty just by adding a line or two to CMakeLists.txt.

The PIO 'WAIT' instruction appears to have 1 cycle of latency, apparently because it samples the value to WAIT on too early in the clock cycle for it to become stable, or perhaps because of the synchronizers. In any event, the code

Code: [Select]
wait 1 pin 0            ; wait for a high value on pin 0
set pins, 1 [15]        ; output high value on output pin and delay for 15 clocks
set pins, 0             ; output low value on pin

 ... which (apart from boilerplate) is all you'd need to wait for one pin going high and then set another high for 15 clocks - would be sufficient for the OP's question. Assemble that with the provided PIO assembler, load it up, configure the input/output pins (there are SDK functions for all this) and Bob's your Auntie's live-in-lover.

The clock can run at anything up to ~250MHz, though it's only guaranteed to ~133MHz. At 200MHz with a 2-clock latency (1 for the WAIT, one for the SET), you'd be looking at ~10ns response.

You can get fancier, and use a register & loop to make the output pulse longer, or trigger an interrupt on the ARM when it happens, or set up a DMA stream from main memory, and have it automatically pull into the PIO FIFO and output data one bit per clock until the FIFO empties, or ... I mean, it's software...
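For anyone trying this, here is a sketch of the SDK-side plumbing the post alludes to ("configure the input/output pins"), assuming the three-instruction program above was saved as wait_pulse.pio and run through pioasm. The program name and pin numbers are hypothetical, and this is untested glue rather than a drop-in; the calls themselves are the standard ones from the pico-sdk hardware_pio API.

```c
#include "pico/stdlib.h"
#include "hardware/pio.h"
#include "wait_pulse.pio.h"   /* generated by pioasm from the program above */

int main(void)
{
    const uint in_pin = 2, out_pin = 3;   /* hypothetical pin choice */
    PIO pio = pio0;
    uint sm = 0;

    /* load the assembled program into PIO instruction memory */
    uint offset = pio_add_program(pio, &wait_pulse_program);

    /* hand both pins to the PIO, set their directions */
    pio_gpio_init(pio, in_pin);
    pio_gpio_init(pio, out_pin);
    pio_sm_set_consecutive_pindirs(pio, sm, out_pin, 1, true);  /* output */
    pio_sm_set_consecutive_pindirs(pio, sm, in_pin, 1, false);  /* input  */

    /* map the pins into the state machine's config */
    pio_sm_config c = wait_pulse_program_get_default_config(offset);
    sm_config_set_in_pins(&c, in_pin);       /* base for 'wait ... pin' */
    sm_config_set_set_pins(&c, out_pin, 1);  /* target of 'set pins'    */

    pio_sm_init(pio, sm, offset, &c);
    pio_sm_set_enabled(pio, sm, true);       /* state machine now runs  */

    while (true)
        tight_loop_contents();               /* CPU is free to do anything */
}
```

Once enabled, the state machine runs the wait/set loop with no CPU involvement at all, which is the whole point of the exercise.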
 
The following users thanked this post: Sal Ammoniac, PCB.Wiz, 2N3055

Offline mino-fm

  • Regular Contributor
  • *
  • Posts: 143
  • Country: de
Re: Best MCU for the lowest input capture interrupt latency
« Reply #149 on: April 15, 2022, 07:39:46 am »
Afraid you can't afford my services, so you are in no position to demand anything from me...

Oh, a very stupid answer. Jemangedeslolos asked for help, not me.

I'd shown several pictures and some code. If someone asks me for the complete code, he/she can get it.
So let's discuss your solution. Let's talk about nothing.
 

