ASM programming is FASCINATING!
westfw:
--- Quote ---
getting used to PIC ASM, and am now learning by messing around with an Altair 8800 simulator ... I've been looking here also, at 6502
--- End quote ---
I would strongly suggest investing time in a modern processor rather than all those historical beasts: MIPS (i.e. PIC32), ARM, or RISC-V...
(the problem is, you won't find nearly as much tutorial material on how to program them in assembly language.)
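One way around the shortage of tutorial material is to generate your own: write tiny C functions and look at what the compiler emits for the ISA you're curious about. A minimal sketch, assuming a standard GCC cross-toolchain is installed (the riscv64-unknown-elf / arm-none-eabi prefixes are typical names, adjust to whatever you have):

--- Code: ---/* saxpy.c - a small function to study in generated assembly.
 * Compile with, e.g.:
 *   riscv64-unknown-elf-gcc -O2 -S saxpy.c   (RISC-V)
 *   arm-none-eabi-gcc -O2 -S saxpy.c         (ARM)
 * then read saxpy.s to see how each ISA handles argument passing,
 * the loop, and the loads/stores.
 */
void saxpy(float a, const float *x, float *y, int n)
{
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];   /* one multiply-accumulate per element */
}
--- End code ---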
tggzzz:
--- Quote from: T3sl4co1l on July 29, 2020, 05:33:14 am ---
--- Quote from: tggzzz on July 29, 2020, 01:03:33 am ---Making sure of that is very difficult. If you measure a mean time of X, what fudge factor should you apply to get to the worst case?
As for "if you have enough cpu power", yes that is a solution. But I'm reminded that you can get a brick to fly if you apply enough power. (Or search yootoob for "flying lawnmower"!)
--- End quote ---
Is this an argument in favor of or opposition to my post? ;D
--- End quote ---
Yes :)
--- Quote ---I like it, it's actually a really good analogy. It highlights the same gross excess, and rational economy.
Back in the day, it took crazy defense projects (or a few very dedicated and probably wealthy amateurs) to come up with junk like that (e.g., those ill-fated flying-saucer platforms). Nowadays anyone with under a thousand bucks knocking around can slap together something like that!
In the same way, what used to require heroic assembler on one platform is now trivial on today's platforms, even on a budget. Don't let some imagined combination of efficiency, elegance and so on be the barrier to "good enough"!
Doing some boring housekeeping tasks? Don't worry about learning 8051 assembler, just slap in the AVR or STM32 you're familiar with. Cost reduce it later when you have time -- and more importantly, budget -- to!
Using '10s technology to force '90s games to "run" on an '80s console? Don't worry about learning VHDL for the bus interface, just use the rPi you're handy with!
Fluent in Python but the data-cranking problem would really do better on a DSP or FPGA? Toss in the $50 SBC, who cares!
And of course, that's not to say one should take such liberty for granted: there will always be some applications where the harder solution is required, so there is value in learning lower level things (even assembly). By all means, take the time to investigate them, as you can. :-+
--- End quote ---
No argument there, but what underutilisation fudge factor should you employ to guarantee meeting timing constraints?
My professional experience includes both hard realtime embedded systems (where "deadline" was the right term), and soft realtime systems where the mean latency was important. The former is "challenging" if you have caches around - it is best to disable them.
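On a Cortex-M class part that even has caches, turning them off is a one-liner. A minimal sketch, assuming a Cortex-M7 device with the vendor CMSIS headers; the device header and function name here are illustrative, while the SCB_Disable*Cache() calls are the standard CMSIS ones:

--- Code: ---/* Hypothetical init for a timing-critical task on a Cortex-M7:
 * trade raw speed for repeatable instruction timing by turning
 * the caches off.  SCB_* calls come from CMSIS (core_cm7.h);
 * the header and function name below are illustrative.
 */
#include "stm32f7xx.h"   /* device header pulls in core_cm7.h */

void timing_critical_init(void)
{
    SCB_DisableDCache();   /* no data-cache hit/miss variation        */
    SCB_DisableICache();   /* no instruction-cache hit/miss variation */
    /* Execution time now depends on flash/RAM wait states, which are */
    /* fixed and boundable, rather than on whatever the cache holds.  */
}
--- End code ---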
--- Quote ---
--- Quote ---I'm reminded of the old joke about someone entering a programming competition. The winning entry was faster, but contained errors. The losing competitor remarked that he could have made his program ten times faster if he didn't have to give the correct result.
--- End quote ---
Yup. Timing from start of instruction(s) is what I meant, of course; but knowing when they start, is another matter (or when their outputs propagate to their targets). :)
--- Quote ---Cache coherency is a killer, both in large systems or hard realtime systems.
The larger HPC systems appear to be settling on message passing architectures, which can avoid the problems of cache coherency.
--- End quote ---
Reminds me of this story:
https://randomascii.wordpress.com/2018/01/07/finding-a-cpu-design-bug-in-the-xbox-360/
tl;dr they added an instruction to perform an incoherent prefetch, bypassing L2. Turns out... just putting that instruction anywhere executable in memory at all* introduced an extremely small probability that it would be speculatively executed, tainting coherency and setting up a crash with absolutely no warning or explanation.
*Hmm, doesn't say if it was tested quite this far. A branch-indirect instruction could potentially be predicted to land on one, even if the target is not in any intended executable code path. (Also assuming memory is r/w/x tagged, so that general data doesn't get spec-ex.'d; that would just about damn it to a respin, I would guess!) Depends when and how such an instruction is decoded; maybe those are really slow on the platform, decoded late, and it's actually safe?
--- End quote ---
I haven't heard that story, but it doesn't surprise me. I've deliberately avoided looking at the data sheet and errata for modern x86 processors. As Tony Hoare put it in his Turing award lecture, "There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult. It demands the same skill, devotion, insight, and even inspiration as the discovery of the simple physical laws which underlie the complex phenomena of nature."
Having said that, the xCORE processor I used had zero errata and the processor's hardware-software co-design made it a delight to use. Programming it was fun, not a battle.
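Incidentally, the xCORE channels and the big HPC clusters in the quote above land on the same idea: pass messages instead of sharing memory, so each core owns its data and there is no coherency protocol to reason about. A minimal MPI sketch in C of that idiom (the rank layout and buffer size are just illustrative):

--- Code: ---/* mp_demo.c - two ranks exchange a buffer by explicit messages.
 * No shared memory, so no cache-coherency traffic to reason about:
 * data only moves when a matching send/receive pair says so.
 * Typical build/run: mpicc mp_demo.c -o mp_demo && mpirun -np 2 ./mp_demo
 */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank;
    double buf[4] = {0};

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        buf[0] = 42.0;                              /* rank 0 owns the data...        */
        MPI_Send(buf, 4, MPI_DOUBLE, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Recv(buf, 4, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);                /* ...until it is explicitly sent */
        printf("rank 1 got %f\n", buf[0]);
    }

    MPI_Finalize();
    return 0;
}
--- End code ---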
KL27x:
--- Quote ---people also in practice seem to say "I wrote it in assembler".
--- End quote ---
Yeah, and it doesn't bother me, at all. People also "compile" their assembly into a hex.
edit: I have yet to hear anyone recompile their assembler. That might bug me.
MK14:
--- Quote from: KL27x on July 29, 2020, 09:08:06 am ---
--- Quote ---people also in practice seem to say "I wrote it in assembler".
--- End quote ---
Yeah, and it doesn't bother me, at all. People also "compile" their assembly into a hex.
--- End quote ---
Which they then upload download into the cpu.
Psi:
Yep, ASM is definitely fascinating.
I wouldn't want to write a massive windows app in it, but it definitely gives you a good appreciation for how things work behind the scenes.
I took a class in computer architecture and then advanced computer architecture at uni during my computing degree; it was very worthwhile.
Then after that degree I got another one in electronics with an emphasis on MCU programming. hehe