Converting BCD(mm) to BCD(inch)[SOLVED]
jpanhalt:
This is related to some recent threads of mine about reading Mitutoyo linear scales. The bare sensors (e.g., the ones on my mill) report position as six 4-bit BCD digits in mm. I need to convert to inches for display. By brute force, I convert to binary, divide by 25.4 (fixed point), and convert the 22-bit result back to BCD. That takes roughly 1,000 instruction cycles or a little less. The 22-bit conversion works because the scales are less than 1 meter long.
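For readers following along, here is a minimal C sketch of that brute-force pipeline. The digit count matches the post; the 0.001 mm input resolution, 0.0001 inch output resolution, and rounding choice are my assumptions, and a real PIC routine would replace the divide with a shift-and-add reciprocal multiply.

```c
#include <stdint.h>

/* Six BCD digits (most significant first) -> binary (fits in 22 bits
   for scales under 1 m, as noted in the post). */
uint32_t bcd6_to_bin(const uint8_t d[6])
{
    uint32_t v = 0;
    for (int i = 0; i < 6; i++)
        v = v * 10u + d[i];   /* classic multiply-by-10 accumulate */
    return v;
}

/* mm in 0.001 mm units -> inches in 0.0001 inch units (assumed units).
   inches*10000 = mm_thousandths * 100 / 254, since 1 inch = 25.4 mm.
   The +127 (half of 254) rounds to nearest. */
uint32_t mm_to_inch(uint32_t mm_thousandths)
{
    return (uint32_t)(((uint64_t)mm_thousandths * 100u + 127u) / 254u);
}
```

For example, 25.400 mm packed as digits 0,2,5,4,0,0 converts to 25400, and `mm_to_inch(25400)` yields 10000, i.e., 1.0000 inch.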
I have been reading about doing that math in BCD, which I understand was common years ago and may still be used in the financial industry and some small calculators. My chosen MCU is a PIC16F1xxx, and it has no "BCD-friendly" instructions like the PIC18F MCUs have (e.g., DAW, decimal adjust W). I also work exclusively in Microchip Assembly (MPASM).
Has anyone here done BCD multiplication and/or division on an 8-bit controller without BCD instructions? Would you recommend going that direction or sticking with the brute force method?
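As background for later replies: without DAW, each packed-BCD add needs a software decimal adjust, and BCD multiplication can then be built from repeated adjusted adds. A minimal nibble-at-a-time sketch in C (the function name and interface are mine, not from the thread):

```c
#include <stdint.h>

/* Add two packed-BCD bytes (two digits each) plus a carry-in,
   performing in software the fix-up that a DAW instruction does in
   hardware: any nibble sum over 9 gets +6 and propagates a carry. */
uint8_t bcd_add8(uint8_t a, uint8_t b, uint8_t carry_in, uint8_t *carry_out)
{
    uint8_t lo = (a & 0x0F) + (b & 0x0F) + carry_in;
    uint8_t c1 = 0;
    if (lo > 9) { lo += 6; c1 = 1; }   /* decimal adjust low digit */
    lo &= 0x0F;

    uint8_t hi = (a >> 4) + (b >> 4) + c1;
    *carry_out = 0;
    if (hi > 9) { hi += 6; *carry_out = 1; }  /* adjust high digit */
    hi &= 0x0F;

    return (uint8_t)((hi << 4) | lo);
}
```

For instance, adding BCD 45 and BCD 38 (`0x45 + 0x38`) yields `0x83` with no carry out; `0x99 + 0x01` yields `0x00` with carry 1.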
Regards, John
zapta:
What do you hope to achieve, better accuracy? Speed? Smaller code? Something else?
jpanhalt:
Accuracy is fine. I'm just looking for smaller code with the same or better speed. Many routines with lots of loops, e.g., double dabble*, have small code but can be quite slow with 24-bit numbers.
*For binary to BCD, I use a polynomial method (derived from PICList) that has more code but is quite fast. My 20-bit routine takes about 300 instruction cycles.
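To illustrate the trade-off being described, here is a C sketch of double dabble for 24 bits: one shift per input bit, with an add-3 test on every BCD nibble before each shift. That inner adjust loop is exactly what makes it compact but slow on an 8-bit core (this is the textbook algorithm, not the polynomial routine mentioned above).

```c
#include <stdint.h>

/* Double dabble: 24-bit binary -> up to 8 packed BCD digits.
   24 iterations; each one first adds 3 to any nibble >= 5 so that
   the following left shift carries correctly between decimal digits. */
uint32_t bin24_to_bcd(uint32_t bin)
{
    uint32_t bcd = 0;
    for (int i = 23; i >= 0; i--) {
        for (int n = 0; n < 8; n++) {          /* adjust each nibble */
            uint32_t nib = (bcd >> (4 * n)) & 0xFu;
            if (nib >= 5)
                bcd += 3u << (4 * n);
        }
        bcd = (bcd << 1) | ((bin >> i) & 1u);  /* shift in next bit */
    }
    return bcd;
}
```

So `bin24_to_bcd(1234567)` returns `0x01234567`: each hex nibble of the result is one decimal digit.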
rhodges:
Converting ASCII decimal or BCD to binary is usually easy. Here is code to convert 16-bit binary to BCD. It looks like it would be easy to add a few instructions to handle 24 bits. Hope it helps!
Benta:
Dunno why your code is so bloated.
I did a binary to BCD routine for the 68HC11 once and it was like 20 lines of assembler code.
Converting BCD to binary would be around the same.