EEVblog Electronics Community Forum
EEVblog => EEVblog Specific => Topic started by: EEVblog on October 26, 2018, 01:10:50 am
-
Getting the 3 cent Padauk microcontroller to blink a LED.
Or at least the in-circuit emulator blinking a LED with C.
https://www.youtube.com/watch?v=Rixo78hv_lw (https://www.youtube.com/watch?v=Rixo78hv_lw)
-
I guess I don't need to put in my 2 cents now. ;)
-
Looks like a useful IDE. The "go" behavior is the same as in other IDEs, e.g. Simplicity Studio from SiLabs: the first click on Go stops at the first line of the main function, then you can set breakpoints, and the next Go click actually runs the program.
Of course, standard ANSI C would be much better. If someone wants to write a backend, I would recommend LLVM/Clang instead of GCC: it's less of a headache, and Dave2 could probably do it in a week.
BTW, you can buy from Taobao with an agent. I have used this in the past without problems: https://www.yoybuy.com (https://www.yoybuy.com)
-
Just heard back from LCSC. They do not currently provide programming of the parts to customers, but have offered the service to me for free ;D
When they have a public price for a programming service option they'll let me know.
So the wheels are turning.
If they offer this service then I suspect adoption will skyrocket.
-
Give us GCC compatibility and a $10 programmer and I'll take 1k of these.
-
Give us GCC compatibility and a $10 programmer and I'll take 1k of these.
Don't forget the programming protocol itself, it's just as important as the programmer hardware.
- Open toolchain support (compiler+linker+bytecode gen).
- Open programming protocol (HW levels & sequences, ideally along with the soft protocol for the official programmers)
- Open headers (including info on entry points, everything else needed, etc)
ATmegas and STCs fulfil all of these (the last point is a little fiddlier in practice, but the info is still free to access). Because of that they're damned easy to develop for on any computer + OS, and will continue to be so 20 years down the track.
Of course this is all just what I personally want, and I'm not a big company. But this is the sort of stuff that makes the devices look like good options for students and small companies. Arguably you want to hook these sort of people whilst they're still young.
-
Maybe it could be reverse enginerded.
:-//
-
An official dealer will program them for about 0.3 cents each in 10k qty
https://skywiner.world.taobao.com/
Practically free!
This is getting serious.
-
0.3 c programming cost for 1 kbyte of ROM
... that's still $307.2 per megabyte!
EDIT: Doh! $3.07 per megabyte. Ignore me :D
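For completeness, the corrected arithmetic as a trivial sketch (the 0.3 c per part and 1 KB per part figures are from the posts above):

```c
#include <assert.h>
#include <math.h>

/* 0.3 cents programs one part holding 1 KB of ROM, so per megabyte
 * of programmed ROM: 1024 parts * 0.3 c = 307.2 c = $3.07.
 * (The $307.2 figure came from forgetting to convert cents to dollars.) */
double dollars_per_megabyte(double cents_per_kb)
{
    return cents_per_kb * 1024.0 / 100.0;  /* 1024 KB/MB, 100 c/$ */
}
```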
-
$307.2 per megabyte, how do you figure?
-
LCSC have just said that they will offer the free programming service for a limited time to anyone. Just email them when you place your order.
-
Hey Dave & David,
Dave mentioned "You can test it over voltage and fully qualify one of these if you have the means" (or something like that). I, for one, would like to see the process of how to do this. Maybe it's a niche area of interest like the metrology gear, but I think it would be some interesting insight.
-
Chinglish :clap: Chinese English
-
$3.07 per megabyte.
$300k per terabyte? Totally unacceptable in 2018!
-
Maybe it could be reverse enginerded.
:-//
I guess if Dave asks them, they might publish the programming protocol, but right, it shouldn't be rocket science to look at it with a logic analyzer.
For the GCC or Clang port: I think Dave mentioned in the video that the assembly instruction set is documented, so it should be easy.
-
Not sure how well the big backends can optimise for size; or if sdcc has any advantages here. Could be interesting.
-
What's wrong with micro-C?
-
Not sure how well the big backends can optimise for size; or if sdcc has any advantages here. Could be interesting.
That's the advantage of LLVM and GCC: all the optimization is already done before the backend has to translate it, and a simple intermediate result is created. You only have to write how these intermediate instructions (simple instructions like AND, OR, load, store etc.) are translated to the target machine. I guess in the end it might not be that easy, but for LLVM there is a tutorial on how to do it, and it doesn't look too difficult:
https://llvm.org/docs/WritingAnLLVMBackend.html (https://llvm.org/docs/WritingAnLLVMBackend.html)
And LLVM optimizes a lot. For example, it can replace a loop by a multiplication, one shift (divide by 2), and some adds and subs, by detecting that the loop is just a sum of the loop variable and then applying the well-known mathematical transformation (which Gauss already knew as a kid (http://mathcentral.uregina.ca/qq/database/qq.02.06/jo1.html)). See the compiler explorer:
https://godbolt.org/z/tzU5rc (https://godbolt.org/z/tzU5rc)
No way SDCC would find such optimizations.
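To illustrate the kind of pattern meant here (a hypothetical minimal example, not the exact godbolt snippet): a naive summation loop that Clang/GCC at -O2 reduce to a closed form with a multiply and a shift.

```c
#include <stdint.h>

/* Sums 0 + 1 + ... + (n-1) with a loop. Modern backends recognize
 * this and emit roughly n * (n - 1) / 2 instead of iterating,
 * i.e. the Gauss formula, with the divide by 2 done as a shift. */
uint32_t sum_below(uint32_t n)
{
    uint32_t s = 0;
    for (uint32_t i = 0; i < n; i++)
        s += i;
    return s;
}
```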
-
LCSC have just said that they will offer the free programming service for a limited time to anyone. Just email them when you place your order.
That "limited time" offer is totally nebulous and completely impractical, but not unexpected from China. :palm:
I hope LCSC can explain exactly what they mean by "limited time" - surely there is a minimum order qty? What is the lead time? Will they only offer it if you purchase the ICE/programmer from them?
-
I didn't understand how the LED can blink at a visible speed.
If we do that on any other low-speed micro like an Arduino, it needs a sleep/delay or a timer interrupt, but in the example it blinks at a speed the eye can see without any of that. The chip is running at 2 MHz, so I don't understand why. I guessed maybe the I/O is extremely slow, but the logic analyzer shows it isn't; it changes fast.
Any info about that?
-
I didn't understand how the LED can blink at a visible speed.
If we do that on any other low-speed micro like an Arduino, it needs a sleep/delay or a timer interrupt, but in the example it blinks at a speed the eye can see without any of that. The chip is running at 2 MHz, so I don't understand why. I guessed maybe the I/O is extremely slow, but the logic analyzer shows it isn't; it changes fast.
Any info about that?
His first experiment was to toggle the pin that fast; then he divided it down with the timer for his blinky. It is not that clear. Re-watch from 20:00 to the end.
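The divide-down idea can be sketched like this (hypothetical code; the divider value is made up for illustration, not a Padauk specific):

```c
#include <stdint.h>

#define DIVIDER 1000u  /* e.g. a 1 kHz timer tick -> 1 Hz LED toggle */

/* Simulates the software prescaler: given a number of fast timer
 * ticks, returns how many times the LED would have toggled. In real
 * firmware the body of the if() would flip the LED pin instead. */
uint32_t led_toggles(uint32_t ticks)
{
    uint32_t count = 0, toggles = 0;
    for (uint32_t t = 0; t < ticks; t++) {
        if (++count >= DIVIDER) {
            count = 0;
            toggles++;
        }
    }
    return toggles;
}
```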
-
Looks like a useful IDE. The "go" behavior is the same as in other IDEs, e.g. Simplicity Studio from SiLabs: the first click on Go stops at the first line of the main function, then you can set breakpoints, and the next Go click actually runs the program.
Of course, standard ANSI C would be much better. If someone wants to write a backend, I would recommend LLVM/Clang instead of GCC: it's less of a headache, and Dave2 could probably do it in a week.
BTW, you can buy from Taobao with an agent. I have used this in the past without problems: https://www.yoybuy.com (https://www.yoybuy.com)
Yeah, the Run -> Continue behavior is a normal thing and I hate it; if I wanted to stop, I would use a breakpoint, no?
CoIDE, an Eclipse-based IDE by CooCox (one of the RTOS vendors for microcontrollers), also has this option by default (I don't know if it can be changed), so it is not only a thing in the official tools.
Even if the chip is very cheap and has potential, I don't see why other manufacturers wouldn't make similar low-cost chips. Now that RISC-V is a thing it can be done, and if it trends it will happen, as happened before with many other things like Arduino/Raspberry Pi, or even software, where the toolchains/operating systems/IDEs are now free instead of costing a few thousand dollars. So putting in the effort to develop a new toolchain is not a good idea in my opinion; it is a matter of time before NXP, TI and others start to release things like this to the market.
Bad times for ARM if you ask me. I believe NXP, Microchip, Cypress or ST could release a microcontroller below 10 cents; they can't right now because licenses cost money. That's where RISC-V comes in: take any open-source micro like LEON, modify it slightly, and boom, ARM will be in big trouble, because once this starts to roll you can't stop it.
-
His first experiment was to toggle the pin that fast; then he divided it down with the timer for his blinky. It is not that clear. Re-watch from 20:00 to the end.
Thanks, I will watch until the end. I stopped at 19:31 assuming Dave would do more digging, so I came here to ask why; he didn't say anything when he showed us the board, so I thought it was what he intended from minute one, as any of us would do when trying to blink a LED: use a few-hundred-ms blink period. I still remember my first try: I used the oscilloscope because the code was right and it really was blinking, but it was so fast that it looked like it was on all the time. Doh!
Edit: OK, I finished. I believed the LED blinking at 14:26 was the target LED and not the "it is alive" LED of the simulator board; that's why I didn't understand it. I imagined the LED was directly connected to the I/O, as on Arduino/ST boards, where you have an integrated LED on the board. My mistake.
-
Not sure how well the big backends can optimise for size; or if sdcc has any advantages here. Could be interesting.
That's the advantage of LLVM and GCC: all the optimization is already done before the backend has to translate it, and a simple intermediate result is created. You only have to write how these intermediate instructions (simple instructions like AND, OR, load, store etc.) are translated to the target machine. I guess in the end it might not be that easy, but for LLVM there is a tutorial on how to do it, and it doesn't look too difficult:
No way SDCC would find such optimizations.
There is more to it than just producing small code. I don't know about this particular 3 cent microcontroller, but chances are it doesn't have a real stack (or no room for one). This means you'll need to write and compile C in an entirely different way, which is not supported by GCC at all (and probably not by LLVM). Look at Keil's C compiler for the 8051. It is a true work of art because of the way it manages to produce fast and efficient code for such a contorted architecture.
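A crude, hand-written illustration of what "no real stack" forces (hypothetical example; compilers for stack-poor parts, such as Keil C51, do this allocation automatically and overlay locals of functions that never call each other):

```c
#include <stdint.h>

/* With a hardware stack too small for data, a compiler places each
 * function's locals at fixed addresses instead of in a stack frame.
 * The would-be local below is static, which is why such code is
 * neither reentrant nor recursion-safe. */
static uint8_t scale_tmp;  /* would-be stack local, fixed address */

uint8_t scale(uint8_t x)
{
    scale_tmp = x;
    scale_tmp = (uint8_t)(scale_tmp << 1);  /* x * 2 */
    return (uint8_t)(scale_tmp + 1);        /* x * 2 + 1 */
}
```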
-
Not sure how well the big backends can optimise for size; or if sdcc has any advantages here. Could be interesting.
That's the advantage of LLVM and GCC: all the optimization is already done before the backend has to translate it, and a simple intermediate result is created. You only have to write how these intermediate instructions (simple instructions like AND, OR, load, store etc.) are translated to the target machine. I guess in the end it might not be that easy, but for LLVM there is a tutorial on how to do it, and it doesn't look too difficult:
No way SDCC would find such optimizations.
There is more to it than just producing small code. I don't know about this particular 3 cent microcontroller, but chances are it doesn't have a real stack (or no room for one). This means you'll need to write and compile C in an entirely different way, which is not supported by GCC at all (and probably not by LLVM). Look at Keil's C compiler for the 8051. It is a true work of art because of the way it manages to produce fast and efficient code for such a contorted architecture.
The GCC ports for MCU cores which look vaguely like larger scale cores - AVR, MSP430 - work very well. Attempts to make GCC ports for very basic MCU cores - 8051, PIC8 - have been dismal failures.
Keil isn't that great. Until Microchip bought HiTech and killed their 8051 C compiler, it wiped the floor with Keil.
-
Attempts to make GCC ports for very basic MCU cores - 8051, PIC8 - have been dismal failures.
Yeah; that. These chips are quite similar to PIC8. You're not going to get gcc or llvm working on it. At least, not to anyone's satisfaction.
-
Chinglish
It's lovely. I'll take extensive Chinglish technical documentation over perfectly-translated marketing fluff, any day! (What's that quote about "better is the enemy of good"?)
-
Maybe it could be reverse enginerded.
:-//
I guess if Dave asks them, they might publish the programming protocol, but right, it shouldn't be rocket science to look at it with a logic analyzer.
They won't publish it.
Response:
in our ic programming process, it involves not only digital signals, but also a lot of voltage level control and analog signals. it is not easy to do it by such as audrino programmer.
besides programming, there are a lot of trimming process involved and these trimming algorithm are not supposed to release.
so ... sorry about that. please use our proprietary programmer
-
Attempts to make GCC ports for very basic MCU cores - 8051, PIC8 - have been dismal failures.
Yeah; that. These chips are quite similar to PIC8. You're not going to get gcc or llvm working on it. At least, not to anyone's satisfaction.
I'm currently using a GCC port for the 6809 for my Bloxorz Vectrex game (http://kickstarter.com/projects/frankbuss/bloxorz-for-vectrex-pc-mac-ios-and-android). It works and can even produce some optimized code. It has a few bugs at -O3, but these could be fixed. So it is possible. That said, the 6809 has a more powerful instruction set than a low-end PIC and a relatively large stack; maybe this helps.
-
Attempts to make GCC ports for very basic MCU cores - 8051, PIC8 - have been dismal failures.
Yeah; that. These chips are quite similar to PIC8. You're not going to get gcc or llvm working on it. At least, not to anyone's satisfaction.
I'm currently using a GCC port for the 6809 for my Bloxorz Vectrex game (http://kickstarter.com/projects/frankbuss/bloxorz-for-vectrex-pc-mac-ios-and-android). It works and can even produce some optimized code. It has a few bugs at -O3, but these could be fixed. So it is possible. That said, the 6809 has a more powerful instruction set than a low-end PIC and a relatively large stack; maybe this helps.
The 6809 is quite an advanced core, designed specifically to run languages like C well. If a GCC port for that isn't working well, the developers really aren't trying. This has very little in common with developing a C compiler for an 8051 or PIC8.
-
They won't publish it.
Response:
in our ic programming process, it involves not only digital signals, but also a lot of voltage level control and analog signals. it is not easy to do it by such as audrino programmer.
besides programming, there are a lot of trimming process involved and these trimming algorithm are not supposed to release.
so ... sorry about that. please use our proprietary programmer
Probably the only voltage level control is switching some high-voltage signal on and off while programming. And I guess the trimming algorithm is for calibrating the oscillator. But it doesn't have an ADC, so why would it need other analog signals? It has just an internal bandgap reference for a comparator, and I guess it doesn't calibrate this; I couldn't even find a lower and upper range for it in the datasheet, as more expensive microcontrollers specify.
Dave, it might be interesting to monitor the programming pins when you program a chip: first with a scope to see which are analog and which are digital, and then the digital pins with your logic analyzer. The assembler instruction set is very similar to a PIC's. I wouldn't be surprised if the programming algorithm were the usual VPP/PGC/PGD PIC programming scheme as well, which would explain why they don't want to publish it >:D
-
It looks like they might intend to branch into 32-bitters soon. I am reading through their job listings and there are positions for ARM toolchain developers. I hope they focus on the niche of ultra-small MCUs, as it would be a good alternative to Silego GreenPAKs.
On second thought, maybe they are using ARMs in the ICE to emulate the chips.
-
I looked a bit at the exe file of the IDE. There is an interesting comment in the program: "* This is Kent..Hell to Change :Company Name ---Setup", see attached screenshot. I wonder what this means? Looks like it is used in some sort of string search in the exe file itself, maybe to change the string "Kent Lee Tool" to "PADAUK Tool".
The code which references "ipconfig /all" looks harmless as well: it is used to get the MAC address of the Ethernet adapter, then this is XOR'ed with a constant, and if it matches another constant, some USB driver test is not executed. Same if there is a special file X_LOGO.jpg and a special volume serial number for the hard disk. Probably a test for the developer's machine, to enable some debugging.
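The check described above can be sketched roughly like this (the XOR fold, key and expected value here are invented for illustration; the real constants and exact scheme in the exe are unknown):

```c
#include <stdint.h>
#include <stddef.h>

#define XOR_KEY  0x5Au  /* invented */
#define EXPECTED 0x33u  /* invented */

/* Folds the six MAC bytes together with XOR and compares against a
 * stored constant, so the developer machine's MAC never appears in
 * the binary as plain text. */
int is_dev_machine(const uint8_t mac[6])
{
    uint8_t acc = 0;
    for (size_t i = 0; i < 6; i++)
        acc ^= mac[i];
    return (acc ^ XOR_KEY) == EXPECTED;
}
```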
Both things are a bit hackish, like the rest of the IDE and the use of non-ANSI C. But it is still useful for implementing simple programs for toys, 555 replacements etc.
The IDE itself is written in C++ with Visual Studio and MFC. It will probably even run on Windows XP.
-
Scanning for IP addresses seems a bit odd for a simple compiler.
-
The IDE itself is written in C++ with Visual Studio and MFC. It will probably even run on Windows XP.
Good. If you're running this in a work VM then you have a lot more options.
-
I am very curious about Dave's remark at 15:41 about "timing mode analysis": https://youtu.be/Rixo78hv_lw?t=941
Can somebody elaborate a bit more about it and how it works? As far as I understand, what Dave means is that the CPU clock (the CPU inside the computer which runs the Saleae software) is not synchronized with the sampling clock of the logic analyzer. So there's a drift between the two clocks and it causes a glitch we can see in the trace. But I'm not sure how exactly this works, probably because I don't really know how the logic analyzer works and how the trace is reconstructed.
I'm asking this because I've seen this glitch many times, but I always assumed that it's actually physically present and is caused by a device I'm analyzing rather than the logic analyzer.
-
I am very curious about Dave's remark at 15:41 about "timing mode analysis": https://youtu.be/Rixo78hv_lw?t=941
Can somebody elaborate a bit more about it and how it works? As far as I understand, what Dave means is that the CPU clock (the CPU inside the computer which runs the Saleae software) is not synchronized with the sampling clock of the logic analyzer. So there's a drift between the two clocks and it causes a glitch we can see in the trace. But I'm not sure how exactly this works, probably because I don't really know how the logic analyzer works and how the trace is reconstructed.
I'm asking this because I've seen this glitch many times, but I always assumed that it's actually physically present and is caused by a device I'm analyzing rather than the logic analyzer.
There is no glitch. You are just seeing the effect of the x-axis being coarsely quantized in time. If the sampling rate were much higher, the waveform would look much more consistent along its length.
-
I am very curious about Dave's remark at 15:41 about "timing mode analysis": https://youtu.be/Rixo78hv_lw?t=941
Can somebody elaborate a bit more about it and how it works? As far as I understand, what Dave means is that the CPU clock (the CPU inside the computer which runs the Saleae software) is not synchronized with the sampling clock of the logic analyzer. So there's a drift between the two clocks and it causes a glitch we can see in the trace. But I'm not sure how exactly this works, probably because I don't really know how the logic analyzer works and how the trace is reconstructed.
I'm asking this because I've seen this glitch many times, but I always assumed that it's actually physically present and is caused by a device I'm analyzing rather than the logic analyzer.
https://en.wikipedia.org/wiki/Aliasing
-
It will probably even run on Windows XP.
The IDE works; when I was playing with it, I was using a MicroXP VM. Of course, I have no idea whether the ICE or programmer drivers work. The $250 cost of entry is likely to prevent me (and most other hobbyists) from playing with it.
(Yes, I realize that that is small change compared to most pre-flash microcontroller development, and completely insignificant if you're making things in the volume these chips are aimed at. Still, $250 buys an awful lot of hobbyist-grade $1 ATmega8s or $3 "Nano" clones.)
-
... probably because I don't really know how the logic analyzer works and how the trace is reconstructed.
You already know the problem: The trace isn't "reconstructed", it's just the raw I/O pin reading displayed using only vertical/horizontal lines.
-
The IDE's speed is impressive, which is to be expected of an IDE that is very simple, caters to a single device (or family of devices) and is in its beta release. But the overall package still needs quite a lot of work to be comfortable to use.
The help is not searchable and the articles are unsorted (they may be correctly sorted in Chinese, but I can't tell). Several articles are non-existent.
The editor is quite basic for 2018, although it gets the job done.
There are three places to find example code: menu Code Generate, menu Help --> Refer Source (??!??), or menu File --> Demo Project. I would prefer a single place to find it.
The "Code Generate" menu is an interesting idea; as expected, for such a simple device the configuration options are limited and may simply be enough.
There are essentially no compiler options, which is somewhat weird compared to a regular C compiler with its include directories, libraries, etc. The example code itself has an assembly feel to it.
The compiler/linker messages could use an overhaul: the "Use 10 memory, Remain 54 unused memory" message is childish and almost useless without units (hex? decimal? words? bytes?) or the type of memory (RAM? ROM?).
I don't have a debugger to evaluate, but Dave's video shows what look like the device registers on the top right. I can't tell how complete that list is, but it is a key difference between a decent IDE debugger and the generic hack-jobs around.
-
What's wrong with micro-C?
It is neither free nor does it support a reasonably large subset of standard C.
Philipp
-
Attempts to make GCC ports for very basic MCU cores - 8051, PIC8 - have been dismal failures.
Yeah; that. These chips are quite similar to PIC8. You're not going to get gcc or llvm working on it. At least, not to anyone's satisfaction.
There is the Small Device C Compiler (SDCC). SDCC backends for the Padauk devices would surely be doable. After all, the architecture is much nicer than the low-end PICs.
Philipp
-
Give us GCC compatibility and a $10 programmer and I'll take 1k of these.
Don't forget the programming protocol itself, it's just as important as the programmer hardware.
- Open toolchain support (compiler+linker+bytecode gen).
- Open programming protocol (HW levels & sequences, ideally along with the soft protocol for the official programmers)
- Open headers (including info on entry points, everything else needed, etc)
[…]
Open toolchain: Compiler, assembler, linker could be done in SDCC.
Open programming protocol would be harder; this will probably require some reverse engineering. Writing free software to support the official programmer could be a good first step, with free hardware following later. It happened for the ST-Link, so it should be doable here.
Open headers are not that important here. The set of I/O registers and peripherals is quite small, so it could be practical to just go by the datasheet.
Philipp