Attempts to make GCC ports for very basic MCU cores - 8051, PIC8 - have been dismal failures.
Yeah; that. These chips are quite similar to PIC8. You're not going to get gcc or llvm working on it. At least, not to anyone's satisfaction.
Chinglish
It's lovely. I'll take extensive Chinglish technical documentation over perfectly-translated marketing fluff, any day! (What's that quote about "perfect is the enemy of good"?)
Attempts to make GCC ports for very basic MCU cores - 8051, PIC8 - have been dismal failures.
Yeah; that. These chips are quite similar to PIC8. You're not going to get gcc or llvm working on it. At least, not to anyone's satisfaction.
I'm currently using a GCC port for the 6809 for my Bloxorz Vectrex game. It works and can even produce some optimized code. It has a few bugs at -O3, but those could be fixed, so it is possible. That said, the 6809 has a more powerful instruction set than a low-end PIC and a relatively large stack; maybe that helps.
Attempts to make GCC ports for very basic MCU cores - 8051, PIC8 - have been dismal failures.
Yeah; that. These chips are quite similar to PIC8. You're not going to get gcc or llvm working on it. At least, not to anyone's satisfaction.
I'm currently using a GCC port for the 6809 for my Bloxorz Vectrex game. It works and can even produce some optimized code. It has a few bugs at -O3, but those could be fixed, so it is possible. That said, the 6809 has a more powerful instruction set than a low-end PIC and a relatively large stack; maybe that helps.
The 6809 is quite an advanced core, designed specifically to run languages like C well. If a GCC port for that isn't working well, the developers really aren't trying. This has very little in common with developing a C compiler for an 8051 or PIC8.
They won't publish it.
Response:
Our IC programming process involves not only digital signals, but also a lot of voltage-level control and analog signals. It is not easy to do with something like an Arduino programmer.
Besides programming, there are a lot of trimming processes involved, and these trimming algorithms are not supposed to be released.
So... sorry about that. Please use our proprietary programmer.
Probably the only voltage-level control is switching a high-voltage signal on and off while programming. And I guess the trimming algorithm is for calibrating the oscillator. But the chip doesn't have an ADC, so why would it need other analog signals? It has just an internal bandgap reference for a comparator, and I guess it doesn't calibrate that. I couldn't even find lower and upper limits for it in the datasheet, as more expensive microcontrollers specify.
Dave, it might be interesting to monitor the programming pins when you program a chip: first with a scope to see which are analog and which are digital, and then the digital pins with your logic analyzer. The assembler instruction set is very similar to a PIC's, so I wouldn't be surprised if the programming algorithm were the usual VPP/PGC/PGD PIC programming scheme as well, which would explain why they don't want to publish it.
It looks like they might intend to branch into 32-bitters soon. Reading through their job listings, there are positions for ARM toolchain developers. I hope they stay focused on the niche of ultra-small MCUs, as that would make them a good alternative to Silego GreenPAKs.
On second thought, maybe they are using ARMs in the ICE to emulate the chips.
I looked a bit at the exe file of the IDE. There is an interesting comment in the program: "* This is Kent..Hell to Change :Company Name ---Setup", see attached screenshot. I wonder what this means? Looks like it is used in some sort of string search in the exe file itself, maybe to change the string "Kent Lee Tool" to "PADAUK Tool".
The code that references "ipconfig /all" looks harmless as well: it is used to get the MAC address of the Ethernet adapter, which is then XORed with a constant; if the result matches another constant, some USB driver test is not executed. The same happens if a special file X_LOGO.jpg exists, or for a special volume serial number on the hard disk. Probably a check for the developer machine to enable some debugging.
Both things are a bit hackish, like the rest of the IDE and its use of non-ANSI C. But it is still useful for implementing simple programs for toys, 555 replacements, etc.
The IDE itself is written in C++ with Visual Studio and MFC. It will probably even run on Windows XP.
Scanning for IP addresses seems a bit odd for a simple compiler.
The IDE itself is written in C++ with Visual Studio and MFC. It will probably even run on Windows XP.
Good. If you're running this in a work VM then you have a lot more options.
I am very curious about Dave's remark at 15:41 about "timing mode analysis":
https://youtu.be/Rixo78hv_lw?t=941
Can somebody elaborate a bit more about it and how it works? As far as I understand, what Dave means is that the CPU clock (the CPU inside the computer which runs the Saleae software) is not synchronized with the sampling clock of the logic analyzer. So there's a drift between the two clocks and it causes a glitch we can see in the trace. But I'm not sure how exactly this works, probably because I don't really know how the logic analyzer works and how the trace is reconstructed.
I'm asking this because I've seen this glitch many times, but I always assumed that it's actually physically present and is caused by a device I'm analyzing rather than the logic analyzer.
I am very curious about Dave's remark at 15:41 about "timing mode analysis": https://youtu.be/Rixo78hv_lw?t=941
Can somebody elaborate a bit more about it and how it works? As far as I understand, what Dave means is that the CPU clock (the CPU inside the computer which runs the Saleae software) is not synchronized with the sampling clock of the logic analyzer. So there's a drift between the two clocks and it causes a glitch we can see in the trace. But I'm not sure how exactly this works, probably because I don't really know how the logic analyzer works and how the trace is reconstructed.
I'm asking this because I've seen this glitch many times, but I always assumed that it's actually physically present and is caused by a device I'm analyzing rather than the logic analyzer.
There is no glitch. You are just seeing the effect of the x-axis being coarsely quantized in time. If the sampling rate were much higher, the waveform would look much more consistent along its length.
It will probably even run on Windows XP.
The IDE works; when I was playing with it, I was using a MicroXP VM. Of course, I have no idea whether the ICE or programmer drivers work. The $250 cost of entry is likely to prevent me (and most other hobbyists) from playing with it.
(Yes, I realize that that is small change compared to most pre-Flash microcontroller development, and completely insignificant if you're making things in the volume that the chips are aimed at. Still, $250 buys an awful lot of hobbyist-grade $1 ATmega8s or $3 "Nano" clones.)
... probably because I don't really know how the logic analyzer works and how the trace is reconstructed.
You already know the problem: the trace isn't "reconstructed"; it's just the raw I/O pin readings displayed using only vertical and horizontal lines.
The IDE's speed is impressive, which is to be expected of an IDE that is very simple, caters to a single device (or family of devices) and is in its beta release. The overall functionality, however, still needs quite a lot of work.
The help is not searchable and the articles are unsorted (they may be correctly sorted in Chinese, but I can't tell). Several articles are missing entirely.
The editor is quite basic for 2018, although it gets the job done.
There are three places to find example code: the Code Generate menu, Help --> Refer Source (??!??), and File --> Demo Project. I would prefer a single place to find it all.
The "Code Generate" menu is an interesting idea. As expected for such a simple device, the configuration options are limited, and may well be enough.
There are no compiler options, which is somewhat weird compared to a regular C compiler with its include directories, libraries, etc. The example code itself has an assembly feel to it.
The compiler/linker messages could use an overhaul: the message "Use 10 memory, Remain 54 unused memory" is childish and almost useless without units (hex? decimal? words? bytes?) or the type of memory (RAM? ROM?).
I don't have a debugger to evaluate it, but Dave's video shows what look like the device registers at the top right. I can't tell how complete this list is, but that is a key difference between a decent IDE debugger and the generic hack-jobs around.
What's wrong with micro-C?
It is neither free, nor does it support a reasonably large subset of standard C.
Philipp
Attempts to make GCC ports for very basic MCU cores - 8051, PIC8 - have been dismal failures.
Yeah; that. These chips are quite similar to PIC8. You're not going to get gcc or llvm working on it. At least, not to anyone's satisfaction.
There is the Small Device C Compiler (SDCC). SDCC backends for the Padauk devices would surely be doable. After all, the architecture is much nicer than the low-end PICs.
Philipp
Give us GCC compatibility and a $10 programmer and I'll take 1k of these.
Don't forget the programming protocol itself, it's just as important as the programmer hardware.
- Open toolchain support (compiler+linker+bytecode gen).
- Open programming protocol (HW levels & sequences, ideally along with the soft protocol for the official programmers)
- Open headers (including info on entry points, everything else needed, etc)
[…]
Open toolchain: Compiler, assembler, linker could be done in SDCC.
Open programming protocol would be harder; this will probably require some reverse engineering. Writing free software to support the official programmer could be a good first step, with free hardware following later. It happened for the ST-Link, so it should be doable here.
Open headers are not that important here. The set of I/O registers and peripherals is quite small, so it could be practical to just go by the datasheet.
Philipp