Author Topic: Micropython for MCUs - is it real or experimental ?  (Read 18716 times)


Offline e100

  • Frequent Contributor
  • **
  • Posts: 567
Re: Micropython for MCUs - is it real or experimental ?
« Reply #25 on: December 22, 2016, 03:13:21 am »
Perhaps the most important question is, does it work well enough to get the job done?
The European Space Agency asked the same question back in 2015 with regard to using Micropython on space-qualified hardware.
So far I'm not aware of any conclusions from that investigation, but the fact that they even bothered to fund an investigation would indicate that they saw a potential cost saving over using traditional tools.
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11269
  • Country: us
    • Personal site
Re: Micropython for MCUs - is it real or experimental ?
« Reply #26 on: December 22, 2016, 03:16:30 am »
but the fact that they even bothered to fund an investigation
Means that bureaucracy did its job and someone got paid to do pointless work. They will never approve this thing for anything space related. Half of the C standard is not approved for normal automotive use via MISRA-C. I can't see a dynamic language getting anywhere close to space stuff.
Alex
 
The following users thanked this post: Kjelt

Offline e100

  • Frequent Contributor
  • **
  • Posts: 567
Re: Micropython for MCUs - is it real or experimental ?
« Reply #27 on: December 22, 2016, 04:29:46 am »
but the fact that they even bothered to fund an investigation
Means that bureaucracy did its job and someone got paid to do pointless work. They will never approve this thing for anything space related. Half of the C standard is not approved for normal automotive use via MISRA-C. I can't see a dynamic language getting anywhere close to space stuff.

There are already cubesats using commodity hardware and software.
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11269
  • Country: us
    • Personal site
Re: Micropython for MCUs - is it real or experimental ?
« Reply #28 on: December 22, 2016, 04:33:10 am »
There are already cubesats using commodity hardware and software.
You are mixing toys that live 2 weeks and cost next to nothing with long-term missions. The former can be made of anything; it does not matter. The latter will never run Python of any kind for mission-critical stuff.
Alex
 

Offline e100

  • Frequent Contributor
  • **
  • Posts: 567
Re: Micropython for MCUs - is it real or experimental ?
« Reply #29 on: December 22, 2016, 04:45:17 am »
There are already cubesats using commodity hardware and software.
You are mixing toys that live 2 weeks and cost next to nothing with long-term missions. The former can be made of anything; it does not matter. The latter will never run Python of any kind for mission-critical stuff.

The current launch cost of a cubesat is about $40,000.
I would say that some of the organisations flying cubesats would be offended if you described their projects as toys.
« Last Edit: December 22, 2016, 04:46:48 am by e100 »
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11269
  • Country: us
    • Personal site
Re: Micropython for MCUs - is it real or experimental ?
« Reply #30 on: December 22, 2016, 04:47:04 am »
The current launch cost of a cubesat is about $40000.
And the Curiosity rover costs $2.5 Billion.

And I'm pretty sure they are overpaying at that price.

I would say that some of the organisations flying cubesats would be offended if you described their projects as toys.
Their problem, not mine.
Alex
 

Offline rob77

  • Super Contributor
  • ***
  • Posts: 2085
  • Country: sk
Re: Micropython for MCUs - is it real or experimental ?
« Reply #31 on: December 22, 2016, 11:24:12 am »
The current launch cost of a cubesat is about $40000.
And the Curiosity rover costs $2.5 Billion.

And I'm pretty sure they are overpaying at that price.

I would say that some of the organisations flying cubesats would be offended if you described their projects as toys.
Their problem, not mine.

man, you're stupidly biased against micropython and will say any kind of bullshit just to "be right"  :-DD who the hell suggested micropython for a $2.5 billion project? you're mixing things up just to "hold your ground" and ignoring the fact that micropython might be, and actually IS, useful for many small projects.
 

Offline ehughes

  • Frequent Contributor
  • **
  • Posts: 409
  • Country: us
Re: Micropython for MCUs - is it real or experimental ?
« Reply #32 on: December 22, 2016, 03:47:20 pm »
It is academically interesting but kinda useless for anything real. I honestly can't see the benefit of it. All of the examples are doing such trivial things that they are easy to do in a real language. It has to be stripped down to a level where you lose all of the benefits of a dynamic language. It is a lot of code just to be able to slowly twiddle bits on a 32-bit processor. And then you really don't get that much space to run code. You could run a small Cortex-A5 and get real Python.

For space you can't just willy-nilly change code and see what happens. That is poor engineering discipline. If you need the ability to change code that much in flight, there's a serious problem on the design side.

.....That and I am highly offended by the significance of white space in code. It is literally the dumbest thing you could possibly do in a language design. Invisible things should never, ever control execution flow. The whole "it makes nice looking code" argument is really ridiculous given that I can run things through astyle. I have literally witnessed engineers puzzling over a mistaken indent causing function problems. This may work for hipster web programmers, but I have yet to meet someone who does serious embedded work who finds this acceptable.

Having a dynamic scripting language is a nice tool, but Python is a bad choice for embedded. That and I really like Julia now for my scripts... It actually runs at a reasonable rate and provides advantages over C for high-end computing.

 

Offline rob77

  • Super Contributor
  • ***
  • Posts: 2085
  • Country: sk
Re: Micropython for MCUs - is it real or experimental ?
« Reply #33 on: December 22, 2016, 04:31:21 pm »
It is academically interesting but kinda useless for anything real. I honestly can't see the benefit of it. All of the examples are doing such trivial things that they are easy to do in a real language.

ok first please define "real language"...

and once again, one single example... the Espressif ESP8266 - millions of chips produced, so don't tell me it's not real. and now show me the advantages of maintaining a toolchain, writing code in a "real" language and reading many pages of datasheets to make the hardware work in your code in your "real" language (wifi, anyone?  >:D ).

a further advantage is very easy and risk-free remote firmware upgrade... you're changing your high-level code (in fact a single file on the internal filesystem) and not reflashing the OS/runtime.

and once again, and slowly... no-one is suggesting using micropython in time-critical applications or complex projects...
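The file-swap upgrade described above can be sketched in ordinary Python. The helper name `install_update` and its arguments are made up for illustration; on a real MicroPython board the same pattern means overwriting e.g. /main.py over the network and resetting, without reflashing the interpreter itself:

```python
import importlib
import pathlib
import sys
import tempfile

sys.dont_write_bytecode = True   # keep the demo free of stale .pyc caches

def install_update(module_name, new_source, app_dir):
    """Write the new script to the filesystem and (re)load it.

    Hypothetical helper: on a board you would overwrite the script
    over wifi and reset instead of reloading in-process.
    """
    path = pathlib.Path(app_dir) / (module_name + ".py")
    path.write_text(new_source)              # replace the application script
    if app_dir not in sys.path:
        sys.path.insert(0, app_dir)
    if module_name in sys.modules:           # already running: reload in place
        return importlib.reload(sys.modules[module_name])
    return importlib.import_module(module_name)

# Demo: "upgrade" the app from v1 to v2 without touching the runtime.
appdir = tempfile.mkdtemp()
app = install_update("app", "VERSION = 1\n", appdir)
print(app.VERSION)   # -> 1
app = install_update("app", "VERSION = 2\n", appdir)
print(app.VERSION)   # -> 2
```

The point of the sketch is that the "firmware" being replaced is just a text file; the runtime underneath is never touched.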
 

Offline Bruce Abbott

  • Frequent Contributor
  • **
  • Posts: 627
  • Country: nz
    • Bruce Abbott's R/C Models and Electronics
Re: Micropython for MCUs - is it real or experimental ?
« Reply #34 on: December 22, 2016, 05:13:19 pm »
Espressif ESP8266 - millions of chips produced, so don't tell me it's not real.
ESP8266 is not an MCU.
 

Offline rob77

  • Super Contributor
  • ***
  • Posts: 2085
  • Country: sk
Re: Micropython for MCUs - is it real or experimental ?
« Reply #35 on: December 22, 2016, 05:42:39 pm »
Espressif ESP8266 - millions of chips produced, so don't tell me it's not real.
ESP8266 is not an MCU.

and what is it then? it's a 32-bit MCU accompanied by a wifi chip.
 

Offline Bruce Abbott

  • Frequent Contributor
  • **
  • Posts: 627
  • Country: nz
    • Bruce Abbott's R/C Models and Electronics
Re: Micropython for MCUs - is it real or experimental ?
« Reply #36 on: December 22, 2016, 06:00:11 pm »
it's a 32bit MCU accompanied with a wifi chip.
and 512k of external ROM containing the firmware that makes it an ESP8266. It is no more an MCU than a Raspberry Pi, Teensy or Arduino is an MCU.



 
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11269
  • Country: us
    • Personal site
Re: Micropython for MCUs - is it real or experimental ?
« Reply #37 on: December 22, 2016, 06:02:27 pm »
that makes it an ESP8266
ESP8266 is a chip. It does need external components to operate, but the name ESP8266 belongs to a chip. The modules have different names.
Alex
 

Offline dferyance

  • Regular Contributor
  • *
  • Posts: 181
Re: Micropython for MCUs - is it real or experimental ?
« Reply #38 on: December 22, 2016, 06:30:36 pm »
The nice thing is that micropython comes with a full blown networking stack , encryption and the whole hoopla. All the annoying crap nobody wants to deal with is already handled. you don't need to muck with libraries.
....

But all you described was exactly that: libraries. A networking stack is a library, encryption is a library, wifi is a library. They might be great and easy-to-use libraries, and that has merit, but none of that is unique to Python. Having lots of pre-built, easy-to-use libraries is a large part of the appeal of Arduino, and that uses C++.

I like toy projects like this though. It is a fun challenge and cool to tinker with. But let's be realistic: this is making a major trade-off just to be able to use Python code, and that isn't worth it. As long as it is a fun project to play with, there is nothing wrong with it. The mistake is when it is used as a crutch by people who refuse to learn C or anything new to them.
 

Offline free_electron

  • Super Contributor
  • ***
  • Posts: 8517
  • Country: us
    • SiliconValleyGarage
Re: Micropython for MCUs - is it real or experimental ?
« Reply #39 on: December 22, 2016, 06:33:35 pm »
i can do it with an Espressif ESP32 ( 180 MHz SOC with wifi and bluetooth embedded ) and a Bosch BME280 environmental sensor. hardware cost : 4 to 5$
IDK, but I have trouble sourcing the ESP32 for the moment.
It will run this micropython? Out of the box?
yup. Micropython is released for the ESP32, and wifi/BLE are fully working.

Wipy module :   https://www.pycom.io/solutions/py-boards/wipy2/

Professional Electron Wrangler.
Any comments, or points of view expressed, are my own and not endorsed , induced or compensated by my employer(s).
 

Offline rob77

  • Super Contributor
  • ***
  • Posts: 2085
  • Country: sk
Re: Micropython for MCUs - is it real or experimental ?
« Reply #40 on: December 22, 2016, 06:34:45 pm »
it's a 32bit MCU accompanied with a wifi chip.
and 512k of external ROM containing the firmware that makes it an ESP8266. It is no more an MCU than Raspberry Pi, Teensy or Arduino is an MCU.

raspberry pi is a full-blown computer (the CPU must be equipped with an MMU to fall into that category)
arduino is a board + ecosystem built around the ATmega MCU
Teensy is similar to arduino in terms of definition (dev board + ecosystem built around an MCU)

esp8266 is the name of both the popular wifi module and the chip itself... so it's not the external SPI flash that makes it the esp8266... the external SPI flash holds your firmware.

feature list of that esp8266 chip.
Quote
32-bit RISC CPU: Tensilica Xtensa LX106 running at 80 MHz*
64 KiB of instruction RAM, 96 KiB of data RAM
External QSPI flash - 512 KiB to 4 MiB* (up to 16 MiB is supported)
IEEE 802.11 b/g/n Wi-Fi
Integrated TR switch, balun, LNA, power amplifier and matching network
WEP or WPA/WPA2 authentication, or open networks
16 GPIO pins
SPI, I²C, I²S interfaces with DMA (sharing pins with GPIO)
UART on dedicated pins, plus a transmit-only UART can be enabled on GPIO2
1 10-bit ADC

btw... what do you call a CPU accompanied by memory and IO on the same chip? wait... wasn't that an MCU? ok, the flash is external, but the chip has both instruction and data memory built in... so it's damn close if not spot on ;)


 

Offline dferyance

  • Regular Contributor
  • *
  • Posts: 181
Re: Micropython for MCUs - is it real or experimental ?
« Reply #41 on: December 22, 2016, 06:42:59 pm »
.....That and I am highly offended by the significance of white space in code. It is literally the dumbest thing you could possibly do in a language design. Invisible things should never, ever control execution flow. The whole "it makes nice looking code" argument is really ridiculous given that I can run things through astyle. I have literally witnessed engineers puzzling over a mistaken indent causing function problems. This may work for hipster web programmers, but I have yet to meet someone who does serious embedded work who finds this acceptable.

Having programmed in Haskell and F# before Python, I thought I would love the way Python has significant whitespace - I don't; it is bad. In those other languages, especially Haskell, you don't run into whitespace bugs -- your code just won't compile if you mess it up. It has to do with the way the code is written in a purely functional style. When writing in a dynamically typed, procedural language, if half your procedure is missing, or in the wrong block, Python will appear to handle it fine but just do the wrong thing. Also, you end up with much larger functions than in Haskell, which makes tab alignment hard.
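A minimal illustration of that failure mode (ordinary Python, nothing MicroPython-specific): shift one statement a single indent level and it silently runs after the loop instead of inside it, with no error of any kind:

```python
def total_correct(values):
    total = 0
    for v in values:
        total += v          # inside the loop: sums every element
    return total

def total_buggy(values):
    total = 0
    for v in values:
        pass
    total += v              # one indent off: runs once, after the loop,
    return total            # using whatever v leaked out of the loop

print(total_correct([1, 2, 3]))  # -> 6
print(total_buggy([1, 2, 3]))    # -> 3  (silently wrong, no exception)
```

A compiler for a brace-delimited language would at least force the programmer to say which block the statement belongs to; here the interpreter happily accepts both versions.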

So my point is, don't totally reject significant whitespace. Wait until you've tried it in Haskell. If you still don't like it after that, fine; that is a valid opinion. But don't let Python scare you away.

I currently have many co-workers who love Python, but having tried a wide variety of languages, I really don't get the appeal. I put it on par with JavaScript: fine for simple scripting, but nothing to get excited about. Just chalk it up to some of the mistakes of starting out. Once people have worked in several different fields and on very different kinds of systems and technology, their opinions change -- or should change. Embedded programmers sometimes make the same mistakes, just on different things.
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Micropython for MCUs - is it real or experimental ?
« Reply #42 on: December 22, 2016, 06:57:52 pm »
Sticking Python-generated intermediate language code directly into those chips will not work. It cannot be interpreted on such a small chip; there simply is not enough room for an interpreter.

The code has to be compiled into machine code, thus eliminating any slowdowns that you get when running intermediate code through an interpreter.

A user could actually use whatever language he/she wishes to code a microchip, as long as that code is translated into machine code compatible with that particular chip.

This means one could write in Assembly, C, C++, C#, Pascal, Java, JavaScript, Python... and the list goes on and on... it could even be programmed in XML.

So, back to answering the original question... If you get (understand) Python better than C or any other programming language, then go with that language. But if you want a huge collection of support libraries (ready-made code), then go for C (though Python can use C libraries). If you understand the internal workings of a chip and how it "thinks" (registers etc.), then you can even code it in Assembly (the way chips were programmed and optimized in the early days).

Hope this helps!
« Last Edit: December 22, 2016, 07:00:16 pm by slicendice »
 

Offline free_electron

  • Super Contributor
  • ***
  • Posts: 8517
  • Country: us
    • SiliconValleyGarage
Re: Micropython for MCUs - is it real or experimental ?
« Reply #43 on: December 22, 2016, 06:59:29 pm »
The nice thing is that micropython comes with a full blown networking stack , encryption and the whole hoopla. All the annoying crap nobody wants to deal with is already handled. you don't need to muck with libraries.

But surely at some point - SOMEONE

Of course, but with MicroPython it has already been done. That benefits me directly. Too many devkits come with a massive lump of code you have to dig through to get them running. I have not seen a single devkit that comes as a turnkey system. It takes several days to understand the toolchain. MBED is getting there, but it still is a royal pain in the patooie sometimes.

Quote
Compare to freeRTOS which also gives you this stuff, with its own API for the features that will be about the same as the micropython hardware API. except freeRTOS is years older, and so years more developed, with support all over the place for numerous chips (actively supported and encouraged for use by most cortex M manufacturers!) And if you have a problem with FreeRTOS, it's all there in your project to dig into and look at if needed
i don't want to dig in. i have an application to write. i do not want to learn about operating systems, have to find a library that handles tcp/ip or ble transport, find a library that can handle json and encryption, and figure out how to make all these things work nicely together. i am a visual basic programmer: meaning i am used to a rich set of instructions provided by the development tool. in a few lines of code i can do what i want, as the 'operating system' already provides everything.

I have seen people try to develop an embedded module that configures an i2c slave properly, gets data bytes out, formats them as human-readable text, and gets the data loaded somewhere on a server... using the regular tools. it takes days and the learning curve is very steep.



do the same thing using an electric-imp or a laird BLE600 and it is peanuts. why? because all the heavy lifting is already done. simply do something like this:


while 1
   if handle = void then
      handle = connect(gateway, hostname, username, key)
      if handle = void then toggle_red_led
   else
      red_led = off
      data = readsensor   <- i will write the code for 'readsensor'
      transmit(handle, jsonformat(data, formatstring))
      toggle_blue_led
   end if
end while


or something along those lines.
this checks for a connection; if none exists, open it. if that fails, blink the red led. if it works, read the sensor data, format it and send it, and blink the blue led.

how much more complicated do you want it ?
this is the problem with all these iot things: i have a ton of ideas; unfortunately my coding skills range from none to very mediocre. i need simple stuff. i can hack an i2c and spi bus with the best of them, but i have no clue how to write the underlying code to set up a ble or wifi link to a server and get data there. that 'heavy lifting' needs to be done for me.

Any microcontroller manufacturer that wants to win the market needs to provide the fabric to do that (that is what espressif is doing). People simply don't want to deal with that complicated stuff. It is time consuming, error prone and reinventing the wheel. The silicon itself is worthless these days. Any 32-bit machine can be had for a dollar or two. I don't care where i buy it from. My selection criterion is: what is MY time to market? And that is heavily dependent on the tools/examples/code provided and how easy it is to pick up and work with.

Quote
you can use these parts with a C compiler, too.
even in assembly, or using a hex editor. but why would you want to torture yourself and go through the endless compile, download, reset, doesn't work, alter, compile again, flash again cycle?
with an interpreter i can alter a few lines and directly run. i can scatter debug code left and right and then simply comment it out. i can get a command prompt and execute a few lines interactively.
Having the capability to just type in a few commands and see an immediate response is a dream for developing code.


Quote
OK, now using serial port debugging from inside your python application,
the interactive prompt. simply type i2c_send_start <enter> and watch the SDA and data lines toggle.
nothing beats the capability to call functions from a prompt (or jiggle the pins interactively). once you figure out how to do it, you can craft the routine properly.

Quote
you try tracing what happens when the I2C driver compiled into the executive
not my problem. the executive is supposed to work.

Quote
the memory allocation code
not my issue. executive issue .

Quote
Or the way the generic OS sleep
not my issue ..

I split my code from what is provided and assumed to work properly. Gone are the days where you develop everything yourself, from scratch. Whoever gets me the richest, easiest-to-use platform at the lowest price will win my business.

Quote
And having all the relevant code for even looking at what might actually be happening under your application locked away
Micropython is fully open source. you can dig as deep as you want, down to bare metal and datasheets.

anyway, my mindset is different. i have something i want to do and it needs to send its data to a webserver for interaction. i want this over an encrypted link. i don't want to learn about networking, bluetooth and all this other kerfuffle. just give me a function call to open the connection, one to send and one to receive. i will do the rest.
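That "three calls and I'll do the rest" API can at least be sketched with nothing but the Python standard library. The names open_link/send_msg/recv_msg and the newline framing are invented for illustration; a real deployment would wrap the socket with the ssl module to get the encrypted link, and here a tiny local echo server stands in for the gateway:

```python
import socket
import threading

# Hypothetical three-call API of the kind the post asks for.
def open_link(host, port):
    return socket.create_connection((host, port)).makefile("rwb")

def send_msg(link, text):
    link.write(text.encode() + b"\n")   # newline-delimited framing (invented)
    link.flush()

def recv_msg(link):
    return link.readline().decode().rstrip("\n")

# Local echo server standing in for the real gateway.
srv = socket.create_server(("127.0.0.1", 0))

def echo_once():
    conn, _ = srv.accept()
    f = conn.makefile("rwb")
    f.write(f.readline())               # echo one message back
    f.flush()
    conn.close()

threading.Thread(target=echo_once, daemon=True).start()

link = open_link("127.0.0.1", srv.getsockname()[1])
send_msg(link, '{"temp": 21.5}')
reply = recv_msg(link)
print(reply)   # -> {"temp": 21.5}
```

The application code only ever sees the three calls; where the bytes go and how the link is secured stays the platform's problem, which is exactly the division of labour argued for above.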
Professional Electron Wrangler.
Any comments, or points of view expressed, are my own and not endorsed , induced or compensated by my employer(s).
 

Offline free_electron

  • Super Contributor
  • ***
  • Posts: 8517
  • Country: us
    • SiliconValleyGarage
Re: Micropython for MCUs - is it real or experimental ?
« Reply #44 on: December 22, 2016, 07:08:36 pm »
Sticking Python generated intermediate language code directly in to those chips will not work. It can not be interpreted in such a small chip,
Works perfectly fine. a 'small chip' these days is a 32-bit ARM F4 running at 180 MHz... that is serious horsepower. Home computers in the '80s ran on 8 bits at 2 MHz and could do serious stuff. An old 8051 ran a full-blown BASIC interpreter with floating point in 8 k of rom and 256 bytes of ram. that $3 ESP32 chip has a dual-core 32-bit cpu running at 180 MHz with 512k of ram... one core is used for ble/wifi, the other is yours to do what you want with...
the micropython engine is barely 60k. that leaves 400k for the user program... plenty of room for something that will sit somewhere in a lightswitch or lightbulb. wanna make your own RGB led lightbulb that can be controlled from your smartphone? a $3 chip and 10 lines of code. bye arduino and rpi, you are too expensive, too big, too cumbersome.
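As a sketch of what those few lines might look like on an ESP32 running MicroPython: the pin numbers, the port and the one-line "R,G,B" text protocol are all invented here, and the hardware part is guarded so the parsing logic also runs on desktop Python:

```python
def parse_rgb(line):
    """'255,128,0' -> per-channel 10-bit PWM duty values (0-1023)."""
    r, g, b = (max(0, min(255, int(x))) for x in line.split(","))
    return tuple(v * 1023 // 255 for v in (r, g, b))

try:
    from machine import Pin, PWM    # only exists on MicroPython
    import socket

    pwms = [PWM(Pin(n), freq=1000) for n in (25, 26, 27)]  # hypothetical pins
    srv = socket.socket()
    srv.bind(("0.0.0.0", 8080))
    srv.listen(1)
    while True:                      # one "R,G,B" command per connection
        conn, _ = srv.accept()
        for pwm, duty in zip(pwms, parse_rgb(conn.recv(64).decode())):
            pwm.duty(duty)
        conn.close()
except ImportError:
    pass   # desktop Python: no hardware, but the parser still runs

print(parse_rgb("255,128,0"))   # -> (1023, 513, 0)
```

Whether that counts as "10 lines" is debatable, but the whole bulb application really is one short script sitting on top of the stock firmware.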

Professional Electron Wrangler.
Any comments, or points of view expressed, are my own and not endorsed , induced or compensated by my employer(s).
 
The following users thanked this post: thm_w

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Micropython for MCUs - is it real or experimental ?
« Reply #45 on: December 22, 2016, 08:29:01 pm »
 :-DD

Why on earth would I want to use a chip with that amount of horsepower just to turn on a light bulb?
Why would I waste 60k of ram in a chip if I can do the same stuff in just a few hundred bytes?
Why would I halve the performance of the chip by using an interpreter as unnecessary overhead?

Switching on a light bulb over WiFi can be written in a couple of lines of code in the Arduino environment. Python leverages a lot of ready-made C and Assembler libraries, skipping the interpreter where performance counts. I don't know how Micropython is implemented or what chips it supports, so I won't argue on that.

I must agree that a cheap chip in small form factor and with a lot of horsepower is appealing though.

What I don't like or agree with is the thinking that one should waste resources just because one can. This sort of thinking ends up in products like M$ Windows and similar, which consume way too many resources while doing nothing. (in Win10 they have actually tried to work on this for the first time since the first NT-based release)
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5319
  • Country: gb
Re: Micropython for MCUs - is it real or experimental ?
« Reply #46 on: December 22, 2016, 09:18:42 pm »
but the fact that they even bothered to fund an investigation
Means that bureaucracy did its job and someone got paid to do pointless work. They will never approve this thing for anything space related. Half of the C standard is not approved for normal automotive use via MISRA-C. I can't see a dynamic language getting anywhere close to space stuff.

There are already cubesats using commodity hardware and software.

Indeed. But they are also very heavily modified. I've worked directly with, and consulted for, several cubesat teams, and while there is a limited amount of off-the-shelf stuff in there, none of it is turnkey by any stretch of the imagination. Even with the off-the-shelf components, the whole thing has to be integrated.

If Micropython is seen to be fit for purpose for some things then so be it, but the power budget on these spacecraft is typically very limited; for a 1U it's about 1W. Pissing your power budget away on a housekeeping unit built with interpreted code (I am assuming it's interpreted) isn't what I'd recommend as a first shot, but if the team lacks embedded expertise and can make it work, then so be it.

Usually, in my experience, it's not the embedded expertise that's lacking; it's RF and modulation/coding where there's usually a dearth of knowledge.

I have nothing against Micropython, but like any language it is not a universal panacea.

Edit: I see it's compiled, which is definitely a step in the right direction, but the devil's in the detail: there will need to be a HAL of some description written, and I'd anticipate its performance in DSP won't be stellar. How does it deal with data structures and GC, for example? How deterministic is it? In real-time systems, these are crucial considerations.
« Last Edit: December 22, 2016, 09:47:08 pm by Howardlong »
 

Offline free_electron

  • Super Contributor
  • ***
  • Posts: 8517
  • Country: us
    • SiliconValleyGarage
Re: Micropython for MCUs - is it real or experimental ?
« Reply #47 on: December 22, 2016, 09:39:29 pm »
:-DD

Why on earth would I want to use a chip with that amount of horsepower just to turn on a light bulb?
Why would I waste 60k of ram in a chip if I can do same stuff in just a few hundred bytes?
Why would I halve the performance of the chip by using an interpreter as unnecessary overhead?

Switching on a light bulb over WiFi can be written in a couple of lines of code in the Arduino environment.

your atmel chip alone costs more than the esp32, and it does not have wifi and ble... you need external stuff for that.. which costs more money...
does it matter if your lightbulb turns on 20 microseconds faster than mine? mine is on the market; you are still getting yours to work...
the esp has 512K of ram, so what if i use 60k?


Professional Electron Wrangler.
Any comments, or points of view expressed, are my own and not endorsed , induced or compensated by my employer(s).
 

Offline tszaboo

  • Super Contributor
  • ***
  • Posts: 7392
  • Country: nl
  • Current job: ATEX product design
Re: Micropython for MCUs - is it real or experimental ?
« Reply #48 on: December 22, 2016, 10:32:51 pm »
i can do it with an Espressif ESP32 ( 180 MHz SOC with wifi and bluetooth embedded ) and a Bosch BME280 environmental sensor. hardware cost : 4 to 5$
IDK, but I have trouble sourcing the ESP32 for the moment.
It will run this micropython? Out of the box?
yup. Micropython is released for the ESP32, and wifi/BLE are fully working.

Wipy module :   https://www.pycom.io/solutions/py-boards/wipy2/
I was asking about a trusted source where I can buy 1K PCS of the modules. I was really cautious about the ESP8266 too, but then Avnet (well, the local Avnet supply-chain child company) started supplying it.
I actually asked them about the ESP32; no answer.
Anyway, I've used python for IoT-ish stuff. I had big reservations, but it actually worked out quite well in the end. Say, a hundred lines of code. The programmer guy reviewing my code told me I do too much error handling...
I'm kinda starting to like the idea...
 

Offline slicendice

  • Frequent Contributor
  • **
  • Posts: 365
  • Country: fi
Re: Micropython for MCUs - is it real or experimental ?
« Reply #49 on: December 22, 2016, 11:21:16 pm »
Anyway, I've used python for IoT-ish stuff. I had big reservations, but it actually worked out quite well in the end. Say, a hundred lines of code. The programmer guy reviewing my code told me I do too much error handling...
I'm kinda starting to like the idea...

That's great! Personally I don't like Python at all; C makes so much more sense to me as a language. I've been studying quite a lot of math libraries, and I found that everybody was complaining that Python is really slow compared to pure C. It's much easier to slap together a complicated formula in Python, but C gives the answer much faster. And that makes sense to me, as there is virtually no overhead in C.
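The interpreter-overhead point can be made concrete with a toy measurement in plain CPython (N and the repeat count here are arbitrary): the same arithmetic is done once by a bytecode-interpreted loop and once by the C-implemented builtin `sum`, and the ratio between the two timings is the overhead being complained about:

```python
import timeit

N = 100_000

def py_sum(n):
    # every iteration of this loop goes through the bytecode interpreter
    total = 0
    for i in range(n):
        total += i
    return total

loop_t = timeit.timeit(lambda: py_sum(N), number=20)
builtin_t = timeit.timeit(lambda: sum(range(N)), number=20)   # loop runs in C

assert py_sum(N) == sum(range(N)) == N * (N - 1) // 2   # identical answers
print("interpreted loop is ~%.0fx slower than the C loop" % (loop_t / builtin_t))
```

The exact ratio varies by machine and Python version, which is why it is computed rather than quoted; the qualitative gap is what matters for the argument.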

Using Python on a chip that has room for it is alright, if the performance drain is no issue. Helps completing stuff a lot faster.

C is really easy to learn, and it is really easy to jump to C++, Java and C# after learning basic ANSI C. But making more complex projects in C requires a good set of ready-made libraries, or really good knowledge of how microcontrollers or processors work. Coding in pure C can become very time consuming in the long run compared to higher-level languages. But that is nothing compared to Assembly language, where you have to control every bit and byte moving in a microcontroller.

If you want to be in control of your chip programming and don't want to be dependent on too many ready-made libraries, or be limited to just a few chips, then go for Assembly and C, where C is the better choice and Assembly is worthwhile learning. If C is too hard to learn, then I'd say programming is not your thing, and you should consult a pro or at least someone with basic programming skills.

If you are only going to program a handful of chips and they all have full Java or Python support (ready-made libraries for those chips), but no direct C support, then go for those languages. You can always switch to C later on the same chips.
 

