Snip
Tarzan's a special case. He can say, "Me Tarzan, you Jane. Tarzan banana. Tarzan sex Jane" and it's easy to figure out if you know Tarzan. But Tarzan can't build an airplane. Language alone limits him.
Actually, Tarzan did build (or rather, hired someone to build) himself a zeppelin from an unobtainium material, similar to titanium, that he found in a forbidden temple. Then he flew that ship to the pole of the Earth, where he found a hole into the inside of the Earth, a second world that he and others flew off to explore.
Tl;dr: read your Tarzan before making poor examples. XD
if ((retval = do_thing()) == 3)
{
    do_things_with(retval);
}
So what does that do?
do_thing() returns a number that gets compared to 3, so that should yield a boolean that in turn gets stuffed in retval.
The "if" clause only executes if retval is 'true'.
So you are always calling "do_things_with" passing a boolean that is high.
Assuming my analysis is right: why not simply write
if (do_thing() == 3) do_things_with(true);
What am I missing? Or what is not being explained?
-edit- Crud, missed the brackets. See what just happened?
refactor that to
retval = do_thing()
if retval == 3 do_things_with(retval)
or even better (since it only works if retval is 3)
if do_thing() == 3 do_things_with(3)
or even better
case x = do_thing()
    3: do_things_with(x)
endcase
As for the whole Tarzan thing: I am talking about programming languages. They can discriminate between assign and compare. There are languages that can do it (the apparently hated BASIC can do it, PL/1 or PL/I can do it, other languages can too).
It is not a "go to the store and buy an egg; if they have bread, bring two" problem ... two eggs if they have bread and one if they don't, or one egg and possibly two loaves of bread.
For the occasional programmer these are annoying little details. I can't remember how many times I have been chasing a bug in a C program I wrote that turned out to be a =/== issue, or having to fix a missing ; or }, or having to remember when I need the {}, when I can drop them, and where I need the ; and where not (now I just put them in whether required or not).
Having a good IDE that cleans up that stuff is extremely important for me. There is nothing more annoying than sitting there waiting for a program to compile and, after 5 minutes, having the compilation abort because a ; is missing...
I work differently. I want a language that matches my operating mode: interactive debugging. Stop at the error, let me examine/alter contents and code, and tell it where to continue (or what to redo).
... and if the condition isn't == 3 but, say, > 3, then your "replace-it-with-a-copy-pasted-literal" strategy and your "switch-case" strategy both break. In programming, the right question to ask is not "can I just make this work, with these exact values"; it's "what is the correct general pattern for doing this". Let the compiler work out the optimizations; let the compiler internally "copy-paste" the literal 3. You, as a programmer, communicate the intent.
Syntax isn't important. The important thing is to recognize what the correct underlying action is, and describe it with the language. In English, this could be: "If you have more than five bananas, give me all of your bananas except two which you can keep."
You can't describe this with a huge switch-case lookup table containing all possible numbers of bananas, or with a gazillion if-elses, either. You need to detect the pattern: checking the number of bananas, and using variables and arithmetic. Here, assignments and comparisons are both needed, they are different operations, and they can be combined with checking. Both in English and in C. You can of course add limitations; in English you can say "no sentences longer than 5 words". In BASIC you can say: no assignments inside IF clauses. If you forbid it, and everybody knows it's forbidden, then the compiler can assume the intent and is right 100% of the time.
But is this a good limitation?
The example I gave is very typical in C codebases, and the reason is simple: it communicates the right intent - store a value and act based on that value - and it removes copy-pasting of variable names or values.
Short-circuiting && and || are similar. If you just know the rules, they are powerful communication tools.
To really inexperienced programmers, these may feel like obfuscations and tricks or "hacks", I understand that. But the solution, really, is in learning, not in limiting the tools.
You can always dumb down the language, and you end up with BASIC. BASIC is Turing complete, meaning it can implement any program C can. But so is Brainfuck.
Too "basic" languages make programs appear longer, slower to write, slower to read, harder to understand. Higher-level languages require more initial learning (more syntax, more rules...), but then allows doing more complex, more advanced programs without wasting time working around the limitations. And quite frankly, C is, by modern standards, quite primitive. If you limit its power in any way, trying to make it "easier" for complete beginners, it becomes pretty much unusable.
We don't need more beginner-friendly languages. We need beginners who are ready to invest some time in learning. Mistakes are OK. Arrogance is the problem.
I'm thinking about the effort spent in school learning the grammatical rules of German (aus bei mit nach seit von zu; durch für gegen ohne um) and Swedish, without much use for either language because we communicate in English anyway. Compared to that, languages like C or Python are orders of magnitude easier to learn, even if you learn the whole standard by heart! Seeing how important programming is in modern societies, I'm surprised people want completely effortless learning, but are ready to spend years and years learning human languages, even just for fun.
Short-circuiting && and || are similar. If you just know the rules, they are powerful communication tools.
To really inexperienced programmers, these may feel like obfuscations and tricks or "hacks", I understand that. But the solution, really, is in learning, not in limiting the tools.
If I'm not terribly wrong, that mnemonic for crowbarring logic chains (for a speed gain) is language-specific. There is no such mnemonic in, e.g., relay programs. People tend to go religious about C(++), and it is kind of odd. The most disturbing part is that the (internet) world is biased towards C (and its derivatives) almost like a religion ... and it is not a new thing.
Also, most programming languages are super simple to learn. C has a handful of reserved words, a few symbols, a few closure rules, a few data-I/O rules and a few primitive data types. What is actually hard is the standard libraries, and especially everything beyond the std libs, which often do not follow a common logic between each other (again, not language-specific). Also, in some languages like C the toolchains are more than a PITA; or, with Python, deciding which GUI toolkit to use.
Then there is the whole systems-knowledge layer, which professional PC/OS programmers seem to take for granted as part of the language (all the POSIX etc. talk). Actually this doesn't relate logically to the programming language at all (outside of maybe assembly); it is part of scope and systems knowledge (OS vs. bare metal, for example). Also, the much-touted memory-management and heap talk drops into this systems category more or less (e.g., stack-based CPUs).
Also, for a programming language independent of hardware or implementation (as a common way to describe intended actions and information), there is a major difference from natural languages: while a natural language has one and only one target, another human, a programming language always has two targets, machine and human. This is one of many reasons why programming languages do not resemble an essay in a natural language; it is also the reason almost no one writes assembly, and why there is still a search for the sweet spot between natural and machine language.
Then there is a whole world of hybrid programming languages for machines, e.g. graphical languages (compare to the full natural languages that blind or deaf communities use), which can be much, much more expressive than, say, C for complex logic networks.
If one knows only C, all problems look like memory management. ... I took the liberty of coining this custom proverb, since hammers also come in a multitude of shapes, many of which are used on something other than nails.
Just my $0.50; whoever doesn't agree is free to disagree.
If one knows only C, all problems look like memory management.
Not for those of us who use the Boehm GC library in our C programs :-)
Also, I've never seen anyone more obsessed with memory management than Rust programmers.
Also, I've never seen anyone more obsessed with memory management than Rust programmers.
To be fair, that's kind of the point of Rust.
Also, I've never seen anyone more obsessed with memory management than Rust programmers.
To be fair, that's kind of the point of Rust.
Yes. A peculiar kind of self-flagellation.
Modern C++ is just about as bad with unique_ptr and shared_ptr littering the code everywhere.
There are times to think about memory management, and there are times to think about what you're actually trying to achieve, and they shouldn't be intertwined everywhere like a vine choking an apple tree.
I write mostly C and spend maybe 3.14% of time thinking about memory management. Sure, it's a thing in C, but if it seems overwhelming, you are likely doing something wrong.
Of course there are some very specific, challenging edge cases where memory management needs more thought than that, but then again, for those a pre-engineered one-size-fits-all solution wouldn't be suitable. With power and control comes capability.
With Python, this is all uninteresting because Python is all about running code written by others in C, so memory management is also taken care of, in C. But as always with code written by others, you are limited to the functionality of that code.
With Python, this is all uninteresting because Python is all about running code written by others in C, so memory management is also taken care of, in C. But as always with code written by others, you are limited to the functionality of that code.
That is so when you use numpy or other similar package.
But this is not the case in web applications or other similar ones, which make intensive use of Python code.
With Python, this is all uninteresting because Python is all about running code written by others in C, so memory management is also taken care of, in C. But as always with code written by others, you are limited to the functionality of that code.
That is so when you use numpy or other similar package.
But this is not the case in web applications or other similar ones, which make intensive use of Python code.
Which then is interpreted, by code written in C...
Which then is interpreted, by code written in C...
Yes, the key here is that the Python part needs to be as high-level as possible; in other words, utilizing the "C parts" as much as possible.
This is wise; it's co-operation, some bright-minded people made good libraries, and you can leverage this work in higher level language. It's like using a switch mode regulator IC, say one with internal compensation and all. You don't need to know how it works, just instantiate the chip from Digikey, copy the datasheet example and modify whenever needed.
This isn't usually a problem, until one day you need something new and special that can't be combined solely out of large building blocks. Sometimes people try to build large programs "from scratch" in Python, i.e., something with completely home-made algorithms, I/O, etc. Poor performance may come as a real surprise to those who have read "Python is not slow" articles benchmarking typical code that relies on libraries as much as possible.
C is fast, but Python is not slow. Neither is PHP.
This forum is written in PHP and web speed matters. Why hasn't the forum been programmed in c?
Because in this case what matters is the speed and flexibility of development, and interpreted languages win in that field.
Don't really need to eke out latency and CPU savings for the average PHP forum or WordPress website. It's not a relevant cost.
Though I think, say, reddit could have saved a lot of money in reduced server costs by going with Go or Java instead of Python. Maybe even C#, though that would reduce the pool of developers with many years of relevant experience, which would be a huge problem for HR.
C is fast, but Python is not slow. Neither is PHP.
This forum is written in PHP and web speed matters. Why hasn't the forum been programmed in c?
Because in this case what matters is the speed and flexibility of development, and interpreted languages win in that field.
... as long as throwing more CPU power and RAM at Python/PHP/etc. is cheaper than writing code in C. However, IT's CO₂ footprint is becoming a topic.
C is fast, but Python is not slow.
Yes, yes it is.
#include <stdio.h>

int main(int argc, char const *argv[])
{
    int x = 0;
    printf("starting\n");
    for (int i = 0; i < 1000000; i++)
        x = x + 1;
    printf("stopping\n");
    return 0;
}
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
print("starting")
x = 1
for i in range(1000000):
    x = x + 1
print("stopping")
We run both twice to ensure that they are on an even footing, not needing to service page faults for runtime binaries and the like (this will be to Python's advantage; it has a large runtime to load before it can do anything).
cerebus@shu:~/Desktop$ gcc -O0 -o Loop Loop.c <- note optimisation turned off
cerebus@shu:~/Desktop$ ./Loop
starting
stopping
cerebus@shu:~/Desktop$ time ./Loop
starting
stopping
real 0m0.007s
user 0m0.004s
sys 0m0.002s
cerebus@shu:~/Desktop$ ./Loop.py
starting
stopping
cerebus@shu:~/Desktop$ time ./Loop.py
starting
stopping
real 0m0.130s
user 0m0.113s
sys 0m0.012s
cerebus@shu:~/Desktop$
C - 4ms, Python - 113ms. Python manages a mere 3.5% of the performance of C for the most trivial task possible.
One can make many claims for Python, but "not slow" is not one of them. I have nothing against Python*, when used appropriately. I've made plenty of money out of writing Python for other people and would do so again if Python is appropriate for the task at hand. I've used Python to hack out SCPI control and graphing programs for my own electronics messing about and will do so again.
I'm fine with using Python when it's the right tool to choose. What I'm not fine with is making stupid claims like "not slow" just to support a fanboy position. Professionals choose the appropriate tool for the job at hand. If one needs speed, one does not choose Python. Just because a lot of bodgers have put screws in with a hammer is not an excuse to not use a screwdriver. Similarly, just because a bunch of people who should have known better have done something in a particular programming language where another would have been more appropriate is not an excuse to repeat the mistake.
Almost invariably, the people who get hung up on advocating some particular programming language for every task under the sun, and make excuses for (or simply deny) its faults, only know that language. When all you have is a hammer, everything looks like a nail.
*Except for the dumb "space is significant" thing. That sin can never be forgiven.
I reckon you might get around 5% of the performance of C if you were to stick the Python code inside a function rather than executing it globally. Given that present-day hardware is about four orders of magnitude faster than what we used to throw at problems, some find Python is not all that slow.
C - 4ms, Python - 113ms. Python manages a mere 3.5% of the performance of C for the most trivial task possible.
113 milliseconds is not slow. At least for me.
Edit:
I also don't consider slow the 10 seconds it can take for the Python-Sphinx code to generate my web page with the changes introduced during half an hour of typing.
With a program written in C, the web generation might take 0.5 seconds instead of 10 seconds, but who cares? 10 seconds is not slow for me.
Also I don't know of any popular static website generator written in C, as flexible as Sphinx.
C - 4ms, Python - 113ms. Python manages a mere 3.5% of the performance of C for the most trivial task possible.
113 milliseconds is not slow. At least for me.
Edit:
I also don't consider slow the 10 seconds it can take for the Python-Sphinx code to generate my web page with the changes introduced during half an hour of typing.
With a program written in C, the web generation might take 0.5 seconds instead of 10 seconds, but who cares? 10 seconds is not slow for me.
Also I don't know of any popular static website generator written in C, as flexible as Sphinx.
This here above is the fanboy speaking.
What matters not to you matters a lot to someone else. And, yes, I've spent much more than 113 ms writing this (OTOH it took barely 4 ms to realise you missed the point, by 1.6 km), so 113 ms might not be much. And for a program you spend 2 minutes writing and 113 ms executing, ONCE, it really is nothing. If OTOH you spend 20 minutes writing it in C, those 4 ms can be a blessing, because you might need to run that bit of code ${BIGNUM} * 10^6 times. And Cerebus wrote all that. And you ignored it. So who am I fooling, thinking I'll get through to you?
Also, the "spaces are significant" is a mortal sin. 1000 times over. (Præterea censeo, Carthaginem esse delendam.)
C - 4ms, Python - 113ms. Python manages a mere 3.5% of the performance of C for the most trivial task possible.
113 milliseconds is not slow. At least for me.
Edit:
I also don't consider slow the 10 seconds it can take for the Python-Sphinx code to generate my web page with the changes introduced during half an hour of typing.
With a program written in C, the web generation might take 0.5 seconds instead of 10 seconds, but who cares? 10 seconds is not slow for me.
Also I don't know of any popular static website generator written in C, as flexible as Sphinx.
We can take it then that you drive a car that does gallons per mile rather than miles per gallon, can we? The 113 milliseconds is just the simplest, most trivial demonstration of the wastefulness and inefficiency of Python; it's not meant to be a metric of how long a user sits waiting, as you know full well. That one can get more than 20 times as much done in the same time in C (or Rust* or Swift or Go) as one can in Python, or get the same amount done with less than a twentieth of the resources, is most definitely an indication that Python is slow. Just because you throw computing resources at it to make it tolerable does not stop it from being intrinsically slow.
*
fn main() {
    let mut x = 0;
    println!("starting");
    for _i in 0..1000000 {
        x = x + 1;
    }
    println!("stopping");
}
cerebus@shu:~/Desktop/Rust_temp/loop$ time ./target/release/loop
starting
stopping
real 0m0.261s
user 0m0.001s
sys 0m0.002s
I suspect Rust is being 'clever'; the debug version takes 30 ms.
So all high speed trains are slow because a hypersonic rocket is 20 times faster than a high speed train?
That reasoning is a fallacy. High-speed trains travel 20 times slower than a rocket, but that doesn't make them slow, they're still very fast.
Therefore I still argue that Python is not slow. C code can be 20 times faster, but that just means that C is very fast, not that Python is slow.
There's a huge cost difference between running an online shop with one server or 20. And slow online shops also lose revenue.