Create a new language.
One that isn't boneheaded and does not bank on whitespace for flow control.
One that has a parser that understands when = is an assignment and when it is a comparison.
One that doesn't need every line to end in a semicolon, but uses a continuation character when needed.
One that doesn't smell like C, C++, C# or Java, or quiche-eaters' Pascal and its derivatives.
One that doesn't look like a bunch of cryptic mathematical equations.
One that is easy to understand and uses simple English keywords.
Something along these lines:
a = 12
b = 14
c = b - a
for x = a to b
    c = c^x
    print a, b, x, c
next x
- simple English with a small vocabulary of keywords
- smart compiler that understands syntax and knows when to assign and when to compare. No need for walrus operators, ==, or other shenanigans
- smart code environment that uses autocomplete, puts the closing quotes around strings, and fixes many spelling errors
- interactive debugger that lets you single-step through a program, examine data, add or remove instructions, and rerun sections of code. None of this endless cycle: write code, attempt to compile, fix typing errors, missing brackets and semicolons, run, crash, and not know where or why, or even be able to see or save the data. Fire up the program. If a mistake is encountered, the program stops without loss of data. The debugger tells you the most likely cause, lets you fix the source, and then attempts to continue the run with the fix in place. While in debug you can execute code interactively, examine and alter data contents, and even dump the current state of program and data to disk so it can be examined later without having to rerun the entire shebang. The entire machine state, data and all, can be saved.
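For what it's worth, pieces of this wish list exist in today's tools. A minimal Python sketch of the "stop without losing data, save the state to disk, examine it later" idea, using a try/except plus pickle (the function names and file path here are my own, for illustration only; the interactive part of the wish would be `pdb.post_mortem()` dropped into the except branch):

```python
import pickle

def risky_computation(data):
    # Deliberate mistake: the list contains a zero.
    return [100 // x for x in data]

def run_with_safety_net(data, dump_path="crash_state.pkl"):
    try:
        return risky_computation(data)
    except ZeroDivisionError as err:
        # The run stops, but the data is not lost: dump the state
        # to disk so it can be examined later without rerunning
        # the entire shebang.
        with open(dump_path, "wb") as f:
            pickle.dump({"data": data, "error": repr(err)}, f)
        return None

result = run_with_safety_net([4, 2, 0, 1])

# Reload the saved state, as a later inspection session would:
with open("crash_state.pkl", "rb") as f:
    state = pickle.load(f)
```

It only saves the named data rather than the entire machine state, but it shows the shape of the idea: a crash becomes an inspectable artifact instead of lost work.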
I'm thinking of calling it something along the lines of 'Ultima', or maybe something simple and basic... like 'Basic'.
Here is one of the reasons there have been so many languages over the years. Free electron is speaking as a user - someone who just wants to get something done.
Strongly typed languages, variable declarations, semicolon line endings, walrus operators and the like are the desires of someone who wants to make the compiler easier to write, easier (perhaps possible) to prove that the generated output represents the commands of the input.
A perfect language for one group will never be the perfect language for the other.
Finally, someone who understands my point of view (as opposed to bashing a language like 'basic').
Why is it so hard to discriminate between when = means 'assign' and when it means 'compare'?
When used in conjunction with an 'if' clause: comparison. Anywhere else: assignment. Is it really that hard for a compiler to figure out?
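That rule is simple enough to write down in a few lines. A toy sketch in Python of the "context decides" idea (I've also treated `while` as a comparing context, which is my assumption, not part of the post):

```python
def classify_equals(line):
    """Decide what '=' means on one line of a toy BASIC-like language.

    Rule from the post: inside an 'if' clause it is a comparison;
    anywhere else it is an assignment. This is a sketch, not a real
    parser -- it ignores nesting, strings, and multiple '=' per line.
    """
    stripped = line.strip().lower()
    if stripped.startswith("if ") or stripped.startswith("while "):
        return "compare"
    return "assign"
```

A real compiler would do this inside the parser rather than by string prefix, but the point stands: the context is unambiguous, so the programmer shouldn't have to spell it out with `==`.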
The same thing with semicolons. At the end of a line of code there is a CR/LF (or LF if you are on Unix) in the source file. There is your bloody line ending and end of statement. If, and only if, a statement is so long you really must spread it over multiple lines, use a continuation character. Statements that need to be split over multiple lines are far rarer than single-line ones, so this saves keyboard hammering. Besides, if your statement really is that long, maybe split it into a few simpler statements? A book is made from many sentences, not just one long sentence. If you want to keep it simple and reach a wide audience: use short sentences.
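The newline-terminates-unless-continued rule is also trivial to implement. A hedged Python sketch of such a statement splitter (the `_` continuation character is my choice, borrowed from classic BASIC dialects; the post doesn't specify one):

```python
def split_statements(source, continuation="_"):
    """Split source text into statements.

    Each line ending (CR/LF or LF -- splitlines handles both) ends a
    statement, unless the line ends with the continuation character,
    in which case the next line is glued onto it.
    """
    statements = []
    pending = ""
    for raw in source.splitlines():
        line = raw.strip()
        if line.endswith(continuation):
            # Drop the continuation marker and glue the next line on.
            pending += line[:-len(continuation)].rstrip() + " "
        else:
            statements.append(pending + line)
            pending = ""
    if pending:
        # Dangling continuation at end of file; keep what we have.
        statements.append(pending.rstrip())
    return statements
```

No semicolons anywhere, and the continuation character only shows up in the rare long statement.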
Basic used to have the 'LET' keyword to make assignments. Then they figured out there was no need for it because the compiler can figure it out. (Historical note: BASIC is several years OLDER than C. If BASIC could already do that, that makes C a piss-poor compiler.)
Same thing with declaring variables. Why do I need to declare a variable?
a = 4
tells the compiler: create something called 'a' and stuff the number 4 in it.
Since computers discriminate between integers and floats we can use a modifier to 'cast', but even then there should be no need.
a = 5               ' a is an integer
b = 5.0             ' b is a float
c = a + b           ' the compiler knows the types of a and b, so it resolves c to float
d = int(c)          ' the int operator always returns an integer, so cast d as integer
d = 5.0             ' throw an error. I goofed up.
b = b and &hff      ' throw an error: b is a float
d = int(b) and &hff ' works
For all I care you can have two operators: toint and tofloat.
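The inference rules in the snippet above are mechanical enough to mimic in a few dozen lines. A hedged Python sketch of a toy type checker implementing exactly those rules (the class and method names are mine, not any real compiler's API):

```python
def literal_type(text):
    # A literal with a decimal point is a float, otherwise an integer.
    return "float" if "." in text else "int"

class Checker:
    """Toy type checker for the rules in the example above:
    first assignment fixes a variable's type, int + float -> float,
    int() -> int, and bitwise ops demand integers."""

    def __init__(self):
        self.types = {}

    def assign(self, name, rhs_type):
        # First assignment infers the type; later ones must match,
        # so 'd = 5.0' after 'd = int(c)' throws an error.
        if name in self.types and self.types[name] != rhs_type:
            raise TypeError(f"{name} is {self.types[name]}, got {rhs_type}")
        self.types[name] = rhs_type

    def add(self, t1, t2):
        # int + float resolves to float, as with c = a + b above.
        return "float" if "float" in (t1, t2) else "int"

    def bit_and(self, t1, t2):
        # b and &hff must fail when b is a float.
        if t1 != "int" or t2 != "int":
            raise TypeError("bitwise AND needs integers")
        return "int"
```

Nothing here requires declarations from the programmer; every type falls out of the literals and operators already on the page.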
It is 2020. Compilers should be able to figure these things out without us humans needing to spoonfeed them each and every time.