As I mentioned previously, C and C++ are very difficult to parse reliably. I think the .h files (or, more generally, forward declarations) make the compiler's job a lot easier.
Same goes for the stupidity with typecasting: on some compilers an int is 16 bits, on others it is 32 bits... why on earth does that have to change? Remember the Ariane rocket that blew up? That was because the real hardware had an int as 16 bits while on the emulator it was treated as 32 bits... the same code overflowed in the rocket, because the idiots who build compilers think it is OK to have different definitions for the same type...
> In addition to providing the interface, the .h files make it possible to provide the libraries of the functions without providing the actual source code.

And why is that needed?
If the compiler traverses all the files in a directory, it will find the libraries, whether those exist as .c files or as .obj files; it doesn't matter.
My claim is: there is no need for includes and .h files. All the compiler has to do is look in the directory where the project is stored (and its subdirectories) and see if it can find the function, either in source or in object form. If it can't find it there: stop compilation.
> As I mentioned previously C and C++ are very difficult to parse reliably. I think the .h files (or more generally forward declarations) makes the compiler's job a lot easier.

Multipass compilation solves the parsing problem.
Pass 1: go over the source files and collect the functions and subroutines. Build a map in memory of their definitions (what goes in and what comes out) and attach the source or object to each entry. Once all the files are loaded in RAM, all source and object code is loaded as well; at this point every available routine is known, and so is every routine that is being called. If a callee is missing: stop compilation, since something is missing. If something is never called: remove it, since it is unused and not necessary for the program. (This automatically strips superfluous functions from libraries.)
PL/M compilers back in 1972 could already do that. It's 2016; don't you think it's time C started doing that too?
You have to be careful with interrupt routines, which aren't called from anywhere else:
void isr_init()
{
    /* ... */
    isr_set(isr_machine_exception, 0x00000100);
    /* ... */
}
> As I mentioned previously C and C++ are very difficult to parse reliably. I think the .h files (or more generally forward declarations) makes the compiler's job a lot easier.
>
> multipass compilation solves parsing problem. pass 1 : go over source file and collect functions and subroutines. build a map in memory of their definition (what goes in and out) and attach the source or object. [...]
Pass 2: compile all standalone functions (functions that do not call others) into machine language. Create a callable and an in-line variant of each.
Pass 3: compile the functions that call others. Direct calls become in-line; nested calls become true calls. Repeat pass 3 until the top level of the hierarchy is reached.
Pass 4: map the final memory allocation for all functions and assign hard addresses (essentially linking).
Pass 5: emit the binary.
PL/M compilers back in 1972 could already do that. It's 2016; don't you think it's time C started doing that too?
My claim is: there is no need for includes and .h files.
> same goes for the stupidity with typecasting. on some compilers an in is 8 bit , on other it is 16 bit , ... why on earth does that have to change ? [...]

While I agree, fixed-length types do exist (int8_t, uint32_t, etc.).
> remember the ariane rocket that blew up ? that was because the real hardware had an int as 16 bit while on the emulator it was treated as 32 bit

That was Ada. "AdaTran", actually: badly converted Fortran.
> my claim is : there is no need for includes and .h files.

Correct. Just put everything in a single source file; that makes the compiler's job a lot easier!
they are defined poorly
word and half are not *C terms*
they come from CPU manuals
and they contribute to the confusion
I believe "word" was 16bit with 68K
(which is a true 32bit machine)
because the 68000 bus was 16bit
while the R2000 bus was 32bit
therefore half (word) makes sense for MIPS
the problem is: the C language should not stay
so close to the hardware, especially if it aims
to be a portable language
it's just an example, it shows the wrong attitude
> m68k (CISC)
> a long is 32bit
> a short is 16bit
> :
>
> they are defined poorly
yep, a mess
e.g.
m68k (CISC):
  a long is 32 bit
  a short is 16 bit
  a char is 8 bit (2)
  a word is 16 bit
  there is no half
  a long long is 64 bit
MIPS (RISC):
  a long is 32 bit
  a short is 16 bit (1)
  a char is 8 bit (2)
  a word is 32 bit <------
  there is also half (word), which is 16 bit
  a long long is 64 bit <-----
(1) but sometimes with PowerPC (RISC) it is 32 bit; I have seen sizeof(short) = 2 bytes, sometimes = 4 bytes. WTF.
(2) some DSPs by TI say that a char is 16 bit.
It shows that the language doesn't fit the problem you are using it for.
I'm pretty sure that's true of MIPS64 as well
It's a bit shocking how much code breaks when sizeof (long*) != sizeof(long)...
computer science, feelings of pity and sorrow for someone else's misfortune
That's a valid observation, but it raises two important questions:
- why do people (plural) use the "wrong" language for the problem
- what is the domain for which there are no better languages
That is the reason why I am developing my own language,
but it's a pure hobby for my own research (= pleasure).