And in fact Arduino has printf() (or will allow you to use printf()), but you'll have to set it up in mysterious ways before you can use it (sort of like using JCL to define the IO of your Fortran programs on the IBM mainframe.)
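Something like this does the trick on the classic AVR boards (a minimal sketch, assuming an Uno-class Arduino core sitting on top of avr-libc; the ARM-based boards handle stdio differently):

#include <stdio.h>

// Hand avr-libc a "file" whose output routine writes to Serial, then point
// stdout at it.  After that, printf() goes out the serial port.
static FILE serial_out;

static int serial_putc(char c, FILE *) {
  Serial.write(c);
  return 0;
}

void setup() {
  Serial.begin(9600);
  fdev_setup_stream(&serial_out, serial_putc, NULL, _FDEV_SETUP_WRITE);
  stdout = &serial_out;

  printf("printf works now: %d\n", 42);  // (%f still needs extra linker options in avr-libc)
}

void loop() { }

That's the sort of setup I mean by "mysterious ways."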
Look, there are about three pieces...
1) There is the C language itself: syntax, functions, variables, control statements, operators, and so on. This is probably best learned on a desktop, with a debugger handy, using a reference like K&R. There are a bunch of principles here common to many computer languages. If you've programmed in any other language before, or had one of the "Computer Science 101" classes, this should not be all that strange. If you don't have ANY programming background, you might want to look around for a basic programming or computer science class (the exact language isn't too important.) Be aware that many classes, and books, jump right into manipulating images on your desktop screen, because that's the fun part.
2) There are a bunch of concepts that are key to programming microcontrollers but aren't so common in desktop environments: IO registers. Bit masks vs. bit numbers, bit shifting, and bit-banged IO. Volatile variables. Memory limitations. Interrupts. Concurrency (and the lack thereof.) The low speed of some peripherals. Registers where you write a 1 to a bit in order to clear it to zero. What the peripherals do and what their limitations are. The fact that exit() isn't going to stop your program, and that terrible bugs aren't going to put a nice error message on your screen. (A short register-level sketch below illustrates a few of these.)
This is probably the hardest part. If you have a chip/electronics/assembler background, it's not too difficult to add this on top of the basic C knowledge, but books and tutorials that deal with these things are a lot rarer than books on Excel...
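For instance, here's a minimal register-level sketch (assuming a bare ATmega328P, the Uno's chip, compiled with avr-gcc/avr-libc) that touches a few of those ideas: IO registers, bit numbers vs. bit masks, and a flag you clear by writing a 1 to it:

#include <avr/io.h>
#include <stdint.h>

int main(void) {
    DDRB |= (1 << DDB5);                 // bit number 5 -> mask 0x20: PB5 (Uno pin 13) becomes an output
    TCCR0B = (1 << CS02) | (1 << CS00);  // run Timer0 from the system clock / 1024

    uint8_t overflows = 0;
    for (;;) {                           // no exit(); the program runs until the power goes away
        if (TIFR0 & (1 << TOV0)) {       // has Timer0 overflowed?
            TIFR0 = (1 << TOV0);         // write a 1 to CLEAR the flag -- not a typo
            if (++overflows >= 61) {     // ~61 overflows per second at 16 MHz
                overflows = 0;
                PINB = (1 << PINB5);     // writing a 1 to PINx toggles the output pin
            }
        }
    }
}

None of that is hard C. The hard part is knowing that those registers exist and behave that way, and that comes from the datasheet, not from a C book.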
3) A particular environment will have a set of library functions that it uses for doing environment-specific things. This is true whether you're talking about standard C functions like printf(), POSIX functions (an operating system standard), Arduino, CMSIS, the Atmel Software Framework, StellarisWare, or whatever. Arduino gets some negative press for "hiding important things that you should learn", but I don't see that it's that much different from vendor-provided libraries. (A vendor training video I watched recently had a sample program that used both LED_Toggle(LED0) and gpio_toggle_pin(NHD_blahblah_BACKLIGHT). Both to blink LEDs...) Arduino uses the same standard avr-libc that most people programming AVRs in C end up using, and adds some additional functions (the library-flavored blink sketch below shows the sort of thing it adds.)
A lot of "modern" programming seems to be about learning what library functions exist, and how to get them and use them. Cause those GTK windowing classes are so much more advanced than the KDE `toolkits, you know. (not that this is really "modern"; back in the Fortran/COBOL/PL1 days, a lot of learning a language was about using the IO facilities of the language. But in those days those were part of the language, while now you can pick and choose different libraries that behave somewhat differently.)
And that will get you started. You still won't be able to immediately connect up that weird 9-DoF sensor chip and make an autonomous quadcopter assassin-bot, because you'd need to understand how that chip works (unless someone else has already done it and is willing to share their code, which is somewhat more likely in the Arduino world than anywhere else these days.) And even after you've done that, autonomous robot assassins are hard research projects that the DoD pays large teams big bucks to try to figure out...
A standard-conforming C implementation is expected to provide the standard library exactly as specified.
And one of the first things you need to learn if you want to program C on microcontrollers is that that isn't how it's going to be.
In Arduino you can use the write() and writeln() functions instead.
print and println class methods, actually (so "Serial.print(...)" rather than just "print(...)")
(writeln is Pascal...)
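For the record, the usual idiom looks something like this (minimal sketch):

void setup() {
  Serial.begin(9600);            // open the port first
  Serial.println("hello");       // println appends a newline
}

void loop() {
  Serial.print("millis: ");      // print does not
  Serial.println(millis());
  delay(1000);
}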