I agree -- that's why loads of QB programmers use the MS-DOS DEBUG assembler to put binary machine code into their programs when they need a bit more speed. Remember CALL ABSOLUTE?
Remember? I was a beast at CALL ABSOLUTE.
I remember the ~33-byte mouse routine (INT 33 with a register-to-stack wrapper). I also made my own, I think... though I don't remember what I changed, if anything. Also a keyboard interrupt handler for sustained keypresses -- better pray your program doesn't hit an interpreter error while that's hooked! I made my own routines for certain graphics functions too: I wrote a raycaster engine (think Wolfenstein 3D, ca. 1992) which ran, IIRC, at about 20 FPS with SINGLEs, 35 FPS with INTEGERs (fixed point), and a comfortable 60+ FPS with the assembly routines (also fixed point).
I also later ported much of it to assembly (since I didn't bother with memory allocation, it was a 200 kB executable, only about 8 kB of which was code -- from a ~30 kB commented .asm file!), which ran at a measly 400 FPS on my 1 GHz processor at the time.
The assembly version (with a byte-per-pixel to EGA converter routine) took about ten seconds per frame on my 8086 computer.
(Oh and yes, that machine has proper 128k EGA, w00t!)
That, and a few other projects, made for an interesting spring. Wanna say it was 2007 or so. Taught myself x86 for reals (I had merely dabbled -- DEBUG and such -- before then) and was fairly fluent in about two months. That was also before I had any "qualified" learning (my first class on the subject was AVR assembly the following year).
Tim