I'll do you one better.
Back just after I got my first real computer (a 286), I made the great climb from BASIC (or, rather, the incarnation known as BASICA) to Microsoft QBasic (no line numbers needed, text labels, actual real functions), and then up to Pascal. My compiler was Borland's Turbo Pascal 3.0 (and I still have the shortcut keys hardwired into my brain), which could turn my miles of underwhelming code into real executable files.
My one interest at the time, and a main one ever since, was graphics programming. I'm talking the big one: mode 13h. 256 colors (indexed, of course), at a resolution of 320x200. After tinkering around with text mode, there's nothing like a little lava-demo magic; you'll never go back. Unfortunately, Turbo Pascal 3.0 had only two sets of graphics functions: turtle graphics (ugh!), and some dreadfully slow drawpixel/drawline routines.
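For anyone who never had the pleasure: mode 13h is about the simplest framebuffer there is. The whole screen is 64,000 bytes starting at segment $A000, one byte per pixel, so plotting a point is just a write to offset y*320+x. The idea boils down to something like this (a sketch, not period-accurate Turbo Pascal 3.0 code; I'm leaning on the predeclared Mem array and the Word type from later versions):

    { Sketch of the mode 13h idea: one byte per pixel at segment $A000. }
    procedure SetMode13h;
    begin
      inline($B8/$13/$00/   { mov ax,0013h ; 320x200, 256 colors  }
             $CD/$10);      { int 10h      ; BIOS set-video-mode  }
    end;

    procedure PokePixel(x, y: Word; color: Byte);
    begin
      Mem[$A000 : y * 320 + x] := color;   { rows are 320 bytes wide }
    end;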
So, being the evil little reverse-engineer that I was at the time (and having neither an internet connection nor any reference books of any sort on the subject), I decided I would take apart the compiled code from a simple program I wrote and see if I couldn't speed it up a bit. And there began my long, rather racy affair with x86 assembly language.
See, the only assembly I could insert into my Pascal programs was actually direct machine code (in the form of an inline(FF/FF/FF/FF) directive, where the FFs are the bytes of machine code). So, I wrote a small program that inserted a marker string, inline, just before a call to the putpixel routine, and then simply exited. I compiled this program and opened it in good old DOS debug.
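The dummy program was nothing more than a marker and a call, something like this (reconstructed from memory, not the original: the inline bytes are just the ASCII codes for '012345', the leading short JMP keeps them from ever executing as code, which is a nicety I can't swear my teenage self bothered with, and Plot stands in for whatever the slow pixel routine was actually called):

    program FindPutPixel;
    begin
      GraphColorMode;                    { put the library into a graphics mode }
      inline($EB/$06/                    { jmp short +6: hop over the marker    }
             $30/$31/$32/$33/$34/$35);   { the six bytes of '012345'            }
      Plot(10, 10, 1);                   { the slow library routine to hunt for }
    end.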
From there, I did a string search (the actual string was "012345," which I was fairly certain was not going to occur, um, naturally, in a run of machine code) and traced the code until I found the putpixel routine, which I then fiddled around with in debug until I came across a faster way to place a pixel on the screen in mode 13h (basically, the routine assumed the pixel color changed on every call, so it wasted a lot of OUT instructions, which are expensive). Then I wrote the machine code down in my notebook, byte by byte, fired up Turbo Pascal, and wrote myself a couple of functions (basically consisting of inline(...) directives) that did something on the order of quadrupling the speed of my graphics code. It was still rather slow, but I got to feel like a hacker.
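Those functions were nothing fancier than Pascal wrappers around runs of hand-assembled bytes. I can't reproduce the exact bytes from my notebook, but the pattern was this kind of thing (my example here clears the mode 13h screen with a single REP STOSW rather than plotting a pixel, so it's an illustration of the wrapper trick, not the routine I actually wrote):

    { A Pascal procedure that is really just hand-assembled 8086 bytes. }
    procedure ClearScreen13h;
    begin
      inline(
        $B8/$00/$A0/   { mov ax,0A000h  ; video segment                  }
        $8E/$C0/       { mov es,ax                                       }
        $31/$FF/       { xor di,di      ; start of the framebuffer       }
        $31/$C0/       { xor ax,ax      ; color 0 in both halves of ax   }
        $B9/$00/$7D/   { mov cx,7D00h   ; 32,000 words = 64,000 bytes    }
        $FC/           { cld            ; count upward                   }
        $F3/$AB);      { rep stosw      ; blank the whole screen at once }
    end;

One string instruction instead of 64,000 calls into a pixel routine is exactly the kind of win that made the whole exercise worth it.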