That delightful piece of advice was offered to me by my Linear Algebra/Multivariable Calculus professor (Ravi Ramakrishna, for reference). And it just confirms, among other things, that math jargon is probably the most unintelligible jargon around.

Next semester, I'll be in two math classes: "Honors Intro to Topology" and "Honors Intro to Algebra". The one that most amuses me is the latter, of course. It's a 400-level math course, which is usually for junior and senior undergrads (I'll be a wimpy li'l sophomore).

The course is _Honors INTRODUCTION to fucking ALGEBRA_.

Inwardly, I'm filled with schoolgirlish glee when I tell people the classes I signed up for. Without missing a beat, everyone goes "Wait, what? Didn't you take algebra in middle school?" It's a predictable response, and I know, every time, who will express surprise. I honestly can't wait to take the 600-level graduate math class called "Algebra".

For reference, I just finished "Linear Algebra" and "Multivariable Calculus" this year. Okay, if you think about it, "Linear Algebra" SHOULDN'T sound intimidating, but the "linear" qualifier makes people twitch for some reason. And I've heard countless non-mathy people complain about "Calculus" (which really isn't THAT hard), but "Multivariable Calculus" just sounds nightmarish to them! "You mean, there's more than ONE variable?! AJRGJKHSLJA."

So, how did the title of this node come about? When we were perusing our professor's bookshelf.

Let's start with a reference point. This year, I used the textbook "Vector Calculus, Linear Algebra, and Differential Forms: A Unified Approach" by John Hubbard. That's two different "OMGWTF" high-level math courses for non-mathy people. The subtitle might as well be "Oh my!" Naturally, this big fancy title had math that was relatively simple to grasp (though to a large extent, you couldn't skip ahead much). It was also pretty coherent, well-written, and gave us a challenging, interesting, and thorough introduction to "What will you, as a math major, experience?" (the answer is: lots of proofs). Most of us could use the book with ease, and the expected mathematics background of the class was AP Calculus BC (for non-freshmen, Calculus II).

The book "Principles of Mathematical Analysis" by Walter Rudin (where "analysis" is effectively a superfield of calculus) is something I could read in my free time with a little bit of effort. It deals with ideas like "building" the real numbers from the rational numbers (an involved but not entirely difficult task). It also proves (I believe) that the real numbers have no "gaps", and uses all that as a basis for calculus (both normal and multivariable). Towards the end, it goes into Lebesgue integrals and measure theory.

The next book of note was "Abstract Algebra". My professor said that a little bit more experience (after my own algebra course next semester) would be enough to understand it (alternatively, I could wade through it on my own time).

The book called "Principles of Arithmetic", he cautioned, I would need at least one and a half more years of math to get. At this point, non-mathy people would be going "BULLSHIT! I know arithmetic! 1 + 1 = 2". Sorry, this shit is hardcore; mathematicians take arithmetic to levels you couldn't even DREAM of.

Finally, he told me not to even bother with "Algebra"; I'd have to wait until I'm a graduate student to even begin to understand it.

What's funny is that, after a year of college math, I know enough jargon to understand MOST wiki math articles. Of course, most of the math jargon collides with normal English words. Often, these English words have "simple" mathematical meanings that merely generalize a lot of what non-mathy people consider to be "math". For example, a "field" is a set (a mathematical set) together with two operations "+" and "×" (called "addition" and "multiplication") that have a lot of properties that, to most sane people, seem reasonable (things like "a × b = b × a" and "a + 0 = a"). These properties are given fancy names like (in the parenthetical example) "commutativity of multiplication" and "existence of an additive identity". The rational numbers are an example of such a field. A field is NOT a grassy plain for romping around, and it is NOT a bunch of vectors anchored at points.
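If you want to poke at those "reasonable" properties yourself, here's a quick sanity check (not a proof!) of a few field axioms on the rationals, using Python's exact-rational `Fraction` type. The particular values are arbitrary picks of mine:

```python
from fractions import Fraction

# Two arbitrary rationals, plus the two identity elements of the field.
a, b = Fraction(2, 3), Fraction(-5, 7)
zero, one = Fraction(0), Fraction(1)

assert a * b == b * a        # commutativity of multiplication
assert a + b == b + a        # commutativity of addition
assert a + zero == a         # existence of an additive identity
assert a * one == a          # existence of a multiplicative identity
assert a * (one / a) == one  # nonzero elements have multiplicative inverses
```

Of course, checking two numbers is exactly the kind of thing a mathematician would sneer at; the axioms have to hold for EVERY pair of rationals, which is what the actual proofs are for.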

My "arithmetic" is not the same as your "arithmetic". Have you ever PROVEN that "a + 0 = a"? Do you have any idea HOW you would do it? I wrote a paper in which I constructed the integers, rationals, and real numbers (assuming existence and arithmetic on the natural numbers). After proving that "a + 0 = a" for any integer a, I took a breather. After looking at it again, the sobering reality descended upon me: I had just proven that adding nothing to a number gives you that number. Sure, it's good to have the reassuring fact that all of mathematics won't suddenly come crumbling down. But I still just proved that adding nothing to a number doesn't change its value.
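For the curious, here's a sketch of one standard route (the construction in my paper may have differed in its details): build each integer as an equivalence class of pairs of naturals, with (m, n) standing for "m − n". Then "a + 0 = a" falls out in one line:

```latex
% Integers as classes of pairs of naturals: (m, n) ~ (p, q) iff m + q = p + n.
% Addition is componentwise, and zero is the class of (0, 0), so
\[
(m, n) + (0, 0) = (m + 0,\; n + 0) = (m, n),
\]
% where each inner "+ 0" is the base case of the recursive definition of
% addition on the naturals. The integer fact reduces to facts already
% established (or simply defined) one level down.
```

The annoying part isn't this computation; it's checking that addition is "well-defined", i.e. that the answer doesn't depend on which pair you picked to represent each class.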

So while it's still fun (in some sort of philosophically appealing way), there's still that non-mathematician inside of me screaming "THIS IS TOTAL BULLSHIT!" I'm pretty good at shutting him up.

(Incidentally, ever since I learned multiplication in third grade, I've wondered, from a strictly algebraic perspective, why MUST "a * b = b * a"? Give me some time with the Peano axioms and I'll have that proven, too.)