See (my) writeup in monotone sequence theorem, or none of this will make sense. Also, be warned! SPOILERS AHEAD!

Well, suppose our sequence a_1, a_2, ... has no infinite ascending subsequence. Let's use this to find an infinite descending subsequence!

Consider a "chain" ascending from a_1 (set l_{1,1} = 1): a_{l_{1,1}} < a_{l_{1,2}} < a_{l_{1,3}} < ..., extended as far as it will go (we'll see why we have the first index in a moment...). Since there's no infinite ascending subsequence, this chain must end at some last element a_{k_1}. And because the chain can't be extended past a_{k_1}, no later element beats it: a_{k_1} ≥ a_j for any j > k_1.
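
To see why the maximality step works, here's a tiny Python sketch of it on a finite list (an illustration only; the actual proof runs on an infinite sequence, and the name maximal_chain_end is mine, not from the theorem):

    def maximal_chain_end(a, start):
        # Greedily extend a strictly ascending chain through a[start:]:
        # whenever a later element beats the current chain top, jump to it.
        cur = start
        for j in range(start + 1, len(a)):
            if a[j] > a[cur]:
                cur = j
        # No element after cur ever beat a[cur] (otherwise cur would have
        # moved there), so a[cur] >= a[j] for every j > cur: the key fact.
        return cur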

OK, so now take l_{2,1} = k_1 + 1, and extend an ascending chain from a_{l_{2,1}} as far as possible (again, by assumption it's finite!); there's the promised first index, numbering the chains. Call the last element this time a_{k_2}. Then k_2 > k_1, so the fact above gives a_{k_1} ≥ a_{k_2}. On the other hand, a_{k_2} ≥ a_j for any j > k_2, for the same reason as before.

Continue in this fashion: always pick l_{j,1} = k_{j-1} + 1, extend a maximal ascending chain a_{l_{j,1}} < a_{l_{j,2}} < ..., conclude that it must be finite, and take a_{k_j} to be its last element.
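
Putting the pieces together, the construction is just a loop: each chain starts one past the previous chain's endpoint, and we collect the endpoints. A sketch reusing maximal_chain_end from above (descending_endpoints is again my own, illustrative name, and again it works on a finite list):

    def descending_endpoints(a):
        # Collect the chain endpoints k_1 < k_2 < ...; each endpoint
        # dominates everything after it, so a[k_1] >= a[k_2] >= ...
        ks = []
        start = 0
        while start < len(a):
            k = maximal_chain_end(a, start)
            ks.append(k)
            start = k + 1  # the next chain starts just past this endpoint
        return ks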

This gives us an infinite descending subsequence, a_{k_1} ≥ a_{k_2} ≥ ..., as required for a proof of the theorem.
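
For instance, running the sketch above on a short made-up list:

    a = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
    ks = descending_endpoints(a)
    print(ks)                  # [5, 7, 8, 10]
    print([a[k] for k in ks])  # [9, 6, 5, 5]

The endpoint values come out non-increasing, and the tie at the end (5, 5) shows why the conclusion is ≥ rather than a strict >.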
