So chances are, if you've taken a first course in linear algebra, you know about the Gram-Schmidt procedure (GS): an algorithm that, starting from a finite set of linearly independent vectors in any inner product space, produces a second set of vectors with the same linear span as the first, but orthonormal. In particular, the algorithm produces vectors such that the ith vector in the new set lies in the linear span of the first i vectors of the original set (and vice versa). Quickly, in formulas: given linearly independent vectors p1 through pn, the algorithm produces vectors q1 through qn such that for all 1 ≤ i ≤ j ≤ n
span(q1, ... qn) = span(p1, ... pn),
qi ∈ span(p1, ... pi),
pi ∈ span(q1, ... qi),
<qi,qj> = δij.
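The procedure itself is short: each pi has its projections onto the previously found q's subtracted off, and what remains is normalized. Here is a minimal pure-Python sketch of classical GS (the function names are mine, not standard):

```python
import math

def dot(u, v):
    """Standard inner product on lists of floats."""
    return sum(x * y for x, y in zip(u, v))

def gram_schmidt(ps):
    """Orthonormalize linearly independent vectors p1..pn (classical GS)."""
    qs = []
    for p in ps:
        # Subtract from p its components along the q's found so far.
        u = list(p)
        for q in qs:
            c = dot(p, q)
            u = [ui - c * qi for ui, qi in zip(u, q)]
        # Normalize the remainder to get the next orthonormal vector.
        norm = math.sqrt(dot(u, u))
        qs.append([ui / norm for ui in u])
    return qs

qs = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
```

By construction, each qi is a combination of p1 through pi, which is exactly the third property above.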
Thus, GS is a type of QR decomposition algorithm. That's because if you collect the vectors p1 through pn as the columns of a matrix P, the algorithm produces a decomposition of P into the product of a matrix Q with orthonormal columns and an upper triangular matrix R. The matrix Q has the vectors q1 through qn as its columns, and the ith column of R gives the representation of pi in the new orthonormal basis (R's inverse, also upper triangular, gives the representations of the q-vectors in the p-basis). R is upper triangular because the representation of pi in terms of q1 through qn involves non-zero coefficients only in the first i terms.
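To make the QR connection concrete, the same sketch can be extended to record the coefficients R[i][j] = <qi, pj> as it goes, so that each column pj is recovered as a combination of q1 through qj. Again a pure-Python illustration with my own names, not a library routine:

```python
import math

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def gs_qr(ps):
    """Given columns p1..pn, return (qs, R) with P = Q R.

    qs holds the orthonormal columns of Q; R[i][j] = <q_i, p_j>,
    so R is upper triangular by construction."""
    n = len(ps)
    qs = []
    R = [[0.0] * n for _ in range(n)]
    for j, p in enumerate(ps):
        u = list(p)
        for i, q in enumerate(qs):
            R[i][j] = dot(p, q)  # coefficient of p_j along q_i
            u = [uk - R[i][j] * qk for uk, qk in zip(u, q)]
        R[j][j] = math.sqrt(dot(u, u))  # length of the remainder
        qs.append([uk / R[j][j] for uk in u])
    return qs, R

ps = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]
qs, R = gs_qr(ps)
```

The entries below the diagonal of R are never written, which is the algorithmic face of the triangularity argument above.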
The QR decomposition (the result) should be distinguished from GS (the algorithm). QR decompositions are very useful in various numerical applications, but it turns out that the algorithm you want to use in those applications to compute the decomposition is essentially never GS, because of its numerical instability. Instead, any standard numerical linear algebra library (such as LAPACK) should provide an optimized QR procedure built on some combination of Givens rotations and Householder reflections. Still, GS is a great tool when writing proofs.
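The instability is easy to exhibit. For nearly linearly dependent inputs, classical GS (which computes each projection coefficient from the original vector p) can lose orthogonality almost completely, while the well-known "modified" variant (which computes each coefficient from the running remainder instead) fares much better. A small self-contained demo, with the example vectors and function names my own:

```python
import math

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def normalize(u):
    n = math.sqrt(dot(u, u))
    return [x / n for x in u]

def classical_gs(ps):
    qs = []
    for p in ps:
        u = list(p)
        for q in qs:
            c = dot(p, q)  # coefficient from the ORIGINAL vector p
            u = [ui - c * qi for ui, qi in zip(u, q)]
        qs.append(normalize(u))
    return qs

def modified_gs(ps):
    qs = []
    for p in ps:
        u = list(p)
        for q in qs:
            c = dot(u, q)  # coefficient from the running remainder u
            u = [ui - c * qi for ui, qi in zip(u, q)]
        qs.append(normalize(u))
    return qs

eps = 1e-8  # chosen so 1 + eps**2 rounds to 1.0 in double precision
ps = [[1.0, eps, 0.0, 0.0],
      [1.0, 0.0, eps, 0.0],
      [1.0, 0.0, 0.0, eps]]

cq = classical_gs(ps)
mq = modified_gs(ps)
print(abs(dot(cq[1], cq[2])))  # large: q2 and q3 are far from orthogonal
print(abs(dot(mq[1], mq[2])))  # tiny: near machine precision
```

On this example the classical variant's q2 and q3 end up at roughly 60 degrees rather than 90, while the modified variant stays orthogonal to working precision; in exact arithmetic the two variants compute the same thing.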