For α>1, it seems plausible that the Beatty sequence {⌊nα⌋}n≥1 contains roughly every α'th natural number. Formally stated,

Lemma. The density of the Beatty sequence for α is 1/α.
In fact, for irrational α we have the following (stronger!)
Lemma. For irrational α, there are precisely ⌊(N+1)/α⌋ elements of the Beatty sequence for α which are no larger than N.
To prove this, let ⌊nα⌋ be the largest element of the Beatty sequence which is no larger than N. Then ⌊(n+1)α⌋>N, i.e. ⌊(n+1)α⌋≥N+1 (it is a natural number), and since (n+1)α is irrational, (n+1)α>N+1. On the other hand, nα<⌊nα⌋+1≤N+1. Dividing both inequalities by α, we have
n < (N+1)/α < n+1,
so n=⌊(N+1)/α⌋; and n is precisely the number of elements of the Beatty sequence up to N.

For rational α, the same argument gives ⌈(N+1)/α⌉-1 such elements (the inequality (n+1)α≥N+1 need no longer be strict); in either case, the density of the Beatty sequence is 1/α (and in fact ⌊(N+1)/α⌋/(N+1) is the greatest rational with denominator N+1 which does not exceed 1/α).
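
Both counting formulas are easy to test empirically; here is a minimal Python sketch (the helper beatty_count is my own name, and exact Fraction arithmetic sidesteps floating-point trouble in the rational case):

    import math
    from fractions import Fraction

    def beatty_count(alpha, N):
        # Count elements of the Beatty sequence {floor(n*alpha)} no larger
        # than N, by direct enumeration.
        count, n = 0, 1
        while math.floor(n * alpha) <= N:
            count += 1
            n += 1
        return count

    # Irrational alpha: the count should be floor((N+1)/alpha).
    alpha = math.sqrt(2)
    for N in range(1, 500):
        assert beatty_count(alpha, N) == math.floor((N + 1) / alpha)

    # Rational alpha: the count should be ceil((N+1)/alpha) - 1.
    alpha = Fraction(3, 2)
    for N in range(1, 500):
        assert beatty_count(alpha, N) == math.ceil((N + 1) / alpha) - 1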

Note that the Beatty sequence exactly determines α. In fact, just knowing its density is enough to determine which real α generated it. (However, there are many more sequences possessing a density than just Beatty sequences. In fact, there are even periodic sequences which are not Beatty sequences; see the example below.)
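
For instance (an example of my own): the non-multiples of 4 form a periodic set of density 3/4, but the Beatty sequence for α=4/3 -- the only Beatty sequence of that density -- is a different set:

    import math

    N = 30
    beatty   = {m for m in (math.floor(n * 4 / 3) for n in range(1, N)) if m <= N}
    periodic = {n for n in range(1, N + 1) if n % 4 != 0}

    print(sorted(beatty))      # 1, 2, 4, 5, 6, 8, ...  (omits 3)
    print(sorted(periodic))    # 1, 2, 3, 5, 6, 7, ...  (omits 4)
    print(beatty == periodic)  # False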

BrianShader above states the following theorem (lacking further evidence, I shall think of it as Beatty's theorem):

Theorem. The Beatty sequences {⌊nα⌋: n=1,2,...} and {⌊nβ⌋: n=1,2,...} partition the natural numbers iff α and β are positive irrationals satisfying

1/α + 1/β = 1

This theorem is actually easy to prove.
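
Before diving in, a quick numeric sanity check never hurts. A Python sketch with the classical pair α=φ (the golden ratio) and β=φ², which satisfy 1/φ+1/φ²=1:

    import math

    phi = (1 + math.sqrt(5)) / 2
    alpha, beta = phi, phi ** 2      # 1/alpha + 1/beta = 1

    N = 10_000
    A = {math.floor(n * alpha) for n in range(1, N + 1)}
    B = {math.floor(n * beta) for n in range(1, N + 1)}

    print(A & B)                                                 # set(): disjoint
    print({m for m in A | B if m <= N} == set(range(1, N + 1)))  # True: covers 1..N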

  1. First, suppose the Beatty sequences for α and β do partition the natural numbers. Then their densities are 1/α and 1/β; since densities are (finitely) additive, and since we assume the two sequences are disjoint, we must have 1/α+1/β=1, the density of the natural numbers.

    To see that α and β must be irrational, note first that if one is rational, the other must also be (due to the equality we've just proved). But if α=a/c and β=b/d then abcd = bc²d·α = ⌊bc²d·α⌋ = acd²·β = ⌊acd²·β⌋ is a natural number appearing in both sequences, and they cannot form a partition. Note also that this occurs at every multiple of abcd; since the sum of the densities is 1, a periodic set of density at least 1/abcd appears in neither sequence. (A numeric illustration of this appears after the proof.)

  2. Now to prove the hard part: Suppose α,β are irrationals satisfying 1/α+1/β=1. We need to show that the Beatty sequences are a partition, i.e. that they are disjoint, and that they cover each natural number.

    First suppose that for some n, ⌊jα⌋ = n = ⌊kβ⌋. Since α and β are irrational, jα and kβ are not natural numbers; we shall use this fact. So we must have

    n < jα < n+1;
    n < kβ < n+1.

    Dividing through and inverting, we see that

    j/n > 1/α > j/(n+1);
    k/n > 1/β > k/(n+1).

    But this means that (j+k)/n > 1/α+1/β = 1 > (j+k)/(n+1). So n < j+k < n+1 -- but there is no natural number strictly between n and n+1, a contradiction. Thus, the Beatty sequences for α and β are disjoint.

    Now we shall show that, up to any N, the two sequences together have exactly N elements. Added to the fact that the sequences are disjoint, this will show that every natural number appears in exactly one of the sequences.

    Indeed, there are ⌊(N+1)/α⌋+⌊(N+1)/β⌋ elements of the two sequences combined which are no larger than N. Expressing β in terms of α: 1/β = 1 - 1/α, so (N+1)/β = N+1 - (N+1)/α, and (N+1 being an integer) ⌊(N+1)/β⌋ = N+1 - ⌈(N+1)/α⌉. The count is therefore ⌊(N+1)/α⌋ + N+1 - ⌈(N+1)/α⌉. But since (N+1)/α is not a natural number, ⌊(N+1)/α⌋ = ⌈(N+1)/α⌉ - 1. So there are exactly N elements of the two sequences up to N; since they are disjoint, the two sequences must exactly cover {1,2,...,N}. (This identity, too, is checked numerically after the proof.)

QED.
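
As promised, a numeric illustration of the rational case from step 1 (a Python sketch of my own, taking α=3 and β=3/2, i.e. a/c=3/1 and b/d=3/2, so abcd=18):

    import math
    from fractions import Fraction

    alpha = Fraction(3, 1)   # a/c = 3/1
    beta  = Fraction(3, 2)   # b/d = 3/2;  1/3 + 2/3 = 1

    N = 300
    A = {math.floor(n * alpha) for n in range(1, N)}
    B = {math.floor(n * beta) for n in range(1, N)}

    # Every multiple of abcd = 18 appears in both sequences...
    print(all(m in A and m in B for m in range(18, 200, 18)))  # True
    # ...and a periodic set (here every n = 2 mod 3) appears in neither.
    print(sorted(m for m in range(1, 40) if m not in A | B))   # 2, 5, 8, ...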
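
And the counting identity from step 2: with α=√2 and β=α/(α-1)=2+√2 (so 1/α+1/β=1), the two sequences together have exactly N elements up to N:

    import math

    alpha = math.sqrt(2)
    beta  = alpha / (alpha - 1)      # = 2 + sqrt(2), so 1/alpha + 1/beta = 1

    for N in range(1, 10_000):
        count = math.floor((N + 1) / alpha) + math.floor((N + 1) / beta)
        assert count == N            # exactly N elements no larger than N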