To summarise some of the semi-facts and bits and pieces on this node: first of all, 20/20 does not refer to perfect vision, but rather to normal vision. The distinction is subtle but important.
The numbers break down as follows: the bottom number is how close someone
with normal vision has to be to see an object clearly, and the top number is
how close the subject has to be. So if you have 20/20 vision, you can see an
object 20 feet away that a normal person can see from the same distance.
If you had 20/50 vision, you would have to be 20 feet away from an object
to see it as clearly as a normal person would at 50, and so on.
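The arithmetic above can be sketched in a few lines of Python (the function name is my own, not standard optometry terminology):

```python
def acuity_ratio(test_distance, reference_distance):
    """Return visual acuity as a fraction of 'normal' (1.0 means normal).

    test_distance: how far the subject stands (the top number).
    reference_distance: how far a person with normal vision could stand
    and see the same detail (the bottom number).
    """
    return test_distance / reference_distance

# 20/20 vision: normal acuity.
print(acuity_ratio(20, 20))  # 1.0

# 20/50 vision: you see at 20 feet what a normal person sees at 50,
# i.e. your acuity is 0.4 of normal.
print(acuity_ratio(20, 50))  # 0.4
```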
As with all things biological, the "norm" is simply that: there are
individuals who are worse, and individuals who are better. It is possible
(indeed not at all uncommon) for people to have 25/20 vision, meaning that
they can see from 25 feet what a normal person can see from 20, which is
why I said that 20/20 vision, whilst normal, is not perfect.
Finally, it's worth pointing out that the numbers are unimportant in
absolute terms; it's the ratio between them that matters. The number 20
is used for historical reasons (it being a reasonable distance at which
to detect most eye defects), but in most metric countries you will now
find optometrists using 6/6 as a measure (the six referring to metres).
"Twenty-twenty" is a phrase that seems to have entered the English
language, however, and I think that even if the USA went metric tomorrow
we wouldn't find people talking about "six-six vision".
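The point that only the ratio matters can be checked directly (the 6/12 and 20/40 pairing below is my own illustrative example, not from the node):

```python
from fractions import Fraction

# Only the ratio matters: 20/20 (feet) and 6/6 (metres) denote the
# same acuity, since both fractions equal 1.
print(Fraction(20, 20) == Fraction(6, 6))  # True

# Likewise a metric 6/12 corresponds to an imperial 20/40:
# both mean you see at half the normal distance.
print(Fraction(6, 12) == Fraction(20, 40))  # True
```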
Postscript: scraimer tells me that in some languages (the example he gives is Hebrew) the idiom is indeed "6/6 vision".