The likelihood that a randomly selected television show will be a rerun that the viewer has already seen can be portrayed as an inverted bell curve, with the y-axis showing the likelihood of seeing a rerun and the x-axis showing how familiar the viewer is with that particular show. It would look very roughly like this:
```
'                             '
 '                           '
   '                       '
     '                   '
        '             '
           '       '
              ' ' '
```

The gist of it is that the more familiar you are with the show, the more likely you are to see an episode you've already seen, since presumably you have seen many episodes. The odd thing is that it works the other way around, too: if you've only seen one or two episodes of a particular show, the chance of a random selection resulting in a rerun of one of those episodes actually seems to go up, defying any scientific explanation.
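The boring, right-hand half of the curve is easy to justify: if reruns are picked uniformly at random, the chance of landing on an episode you've seen is just seen / total. A quick Python sketch makes this concrete (the episode counts below are made up for illustration):

```python
import random

def rerun_chance(seen_episodes, total_episodes, trials=100_000):
    """Estimate the chance that a randomly chosen episode is one
    the viewer has already seen, assuming uniform random selection."""
    seen = set(range(seen_episodes))
    hits = sum(random.randrange(total_episodes) in seen for _ in range(trials))
    return hits / trials

# A devoted fan who has seen 150 of a show's 180 episodes:
print(rerun_chance(150, 180))   # roughly 0.83

# A casual viewer who has seen only 4 of 180 episodes:
print(rerun_chance(4, 180))     # roughly 0.02
```

Which is exactly why the left-hand half of the curve is so strange: by this math, the casual viewer should almost never hit a rerun.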

Case in point: I have never been a Seinfeld fan, and usually avoid watching the show if I can. However, once or twice, out of sheer boredom, I have watched an entire episode of Seinfeld. Uncannily, despite having seen perhaps three or four full episodes of the show in my life, whenever I decide to watch Seinfeld, the episode on TV always turns out to be one of the ones I've already seen.