"Only in America" is something I hear
time and time again on television and see in print.
Almost invariably, the thing being referenced is NOT something that could only happen in the States. It
tends to piss me off (though, admittedly, I shouldn't get that worked up about it) because it's just more evidence of
the self-centered attitude of Americans.
Give it some thought the next time you catch yourself saying this phrase.