In modern usage, the American West generally refers to anywhere west of the Mississippi River, at least if you're asking someone in the East. Culturally, though, I'd mark it as including the states of:


I don't include those last three states as part of the West because they have distinctive cultures of their own (or multiple such cultures, in the case of California). Geographically, though, they do fit.
