The West

The West is a generic term for the western region of a number of countries:

In the United States:

  1. The Western United States

In Australia:

  1. Western Australia (also The West Australian newspaper)

It can also designate broader distinctions:

  1. The Western world, or Western Civilization