The West has become synonymous with ruggedness, adventure, desert climates, and shootouts between outlaws and lawmen. In the American imagination, the West is where the nation must meet its foreordained destiny, amid the plains, deserts, and great snow-capped mountains. The same thinking holds that the West is a place where prosperity is found with a prospector's gold pan or a farmer's plow. Throughout history, however, the West has just as often been synonymous with death. The ancient Egyptians buried their dead on the western bank of the great Nile River. The West is where the blazing sun sets at the end of the day; more precisely, it is where the sun goes to die each night, to be renewed the next morning when it rises again in the East. During World War One, British and Australian soldiers would say that their dead comrades had "gone West." Although the West has been appropriated by the sweet words of progress, hope, and life, it is death that is familiar with, and even comfortable in, the West.
*Sources: Cambridge Advanced Learner's Dictionary & Thesaurus and The Free Dictionary