Definitions of West:
- noun:    the countries of (originally) Europe and (now including) North and South America
- noun:    the region of the United States lying to the west of the Mississippi River
- noun:    English painter (born in America) who became the second president of the Royal Academy (1738-1820)
- noun:    United States film actress (1892-1980)
- noun:    British writer (born in Ireland) (1892-1983)
- noun:    the cardinal compass point that is at 270 degrees
- adjective:    situated in or facing or moving toward the west
- adverb:    to, toward, or in the west Example: "We moved west to Arizona"
- name:  A surname (common: 1 in 1250 families; popularity rank in the U.S.: #109)