Definitions of West Coast:

  • noun:   the western seaboard of the United States from Washington to California