Definitions of Western United States:

  • noun: the region of the United States lying to the west of the Mississippi River