Definitions of Wild West:

  • noun: the western United States during its frontier period