Definitions of Deutschland:

  • noun:   a republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990