Definitions of tropical medicine:

  • noun:   the branch of medicine that deals with the diagnosis and treatment of diseases most often found in tropical regions