Definitions of tropical medicine:
  • noun: the branch of medicine that deals with the diagnosis and treatment of diseases that are found most often in tropical regions
