Noun: tropical medicine
  1. The branch of medicine that deals with the diagnosis and treatment of diseases that are found most often in tropical regions
    "Experts in tropical medicine were called in to address the outbreak of malaria"

Derived forms: tropical medicines

Type of: medicine

Encyclopedia: Tropical medicine