Noun: tropical medicine
- The branch of medicine that deals with the diagnosis and treatment of diseases occurring most often in tropical regions
"Experts in tropical medicine were called in to address the outbreak of malaria"
Derived forms: tropical medicines
Type of: medicine
Encyclopedia: Tropical medicine