Definitions of dermatology:
  • noun:   the branch of medicine dealing with the skin and its diseases





Copyright © 2020 Datamuse