dentistry: Meaning and Definition

den•tist•ry

Pronunciation: (den′tə-strē)
— n.
  1. the profession or science dealing with the prevention and treatment of diseases and malformations of the teeth, gums, and oral cavity, and the removal, correction, and replacement of decayed, damaged, or lost parts, including such operations as the filling and crowning of teeth, the straightening of teeth, and the construction of artificial dentures.
Random House Unabridged Dictionary, Copyright © 1997, by Random House, Inc., on Infoplease.