Dentistry
Dentistry is the medical field devoted to the art and science of maintaining the health of the oral cavity's soft and hard tissues. Dentistry is not just about filling teeth! Dental procedures are becoming increasingly sophisticated, drawing on advancing technology to prevent, diagnose, and treat periodontal disease, caries, malocclusion, and orofacial abnormalities. Dentists are also responsible for detecting head and neck cancer and other mucosal diseases. A growing body of research points to a strong link between oral and systemic health. Dentists direct the dental health care team, which includes dental hygienists, dental assistants, and dental laboratory technicians. Together, these professionals develop and carry out an individualized, comprehensive treatment plan for each patient.