White men’s grip on U.S. health care may be slipping

(HealthDay)—The U.S. medical field is less dominated by white men than it used to be, but Black and Hispanic people remain underrepresented among doctors, dentists and pharmacists, a new study finds.