During the 19th century, the medical profession was male-dominated, as formal medical training was largely restricted to men. Even so, pioneering women gained access to that training and became practicing doctors, and their efforts paved the way for others to enter the field. Women’s role in healthcare has continued to evolve to the present day, with women working as physicians, nurses, medical specialists, and researchers.
Health, Education, Science, Aviation, and Athletics