Addressing Racism Against Black Women In Health Care Is Key To Ending The US HIV Epidemic
Researchers have found that health outcomes for Black patients improve when they are treated by Black doctors.