The U.S. Women's Health Alliance is a national women's healthcare organization of respected healthcare practices across the United States that work together to improve the quality of women's care, create a more affordable healthcare system, and protect and preserve the private practice of medicine.