Western Culture Isn’t Feminized, It’s Transgender

Helen Andrews argues that woke culture is the inevitable result of women taking over pivotal industries such as law, media, and medicine.