u/DestinyL11

Looking for a book about the history of doctors and their misinformed notions about women!

Hello!

I am a writer, and one of my characters requires me to understand the medical misinformation that doctors believed about women in the 1700s and 1800s, regarding both their physical and mental health, as well as what was generally viewed as "inherent differences" between men and women on a biological level (which we now know to be false).

Any type of book is fine, even textbooks! I'm willing to buy it. 😭

Thank you!
