As I get older, I feel almost disgusted by the idea of getting married and having kids, because as a woman I have more on the line: my lifestyle, job, future, body, and mind. Men can just not pull out, go back to work, and never be questioned about their "capabilities". On top of that, I'd be expected to cater to my partner more: cooking, cleaning, and taking care of the kids, even if I had a full-time job just like my husband. That's just not fair, especially when I'm also expected to make a whole human and push it out of my damn body. I've had situationships with men primarily because I can't commit; I'm terrified my life will become changing diapers and packing his lunch and fucking in only missionary until we die. I run for the hills every time.
I also find that most men are super into me being driven career-wise, but the more attached they get, the more "traditional" they become. They'll bring up getting married and having kids as concepts, even when they know I'm not really into it, as if they think they can change my mind with time.
I also feel like most women I talk to "learn to love" their situation. They'll say "I didn't want to get married, but after he proposed I knew he was the one!" or "I didn't want kids, but after he kept asking, I love our babies more than anything!" All I hear is "I knew what I wanted, and I was pressured into my husband's ideal future." That scares me more than anything else, because these are women with PhDs who are now just their husbands' wives... even after they were clear about what they wanted.
So tell me, is it worth it? Be honest, but be kind. I'm in my early twenties, but as I age I feel more strongly aligned with the mindset outlined here. I'm looking for outside perspectives: whether you felt the same, still feel the same, or don't at all, give me your outlook, how things were for you, and whether you'd change anything.
Much love <3