Is it a problem to always tell women "get home safe"?
So a celebrity I follow recently posted a statement about how we have this habit of telling women and girls "get home safe" rather than telling men and boys "don't rape or kill anyone tonight." I was wondering if you had thoughts on how we balance women making certain choices, like not dressing a certain way, for their practical and immediate safety, against actual complicity in accepting the world as it is?
I am also curious about the specific framing of the quote or slogan, since women here are portrayed as men's prey and men as women's predators. Does it portray men as inherently dangerous individuals rather than as a group socialized into violence, or am I missing the point? Her statement does address socialization, since it mentions what boys are told, but I also see rape culture as a broader societal issue, not just a question of what men and boys are like.
Anyway, perhaps I am speaking from a position of ignorance and privilege and missing the point entirely, but I would love to know your thoughts as feminists.