What specifically would change about America that scares you if we became much more progressive?
This is a very genuine question. As someone who has been on the left my whole life, I’ve truly just never understood why many conservatives fear the country becoming more progressive.
I guess I just don’t really understand what changes would come about that would be enough of a net negative to make people feel that moving at all to the left is going to decimate our country. Obviously I view ideals on the right as more harmful (in my own personal opinion), but I would like to hear how you actually believe progressive ideology would play out in this country, and what specific changes scare you the most.