u/Critical_County2791

As a healthcare provider, I despise health insurance companies and all the ways they make my life and my patients' lives harder. I also resent the barriers they've put in place to accessing my own healthcare.

I'm curious what employees at major health insurance companies (e.g., United, Aetna) think of their own and their company's work. Do higher-ups try to convince you that your work is actually valued and appreciated by most Americans, and that the public simply doesn't understand? How do you rationalize being in this field, of all the areas you could devote your professional life to?

I'm genuinely curious how this looks on the "other side".
