Modern society pisses me off
You're supposed to work, but getting a job has become a privilege. You're inundated with CONSUMER SLOP that ends up in goddamn LANDFILLS for African kids to clean up. You're told to "go to college" instead of being able to contribute to the workforce with on-the-job training after high school. Young people are losing more and more hope as time goes on. We always worry about "the economy" when it's just made-up math with made-up numbers. Economic growth is a CANCER, yet we rely on it.
This all feels meaningless. I don't know. Am I alone?