I recently moved to California. Before I moved, people asked me, "Why are you moving there? It's so bad." Now that I'm here, I understand the criticism even less. The state is beautiful, and there is so much to do.
I know the cost of living is high, and people think the gun control laws are ridiculous (I actually think they're reasonable, for the most part). A guy I work with here says "the policies are dumb," but he can't give me a solid answer on what's actually so bad.
So, what is it that California does (policy-wise) that people hate so much?
And that's not America projecting itself? I'm a Pacific Islander; I could tell you a whole lot about what America has done. It wants itself everywhere. That's projection if I've ever seen it.
No, that's not projection; that's American imperialism. And it's very, very different from what you originally described.