Kind of a vague question, I know. But I guess anyone who responds can state their own interpretation.

Edit: I guess I’m asking because everything I’ve learned about America seems not to be what I was told? I don’t know how to explain it. It feels like the USA is one event away from outright civil war, open corruption, and turning into a D-class country.

  • @[email protected] (OP) · 7 months ago
    I guess that’s not quite what I’m thinking either. It just feels like the “image” of America isn’t what America actually is. Like there’s a marketing campaign to make things seem better than they actually are.

    • @[email protected] · edited · 7 months ago

      I mean, yeah, stuff like “land of the free”, “the land of opportunity”, or “the American dream” are just slogans. But I think most people realise that by now.

      • amio · 7 months ago

        “The American dream” was socioeconomic mobility; that shit is for commies these days.

    • amio · 7 months ago

      That’s just every country; countries would hardly try to look worse than they are.

    • @okamiueru · 7 months ago

      The image of the USA is not good at all, if that’s what you’re asking. I used to care, but sometime around 2016 I simply gave up. Something about an obvious grifter and professional fuckwit being seriously considered to lead anything other than a burger to his fat face. The alternative, although infinitely better, is clearly suffering from some dementia. It’s just a shit show.

      And that’s just the politics. But it mirrors most of the other fucked-up things in the US: the obvious and effective approaches are simply not considered. So… best not to spend too much effort on it, and hope the impact when it reaches critical mass isn’t too bad.