I’ve seen posts about American politics all over Reddit multiple times, like now there’s something about immigrants eating pets? 🤷 I’m sure I glanced at a news headline about Donald Trump saying that, but like most people I just scrolled right past it.

Most of the world really doesn’t know or care about what’s going on in America, but most Americans I come across online seem to think they’re the centre of the universe 😂

There are people dying all over the world in catastrophes their own countries are experiencing, from knife crime in England and migrants drowning in boats near France, all the way to deadly floods in Poland and Austria. Do you really think people will care about the delusional nonsense American politicians say?