Senate Minority Leader Mitch McConnell appeared to freeze for about 30 seconds on Wednesday while speaking with reporters after a speech in Covington, Kentucky.

The incident is similar to an episode McConnell experienced at the US Capitol late last month and is likely to raise additional questions about the fitness of the 81-year-old to lead the Senate Republican caucus.

Wednesday’s episode occurred when a reporter asked the Republican leader if he was planning to run for reelection in 2026. McConnell had to ask him to repeat the question several times, chuckled for a moment, and then paused.

Someone at his side then asked him, “Did you hear the question, senator, running for reelection in 2026?” McConnell did not respond.

Article includes video of the incident.

  • @TheMauveAvenger · 18 points · 1 year ago

    Utilitarian moral theory is an entire field of ethical philosophy, even if you don’t agree with it. Someone who believes strongly in this could argue that we should have actually killed Mitch decades ago to prevent all of the pain, suffering, and death that he has wrought on this world.

    I think what you meant to say is that you feel morally superior in some way.

    • @Intralexical · 1 point · 1 year ago

      The same utilitarian ends could have been reached, and should have been reached, years ago, in any one of thousands of different ways that wouldn’t require this.

      Fuck the ghoul and all he’s done. But just think of the level of political degeneration required to get to the point where a malicious actor gradually being paralyzed by illness in public is your most feasible option for resolving a political problem. I can understand why you would cheer for this, but the reminder that your country is even in such a situation that cheering for it makes any sense at all is itself something that I find horrifying.

      Besides, the entire point of utilitarianism is to evaluate possibilities dispassionately and choose what results in the most useful outcome and end state. Like the other commenter said, “There is nothing utilitarian about taking ghoulish delight in the suffering of an evil old man.” The entire concept of evil is more of a deontological or virtue-based construct. He’s on his way out, great; celebrate the fact that he’s no longer causing harm. But once he’s gone, he actually doesn’t matter anymore. Contain the damage, then damnatio memoriae.

      Idk. I feel for y’all, I guess. In a well-functioning democracy, this should have probably been resolved almost an entire generation ago, and you should barely even have any reason to know his name anymore. Everything about this situation is horrifying, from what he’s done to what’s happening to him now to the fact that he even still matters at all in the first place.

    • Cylusthevirus · -1 point · 1 year ago

      There is nothing utilitarian about taking ghoulish delight in the suffering of an evil old man.

      He’s falling apart and you’re enjoying it because you hate him. At least be honest with yourself.

    • @afraid_of_zombies · -5 points · 1 year ago

      I don’t believe in utilitarian morality. Utilitarians have no means to calculate utility, they never solved the utility monster problem, it is a single-value virtue system, and it ignores how humans work. Especially the last point: if morality doesn’t help humans live a better life, it serves no purpose.

      • @Intralexical · 2 points · 1 year ago

        Not to mention that the first example people reach for when trying to explain utilitarianism is always using it to justify setting a precedent of murder, as long as doing so might feasibly produce benefits for somebody else in the future.

        Plus the future is never certain, while the present is arguably more real in any moment. Consequentialist frameworks in general are basically betting away guaranteed morality in the present in exchange for possible/imagined gains in the future.

        Personally, I’ve also found that because everything is just so complicated, depending on what axioms you start from, a consequentialist perspective can not only justify but make compulsory literally any awful and disastrous thing, both to yourself and to other people.

        (Plus, if you actually try to apply and use utilitarianism or any consequentialist framework, you necessarily end up having to introduce some sort of “causal event horizon”: how far forward into the future you are willing and able to try to predict events in order to evaluate the consequences of actions. Which means your ethics change depending on your computing power. Plus you’re in constant and actually agonising pain, because always having to try as hard as you can to feel the consequences of future outcomes as far in advance as possible is fucking excruciating. …Or maybe that was a specific situation. But still, what use is ethics if it just mutilates you as soon as difficult questions need to be asked?)


        Utilitarianism: The preferred ethical system for edgy youth, sheltered academics, idealistic extremists, and B-movie villains since the 19th century! …I say this having previously positioned myself somewhere between those categories.

        I do tend to think deontology is just delusion and virtue is just ego, though. …Human “philosophy”, and academics: maybe some dead old dudes paid by kings and churches to sit around doing nothing but think weren’t actually the best qualified to empathise with how people actually live, or to understand the right and wrong ways to do it. The incentives for sounding good in a clean and neat published paper can hardly be expected to align with the incentives of actually being applicable to the messiness of reality, can they?


        Just don’t hurt people, ffs. (And define “hurt” rigorously if you need to: in terms of game theory and preserved options for autonomous, constructive personal growth and equitable power dynamics; epistemology and the preservation of knowledge and feeling; metaphysics and the preservation of identity and self; and thermodynamics and the avoidance of nonconsensual entropification.) Just don’t hurt people; it shouldn’t be that complicated…