fallaciousBasis

  • 0 Posts
  • 394 Comments
Joined 3 months ago
Cake day: January 3rd, 2026

  • Your thought experiment quietly assumes that if a motivation includes pleasure, it becomes morally suspect. Odd. That’s not how we treat most other domains. People accept environmental damage for travel, resource use for comfort, even risk to others in things like driving or gun ownership or cohabitation. It’s typically because those activities provide value beyond bare survival.

    Again, you can argue animal agriculture crosses a line, but then you need to explain why this tradeoff is categorically different, not just say “what if it tasted bad.”








  • That sounds clean, but it assumes that animals and humans sit in the same moral category. A lot of people simply don’t grant that equivalence. They can fully imagine the suffering and still judge it differently because they see humans as having a higher moral status.

    There’s also a leap from empathy to obligation (the classic is–ought gap). Humans routinely recognize suffering and still accept it under various tradeoffs. Medicine, construction, law enforcement, even wildlife management all involve harm that people wouldn’t personally want inflicted on themselves. The fact that someone can imagine pain doesn’t mean they conclude “this must never happen.” It means they decide whether that harm is justified within their own moral framework. In the case of livestock, many decide that food, culture, convenience, or nutrition justify it.

    Just look at guns, drugs and cars… And all the dead children. I digress.

    Not all land can grow crops. A large share of grazing land is only usable through animals. Most livestock systems convert otherwise inedible biomass into food. On the other hand, industrial animal agriculture clearly has major negative and positive impacts. The reality isn’t a clean moral on-off switch. It’s a messy optimization problem with mutual tradeoffs, and people naturally disagree on where the balance ought to land, and for whom the bill tolls.

    Animals avoid suffering, but many also inflict it without moral hesitation (fight vs. flight). Humans are perhaps the only ones trying to build any ethical system at all, so disagreement is predictable. The presence of empathy doesn’t produce one universal conclusion, nor should we expect it to, given the endless facets of those doing the perceiving.

    Cheers.








  • 7 was a facelift to Vista. It changed the control panels back to the dipshit XP-like way. That’s not an improvement.

    The kernel improvements landed in Vista. 7 inherited them. They “fixed” the driver support in Vista. 7 also inherited that.

    I’m a Windows developer. Well, more so was during the release of Vista and 7, and certainly prior. Much less these days. I loved Vista. It worked flawlessly. I’ve also used 7 extensively. They’re near-exactly the same OS as far as the kernel, driver model, and APIs go.

    Maybe the difference is that support for Vista ended while 7 got continued support. That’s really the biggest difference, IMO.


  • Because it scales logarithmically/exponentially just fine. Dynamic range here matters more than fine-grained precision.

    You get a handful of values around zero. A handful of medium values. And a handful of increasingly large values.

    Like in unsigned 4-bit integers (for AI) you’d likely have something binary/exponential like: 0, 1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048, 4096, 8192, 16384.

    Instead of 0 to 15 (linear). This is also how posits work. John Gustafson designed posits with AI in mind, and he explains better than I could how these tiny 4/8-bit types can fill in for much bigger types with minimal cons and massive pros (reduced memory and reduced compute). 16,384 is what 14 typical bits gets you (2^14), but by scaling you can cover a similar range while sacrificing the “fine grain” precision that AI doesn’t really benefit from. It’s kind of similar to how floats work, with a sign bit, exponent, and mantissa. But most of the time people want binary floats for AI.

    So you might get something like:

    0, 1/8, 1/4, 1/2, 1, 2, 4, 8 (also negatives)

    Or even:

    0, 1/32, 1/16, 1/4, 1, 4, 16, 32 (also negatives)

    Or even:

    0, 1/1000, 1/100, 1/10, 1, 10, 100, 1000 (also negatives)

    Honestly, AI doesn’t really care as long as you stick with the same scheme.
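    The power-of-two spacing described above can be sketched in a few lines. This is a toy illustration, not any real 4-bit format: `make_exp_codebook` and `quantize` are made-up names, and the signed level choice (±1/8 … ±8 plus zero) is just one of the example schemes listed above.

```python
def make_exp_codebook():
    """Build a signed exponential codebook: 7 positive power-of-two
    levels (1/8 ... 8), their negatives, and zero -- 15 codes total,
    which fits in 4 bits with one code to spare."""
    pos = [2.0 ** e for e in range(-3, 4)]
    return sorted([-v for v in pos] + [0.0] + pos)

def quantize(x, codebook):
    """Round x to the nearest representable level in the codebook."""
    return min(codebook, key=lambda v: abs(v - x))

cb = make_exp_codebook()
print(quantize(0.11, cb))   # snaps to 0.125 -- fine steps near zero
print(quantize(5.0, cb))    # snaps to 4.0 -- coarse steps far from zero
```

    The point the comment makes is visible here: values near zero land on closely spaced levels, while large values get rounded coarsely, trading fine-grained precision for range.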


  • A simplified explanation is that decryption keys are delivered when a device with access plays DRM-protected media, in order to make it viewable. You can then intercept this with various tools, take those keys, and use them to decrypt such media by essentially acting as that device. A “WEB-DL” is something that has been successfully downloaded and decrypted using the right set of keys.




  • That’s the thing… 7 is Vista. Just with a new UI.

    Windows 2000 is XP. XP just has a new interface. 7 and XP are for consumers. 2000 and Vista are more like server editions (even the pro versions.)

    Vista has Aero but also the classic server (Windows NT/2000) UI. Even modern server versions have essentially the same classic control panels. And server versions don’t have all the extra bullshit (7 wasn’t so bad yet, but 10 and 11 are). That’s why I prefer to run Windows Server as my desktop.

    What made Vista/7 great were the under-the-hood (kernel) improvements, particularly to the threading model. They made safe handling non-negotiable. This retroactively fixed countless programs and improved overall system stability significantly.

    There’s really no other major difference between Vista and 7 besides aesthetics (the shell).