• @Alteon
    link
    4
    7 months ago

    This is like the definition of a “conservative”. Progress shouldn’t happen because they’re not ready for it. They are comfortable with what they use and are upset that other people are moving ahead with new things. New things shouldn’t be allowed.

    Most games have the ability to downscale so that people like this can still play. We don’t stop all progress just because some people aren’t comfortable with it. You learn to adjust or catch up.

    • @[email protected]
      link
      fedilink
      13
      7 months ago

      More “conservative” in terms of preserving the planet’s resources.

      You don’t need Gigabytes of RAM for almost any consumer application, as long as the programming team was interested/incentivized to write quality software.

      • NekuSoul
        link
        fedilink
        10
        7 months ago

        I think the examples given are just poorly chosen. When it comes to regular applications and DRM, then yes, that’s ridiculous.

        On the other hand, when it comes to gaming, then yes, give me all the raytracing and visible pores on NPCs. Most modern games also scale down well enough that it’s not a problem to have those features.

    • queermunist she/her
      link
      fedilink
      5
      7 months ago

      It’s conservationist, reducing hardware requirements to lengthen the lifetime of old hardware.

      • @[email protected]
        link
        fedilink
        1
        7 months ago

        Less on general software and more on the gaming side: why target the iGPU, then? Although it’s common, even something nearly a decade old would be an instant uplift in gaming performance. The ones who typically run into performance problems are mostly laptop users, and that’s the segment of the industry most wasteful with old hardware, since unless you own a laptop like a Framework, the user constantly replaces the entire device.

        I, for one, am always behind lengthening the lifetime of old hardware (hell, I just replaced a decade-old laptop recently), but there’s an extent to the expectations one should have. E.g., don’t expect to be catered to iGPU-wise if you willingly picked a pre-Tiger Lake iGPU. The user intentionally picked the worse graphics hardware, and catering the market to bad decisions is a bad move.

        • queermunist she/her
          link
          fedilink
          3
          edit-2
          7 months ago

          I, for one, hate the way PC gamer culture has normalized hardware obsolescence. Your hobby is just for fun; you don’t really need to gobble up so much power and rare-earth minerals and ever-thinner wafers all just to throw away parts every six months.

          I have plenty of fun playing ASCII roguelikes and I do not respect high-performance gaming. It’s a conservationist nightmare.

          • @[email protected]
            link
            fedilink
            2
            7 months ago

            Who’s throwing away stuff every six months? Hardware cycles aren’t even remotely that short; hell, Moore’s law was never that short in the entire existence of said law. And it’s not like I don’t do my fair share of preventing hardware waste (my literal job is the refurbishing and resale of computer hardware; I’m legitimately doing more than the average person, trying severalfold to maintain older hardware). But it’s not my job to dictate what is fun and what’s not. What’s fun for you isn’t exactly everyone else’s definition of fun.

    • @PopOfAfrica
      link
      1
      7 months ago

      Honestly, we are hitting the budgetary limits of what game graphics can do, for example.

      A lot of new games look substantially worse than The Last of Us Part II, which ran on ancient hardware.

      • @Alteon
        link
        6
        7 months ago

        “Limitations foster creativity.”

        100% agree. But there’s no reason to limit innovation because some people can’t take advantage of it. Just like we shouldn’t force people to consistently upgrade just to have access to something, there should, however, be a limit to this. Twenty years of tech changes is huge. You could get 2 GB of RAM in most home computers back in the early-to-mid 2000s… that’s two decades ago.

        I’m still gaming on my desktop that I built 10 years ago quite comfortably.

      • @Bytemeister
        link
        Ελληνικά
        5
        7 months ago

        Somebody didn’t live through the “Morrowind on Xbox” era, where “creativity” meant intentionally freezing the loading screen and rebooting your system in order to save a few KB of RAM so the cell would load.

        But also having no automatic corpse cleanup, so the game would eventually become unplayable: entities died outside of your playable area, where you couldn’t remove them from the game, creating huge bloat in your save file.

        Not all creativity is good creativity.

    • Übercomplicated
      link
      fedilink
      -3
      7 months ago

      The topic is bloatware, not games. Very different. When it comes to gaming, the hardware costs are a given (for the sake of innovation, as you put it); but when it comes to something fundamental to your computer—think of the window manager or even the operating system itself—bloat is like poison in the hardware’s veins. It is not innovation. It is simply a waste of precious resources.

      • NekuSoul
        link
        fedilink
        7
        7 months ago

        The topic is bloatware, not games.

        The original post includes two gaming examples, so it’s actually about both, which is a bit unfortunate, because as you’ve said, they’re two very different things.

        • Übercomplicated
          link
          fedilink
          1
          7 months ago

          I suppose ray-tracing is rather suggestive of games, you’re right. Well, I’ll take it as an accident by the author and rest easy. Thanks for the correction!