• @gi1242

    I used Gentoo for 3 years. In hindsight, I wasted so many CPU cycles just because I thought --march=native would make things faster.

    nope.

    You know what made things faster? Switching to Arch 😂
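
    For what it's worth, -march=native is just a compiler flag set in Gentoo's /etc/portage/make.conf. A minimal sketch of the kind of config involved (the flag values and job count are illustrative, not a recommendation):

    ```shell
    # /etc/portage/make.conf (sketch)
    # -march=native makes GCC/Clang target the CPU doing the build,
    # so the resulting binaries may not run on older machines.
    COMMON_FLAGS="-O2 -pipe -march=native"
    CFLAGS="${COMMON_FLAGS}"
    CXXFLAGS="${COMMON_FLAGS}"
    # Parallel build jobs, roughly one per core.
    MAKEOPTS="-j8"
    ```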

    • @atmur

      When CPUs were a lot slower you could genuinely get noticeable performance improvements by compiling packages yourself, but nowadays the overhead from running pre-compiled binaries is negligible.

      Hell, even Gentoo optionally offers binary packages now.
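
      For reference, Gentoo's official binary packages are enabled through a binhost entry plus a FEATURES setting. A hedged sketch (the exact sync-uri depends on your arch and profile; verify before copying):

      ```shell
      # /etc/portage/binrepos.conf/gentoobinhost.conf (sketch; this
      # sync-uri is the amd64 layout -- check it against your profile)
      [binhost]
      priority = 9999
      sync-uri = https://distfiles.gentoo.org/releases/amd64/binpackages/17.1/x86-64/
      ```

      With FEATURES="getbinpkg" in make.conf, emerge then fetches prebuilt packages when available instead of compiling them.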

      • @[email protected]

        Most of the reason to build your own packages is a form of runtime assurance - to know that what your computer is running is 100% what you intend.

        At least, as a Guix user, that's what I tell myself.

        • @ByteJunk

          Compiling your own packages only ensures that, well, you’re running packages that you compiled. This definitely does not mean that your computer is running what you intend at all.

          Half the time I don’t know what my CPU is executing, and that’s code that I wrote myself.

          • @Skullgrid

            This definitely does not mean that your computer is running what you intend at all.

            This is true of all programming

            • @ByteJunk

              I like to imagine that the early heroes who programmed with punch cards and basically raw machine code knew exactly what the CPU was running, but who knows…

        • @[email protected]

          Do you audit all the code before compiling? Otherwise you’re just transferring your trust elsewhere.

          • @Bassman1805

            This is my experience playing with FreeBSD.

            “These ports are cool, I can compile all the software from source so I know exactly what I’m getting!”

            [This software has 100 dependencies]

            “Well I’m not reading all that, I’ll just click Yes for all”

      • @Im_old

        Yes, I tried it around 2002/2003, back when the recommended way was to install from stage1. I think I had a P4 with HT. It was noticeably faster than Red Hat or Mandrake (yes, I was distro hopping a lot). Emerging gnome-mono was an overnight run; OpenOffice took about 24 hours.

        Lots of wasted time, but I did learn how to set up some things manually.

        • @gi1242

          Once there was a bug with the dependencies of transcode and some other package (mplayer, I think). It would ask to downgrade one and upgrade the other. Then, several hours of compiling later, it would agree to upgrade both. Then, several more hours of compiling later, it would again want to downgrade one.

          I think there was a groove worn in my hard drive from this

          • @Im_old

            Oh yeah, I remember those. My solution was to not emerge anything for 24 hours; by the next day they'd usually fixed the issue.

        • @gi1242

          So even after 24 hours of compiling you're not done! You still need to dispatch-conf through so many config files…

    • Illecors

      I did jump onto Gentoo ship chasing performance, but stayed because of USE flags.
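
      For anyone unfamiliar: USE flags toggle optional features at build time, either globally in make.conf or per package. A sketch with purely illustrative flag choices:

      ```shell
      # Global flags in /etc/portage/make.conf (illustrative choices):
      USE="wayland pipewire -systemd"

      # Per-package overrides go in /etc/portage/package.use/, one atom
      # per line, e.g. (hypothetical flags for illustration):
      #   media-video/vlc wayland -X
      ```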

    • Possibly linux

      You know what was even faster? Switching to something easier like Fedora, Linux Mint, or Debian.