Hello selfhosters,

I have the opportunity for a free CPU upgrade from a G3930 to an i3-8100, but I'm not sure it's worth it.

The server is currently a happy DIY PC build made from desktop components:

  • Motherboard: MSI Z270-A PRO
  • CPU: Intel Celeron G3930
  • RAM: 8 GB DDR4
  • PSU: 550 W ATX
  • plus storage, no GPU

I'm running Debian with 30+ containers (*arrs, Jellyfin, Nextcloud, Pi-hole, NPM and many more) and it's amazing for our needs (2 users); it pulls around 20-25 W at idle from the wall. I could use a bit more juice to speed up Nextcloud, for example, and maybe to be able to transcode 4K in the future (I don't have any 4K screen yet). Not sure if this upgrade would help at all?

Now I've got the i3-8100 for free, but I'm not sure if there's any downside, like extra power usage at idle? I'd like to avoid replacing the CPU for an unnoticeable performance increase. Comparing specs, it's a pure upgrade, and the i3 is 2x more expensive.

One of the two will probably end up as e-waste in one of the hidden boxes that I keep away from everyone, including myself :D

All suggestions are welcome!

  • @SheeEttin
    1 year ago

    https://www.cpubenchmark.net/compare/2957vs3103/Intel-Celeron-G3930-vs-Intel-i3-8100

    TDP goes from 51 to 65 W, not much of an increase. If you have it set to throttle down when not used, I don’t think you’ll see much of a change.

    Doubling the number of cores is a big performance boost. I would certainly upgrade if you’re considering 4K video, especially if you’re transcoding.

    I’d also read the Jellyfin article about hardware acceleration: https://jellyfin.org/docs/general/administration/hardware-acceleration/
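    The i3-8100’s UHD 630 iGPU can do Quick Sync transcoding, and Jellyfin reaches it through a VAAPI/QSV render node under /dev/dri. A small sketch to check whether the driver actually exposes one (the path and `renderD*` naming are the common Linux defaults, not guaranteed on every setup):

    ```python
    from pathlib import Path

    # Intel iGPUs expose a DRM render node (usually /dev/dri/renderD128)
    # that Jellyfin's hardware acceleration needs access to.
    dri = Path("/dev/dri")
    render_nodes = sorted(p.name for p in dri.glob("renderD*")) if dri.exists() else []

    if render_nodes:
        print("Render nodes found:", render_nodes)
    else:
        print("No render node found - iGPU driver missing or disabled in BIOS")
    ```

    If a node shows up, running `vainfo` will list which codecs the driver actually supports.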

    • @tomten
      1 year ago

      Afaik, TDP isn’t power consumption; it’s more of a guideline for cooler manufacturers, and it isn’t calculated the same way between AMD and Intel.

      • @SheeEttin
        1 year ago

        Yeah, it’s a rough number, but it can be used as a guide to estimate power consumption, max performance, and so on. In this case, running an identical software stack, you can say it would probably result in slightly increased power usage. A newer CPU might mean more efficiency, but you’re also running higher clocks and more cores.

        If you wanted hard data, you’d have to run benchmarks.
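        A crude way to get a comparable number on both chips without installing anything, as a sketch: time a fixed CPU-bound workload. This is single-threaded, so it won't show the i3's extra cores; proper tools like sysbench or Geekbench cover that better.

        ```python
        import hashlib
        import time

        def cpu_workload(iterations=200_000):
            """Time a fixed chain of SHA-256 hashes (single-threaded, CPU-bound)."""
            data = b"benchmark" * 512
            start = time.perf_counter()
            for _ in range(iterations):
                data = hashlib.sha256(data).digest()
            return time.perf_counter() - start

        elapsed = cpu_workload()
        print(f"Workload finished in {elapsed:.2f} s (lower is faster)")
        ```

        Run the same script on the G3930 and the i3-8100 and compare the times; for idle power, only a wall meter will give you real numbers.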

        • @tomten
          1 year ago

          I agree, but many people use it as if it were actual power consumption.