Comcast says its “10G” branding represents a 10-gigabit cable internet network they are building (it doesn’t exist), so they are basically changing the meaning of the G from “generation” to “gig” to act like 10G is five generations better than 5G (or twice as fast)… or that they actually have a 10-gigabit network. Neither is accurate. It’s still just cable internet that people have to use because they have no other option.

Fuck Comcast.

I read online that they are abandoning the “confusing” 10G branding, but I just saw a commercial for it. They think all of their customers are morons, and they count on folks having no other choice in a lot of cases.

Apologies to anyone outside the United States; this is just complaining about our poor internet options and the deceptive advertising of greedy corporations.

  • @[email protected]
    link
    fedilink
    English
    559 months ago

    You are conflating internet service speeds and mobile network generations. I work for an ISP; I hear this all the time. Especially since there’s also “5G WiFi,” which is actually the 5 GHz radio band. People confuse it all, and it’s understandable, but still annoying.

    My company offers 1 Gbps service. No one is getting confused by that yet, but our modems have 2.5 Gbps Ethernet ports now, and I had a customer who was outraged the other day because “Your modem is only 2.5G and all my devices use 5G! You need to send me a 5G modem!!” FFS

    • Nougat · 23 points · 9 months ago

      Sure, but they really should be describing it as 10Gb (gigabit). Even that could easily get confused with 10GB (gigabyte), which would be used for a file size.
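
      For anyone keeping score, that mix-up is a factor of eight. A quick sketch (Python, with purely illustrative numbers):

      ```python
      # Marketing speeds are in bits per second; file sizes are in bytes.
      link_speed_gbps = 10   # a "10 Gb/s" link (gigaBITS per second)
      file_size_gb = 10      # a "10 GB" file (gigaBYTES)

      link_speed_gbytes = link_speed_gbps / 8     # = 1.25 GB/s
      seconds = file_size_gb / link_speed_gbytes  # = 8 s

      print(f"A {file_size_gb} GB file over a {link_speed_gbps} Gb/s link "
            f"takes at least {seconds:.0f} s, ignoring overhead")
      ```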

      • @Mr_Dr_Oink · 8 points · 9 months ago

        Internet providers have always done this. It’s not a new thing.

        • @[email protected]
          link
          fedilink
          English
          39 months ago

          Not just internet providers. Data communication speeds have always been in bits per second. Historically it makes perfect sense.

           Specifying speed in bytes per second would be inconvenient because, while we eventually settled on 8 bits per byte, that was not the case in the early days of computing. 6-bit bytes were common, but other sizes were used too: 7, 8, 9, 10, and sometimes even larger.

           So when you’re talking about communication between different types of computers with different byte sizes, it would be confusing to use bytes/second as a unit.
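
           To make that concrete, here’s a toy sketch (Python; the byte widths are real historical sizes, but the line rate is just an example):

           ```python
           # The same line rate gives a different "bytes per second" figure
           # depending on the machine's byte width - which is why bits/s
           # became the unambiguous unit for links.
           line_rate_bps = 9600  # an example serial line rate

           for byte_width in (6, 7, 8, 9):  # all historically real byte sizes
               print(f"{byte_width}-bit bytes: "
                     f"{line_rate_bps / byte_width:.0f} bytes/s")
           ```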

          • @[email protected]
            link
            fedilink
            English
            1
            edit-2
            9 months ago

            Even now that we’ve by and large settled on 8 bits per byte it’s still useful to call out the communication rate as distinct from the actual payload data transfer rate, as there are other sources of overhead.

             You’ll never actually see a 1 MB/s transfer over an 8 Mbps connection, because some of those bits go to things like packet headers, keep-alive messages, and so on.
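
             Rough numbers, as a sketch (Python; the framing constants are the usual TCP/IPv4-over-Ethernet figures, and real links lose even more to retransmits, ACKs, etc.):

             ```python
             # Why an "8 Mbps" link never delivers a full 1 MB/s of payload:
             # every full-size packet carries protocol overhead.
             MTU = 1500             # IP packet size on standard Ethernet
             IP_TCP = 20 + 20       # IPv4 + TCP headers, no options
             ETH = 14 + 4 + 8 + 12  # header + FCS + preamble + inter-frame gap

             payload = MTU - IP_TCP  # 1460 bytes of actual data per packet
             on_wire = MTU + ETH     # 1538 bytes on the wire per packet

             link_bps = 8_000_000
             goodput = link_bps / 8 * payload / on_wire
             print(f"~{goodput / 1e6:.2f} MB/s of payload")  # ~0.95, best case
             ```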

        • @hikaru755 · 3 points · 9 months ago

          Doesn’t really matter for the point they’re making, does it?

            • WorseDoughnut 🍩 · 2 points · 9 months ago

              I don’t think the average person even knows GiB exists, since Windows and all the random flash drive manufacturers have mislabeled and confused the two for ages now.
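
              The gap in one sketch (Python; drive makers count decimal bytes, while Windows counts binary gibibytes but labels them “GB”):

              ```python
              # A "1 TB" drive: sold in decimal units, shown in binary units.
              bytes_total = 1 * 10**12   # 1 TB as the manufacturer counts it

              gib = bytes_total / 2**30  # gibibytes (what Windows displays)
              print(f"1 TB = {gib:.0f} GiB, shown as '{gib:.0f} GB' in Windows")
              # -> 931, which looks like "missing" space to most users
              ```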

      • @[email protected]
        link
        fedilink
        English
        1
        edit-2
        9 months ago

        I think you’ll find, if you look deep, that the average customer IS DUMB AS FUCKING BRICKS AND DOESN’T KNOW WHAT FROM WHAT, BIG NUMBER GOOD