• @[email protected]
    link
    fedilink
    English
    1 year ago

    The cable has to carry the negotiated power safely. It’s not unnecessary, it’s absolutely critical. I’ve personally seen and diagnosed what happens when this fails.

    For low-power applications there’s no need for a marker chip, and the spec allows for exactly that.

    • TWeaK
      link
      fedilink
      English
      1 year ago

      It wouldn’t be critical if the cables were suitably rated for the specification. If you put a 0.5A cable in a 3A circuit, you’re gonna have a bad time. If you use a 3A or better cable, then you don’t need a cable chip to tell the actual devices to only work at 0.5A.
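      For the low-current cases, Type-C doesn’t actually need a chip in the cable at all: the source advertises its current offer with a pull-up resistor (Rp) on the CC line, and the sink reads the resulting voltage. A rough sketch of that sink-side logic, with threshold voltages that are only approximations of the spec’s vRd ranges, not exact figures:

```python
# Hypothetical sketch of how a USB Type-C sink infers the source's
# advertised current from the voltage it measures on the CC line.
# Threshold values below approximate the Type-C spec's vRd detection
# ranges; treat them as illustrative, not authoritative.

def advertised_current_amps(cc_voltage: float) -> float:
    """Map a measured CC-line voltage (in volts) to the source's offer."""
    if cc_voltage > 1.31:    # Rp advertising 3.0 A
        return 3.0
    if cc_voltage > 0.70:    # Rp advertising 1.5 A
        return 1.5
    if cc_voltage > 0.25:    # Rp advertising default USB power
        return 0.5
    return 0.0               # nothing attached / no valid source
```

      Note this only identifies the *source’s* offer; a plain resistor scheme has no way for the cable itself to report its own rating, which is the gap the e-marker fills.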

      • @[email protected]
        link
        fedilink
        English
        1 year ago

        How do you have the cable correctly identify itself if you don’t put some smarts in it? Or are you saying we should only be able to buy expensive cables fully rated for 100W (or higher, as the spec has been updated)? And how do you prevent an older cable rated for 100W from being abused in a newer 200W circuit?

        Divider resistors are okay, but the IC is a better choice for future-proofing and reliability.