• @Limonene
    29 days ago

    This is stupid. Their justification is an “unusual degree of vulnerabilities.”

    So why not outlaw vulnerabilities? Impose real fines or jail time, or at the very least a civil liability that can’t be waived by EULA. Better than an unconstitutional bill of attainder.

    • NaibofTabr
      29 days ago

      So why not outlaw vulnerabilities?

      Of course! If we make vulnerabilities illegal, then all the programmers will make perfect software! The solution was so easy!

      • @scrion
        29 days ago

        There is definitely a difference in quality when we’re talking about important software.

        Also, “outlawing vulnerabilities” would not mean simply assuming everyone starts making perfectly secure software. Rather, you’re fined if you can’t prove your processes are up to spec and that you adhered to best practices during development. Additionally, vendors are obliged to maintain their software and keep it secure.

        And surprise, surprise, the EU ratified laws that do exactly that (and more) recently. In fact, they’ll be in effect very soon:

        https://en.m.wikipedia.org/wiki/Cyber_Resilience_Act

    • @Manifish_Destiny
      edited · 2 days ago

      Outlaw vulnerabilities? Do they just get little virtual handcuffs when they’re found? If I find a Microsoft vulnerability I get arrested? Not sure I’m following this one.

      Edit: it’s really obvious most of you haven’t worked in infosec.

      • Queue
        29 days ago

        When WannaCry was a major threat to cybersecurity, shutting down banks and hospitals, it was found that it used a backdoor Microsoft intentionally kept open for governments to use.

        https://en.wikipedia.org/wiki/WannaCry_ransomware_attack

        EternalBlue is an exploit of Microsoft’s implementation of their Server Message Block (SMB) protocol released by The Shadow Brokers. Much of the attention and comment around the event was occasioned by the fact that the U.S. National Security Agency (NSA) (from whom the exploit was likely stolen) had already discovered the vulnerability, but used it to create an exploit for its own offensive work, rather than report it to Microsoft.[15][16]

        https://en.wikipedia.org/wiki/EternalBlue

        EternalBlue[5] is a computer exploit software developed by the U.S. National Security Agency (NSA).[6] It is based on a vulnerability in Microsoft Windows that allowed users to gain access to any number of computers connected to a network. The NSA knew about this vulnerability but did not disclose it to Microsoft for several years, since they planned to use it as a defense mechanism against cyber attacks.

        In real life, if I am aware that a crime was premeditated and do nothing to prevent it, I am guilty of failing in my duty. Corporations are people thanks to Citizens United, and governments are run by people, so hold them to the same standards they subject the populace to.

        • troed
          28 days ago

          Well. Your sources don’t say Microsoft kept it. They say NSA didn’t report it to Microsoft so that they would be able to keep using it.

      • @Limonene
        29 days ago

        If you are Microsoft, then yeah. You’d go to jail when a Windows vulnerability is found.

        In all seriousness though: it would be more likely to be just a civil penalty, or a fine. If we did want corporate jail sentences, there are a few ways to do it. These are not specific to my proposal about software vulnerabilities being crimes; it’s about corporate accountability in general.

        First, a corporation could have a central person in charge of ethical decisions. They would go to prison when the corporation was convicted of a jailable offense. They would be entitled to know all the goings-on in the company, and to hit the emergency stop button for absolutely anything whenever they saw a legal problem. This is obviously a huge change in how things work, and not something that could be implemented any time soon in the US, because of how much Congress loves corporations and how many crimes a company commits on a daily basis.

        Second, a corporation could be “jailed” for X days by fining them X/365 of their annual profit. This calculation would need to counter clever accounting tricks. For example, some companies (like Amazon, I’ve heard) never pay dividends and might list their profit as zero because they reinvest it all into expanding the company. So the criminal fine would take some types of expenditures into account.
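        The X/365 fine can be sketched in a few lines. This is purely a hypothetical illustration of the proposal above, not any actual statute; the idea of adding reinvested earnings back into the profit base is one possible way to counter the zero-reported-profit trick, and the function and figures are invented for the example.

        ```python
        def corporate_jail_fine(annual_profit: float, reinvested_profit: float, days: int) -> float:
            """Fine equivalent to 'jailing' a corporation for `days` days:
            days/365 of annual profit, with reinvested earnings added back
            so a company reporting zero profit still pays."""
            effective_profit = annual_profit + reinvested_profit  # counter the "reinvest everything" trick
            return effective_profit * days / 365

        # Hypothetical: $0 reported profit, $10M reinvested, 30-day "sentence"
        print(round(corporate_jail_fine(0, 10_000_000, 30), 2))  # -> 821917.81
        ```

        A full 365-day “sentence” would then cost exactly one year’s effective profit, which is the intuition behind the X/365 scaling.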

      • Pennomi
        29 days ago

        Presumably, once exploited, a vulnerability becomes an offense that the DOJ can fine the company for. I think that’s quite reasonable.

        • BlueÆther
          29 days ago

          I’d go further: an unpatched vulnerability is an offense that the DOJ can fine the company for.

          • Pennomi
            29 days ago

            Sounds fair enough to me.

    • Queue
      edited · 29 days ago

      Because the NSA, CIA, and FBI love them. Vault 7, Magic Lantern, Intel ME and AMD PSP, Dual elliptic curve, COTTONMOUTH-I, ANT/TAO catalog, etc.

      Hell, Microsoft willingly reports vulnerabilities and exploits to the government for them to use.

      North Korea wishes it had this level of control over the goods its citizens willingly buy.

    • @[email protected]
      29 days ago

      Why not?

      Well…

      It discourages self-reporting, makes vendors hostile to security researchers, opens the door to endless litigation over whose component actually “caused” a vulnerability, and encourages CYA culture (like following a third-party spec you know is bad rather than writing a good first-party one, because it guarantees the blame falls on another party).

      In a complex system with tight coupling, failure is normal, so you want to have a good way to monitor and remedy failure rather than trying to prevent 100% of it. The last thing you wanna do is encourage people to be hostile to failure-monitoring.

      (See also: Normal Accident theory)