• @Manifish_Destiny · 16 points · edited 2 months ago

    Outlaw vulnerabilities? Do they just get little virtual handcuffs when they’re found? If I find a Microsoft vulnerability I get arrested? Not sure I’m following this one.

    Edit: it’s really obvious most of you haven’t worked in infosec.

    • Queue · 13 points · 3 months ago

      When WannaCry was a major cybersecurity threat, shutting down banks and hospitals, it was found to rely on a backdoor Microsoft intentionally kept open for governments to use.

      https://en.wikipedia.org/wiki/WannaCry_ransomware_attack

      EternalBlue is an exploit of Microsoft’s implementation of their Server Message Block (SMB) protocol released by The Shadow Brokers. Much of the attention and comment around the event was occasioned by the fact that the U.S. National Security Agency (NSA) (from whom the exploit was likely stolen) had already discovered the vulnerability, but used it to create an exploit for its own offensive work, rather than report it to Microsoft.[15][16]

      https://en.wikipedia.org/wiki/EternalBlue

      EternalBlue[5] is a computer exploit software developed by the U.S. National Security Agency (NSA).[6] It is based on a vulnerability in Microsoft Windows that allowed users to gain access to any number of computers connected to a network. The NSA knew about this vulnerability but did not disclose it to Microsoft for several years, since they planned to use it as a defense mechanism against cyber attacks.

      In real life, if I fail to prevent a crime that I know was premeditated, I am guilty of neglecting my duty. Corporations are people thanks to Citizens United, and governments are run by people, so hold them to the same standards they subject the populace to.

      • troed · 6 points · 3 months ago

        Well, your sources don’t say Microsoft kept it. They say the NSA didn’t report it to Microsoft so that they could keep using it.

    • @Limonene · 3 points · 3 months ago

      If you are Microsoft, then yeah. You’d go to jail when a Windows vulnerability is found.

      In all seriousness though: it would more likely be just a civil penalty or a fine. If we did want corporate jail sentences, there are a few ways to do it. These aren’t specific to my proposal about software vulnerabilities being crimes; they’re about corporate accountability in general.

      First, a corporation could have a central person in charge of ethical decisions. That person would go to prison when the corporation was convicted of a jailable offense. They would be entitled to know everything going on in the company, and empowered to hit the emergency stop button on absolutely anything whenever they saw a legal problem. This is obviously a huge change in how things work, and not something that could be implemented any time soon in the US, both because of how much Congress loves corporations and because of how many crimes a company commits on a daily basis.

      Second, a corporation could be “jailed” for X days by fining it X/365 of its annual profit. The calculation would need to counter clever accounting tricks: some companies (like Amazon, I’ve heard) never pay dividends and might report their profit as zero because they reinvest it all into expanding the company. So the criminal fine would have to add back certain types of expenditures.
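
      Just to make the arithmetic concrete, here’s a rough Python sketch of that proposal. All names and figures are mine and purely illustrative, not any actual statute or formula:

      ```python
      # Illustrative sketch of the "corporate jail" proposal above (hypothetical).
      # A corporation "jailed" for `days` pays days/365 of its annual profit,
      # with reinvested expansion spending added back so the bill can't be
      # accounted down to zero.

      def corporate_jail_fine(days: int, reported_profit: float,
                              reinvested_expansion: float) -> float:
          # Adjusted profit counts money plowed back into growing the company,
          # closing the "we made zero profit" loophole described above.
          adjusted_profit = reported_profit + reinvested_expansion
          return (days / 365) * adjusted_profit

      # Example: a 90-day "sentence" for a company that reports $0 profit
      # after reinvesting $10B into expansion.
      print(corporate_jail_fine(90, 0.0, 10_000_000_000))  # ~2.47e9, i.e. about $2.47B
      ```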

    • Pennomi · 2 points · 3 months ago

      Presumably it means that, once exploited, a vulnerability becomes an offense the DOJ can fine the company for. I think that’s quite reasonable.

      • BlueÆther · 4 points · 3 months ago

        I’d go further: an unpatched vulnerability is an offense that the DOJ can fine the company for.

        • Pennomi · 2 points · 3 months ago

          Sounds fair enough to me.