• @voracitude
    6 months ago

    Yep! From the article:

    For context on the size of the brain sample and the data collected from it, we need to get into mind-numbingly colossal numbers. The cubic millimeter of brain matter is only one-millionth of the size of an adult human brain, and yet the imaging scans and full map of its intricacies comprise 1.4 petabytes, or 1.4 million gigabytes. If someone were to utilize the Google/Harvard approach to mapping an entire human brain today, the scans would fill up 1.6 zettabytes of storage.

    Taking these logistics further, storing 1.6 zettabytes on the cheapest consumer hard drives (assuming $0.03 per GB) would cost a cool $48 billion, and that’s without any redundancy. The $48 billion price tag does not factor in the cost of server hardware to put the drives in, networking, cooling, power, and a roof to put over this prospective data center. The roof in question will also have to be massive; assuming full server racks holding 1.8 PB, the array of racks needed to store the full imaging of a human brain would cover over 140 acres if smushed together as tightly as possible. This footprint alone, without any infrastructure, would make Google the owner of one of the top 10 largest data centers in the world…
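    The article's arithmetic can be sanity-checked in a few lines. The 1.6 ZB total, $0.03/GB drive price, and 1.8 PB per rack are from the article; the ~0.64 m² rack footprint is my own assumption, chosen as a typical standard rack base, to reproduce the acreage claim:

    ```python
    # Sanity-check of the quoted article's numbers. 1.6 ZB, $0.03/GB, and
    # 1.8 PB/rack are from the article; the 0.64 m^2 rack footprint is an
    # assumption (roughly a standard 600 mm x ~1070 mm rack base, packed tight).

    TOTAL_ZB = 1.6
    GB_PER_ZB = 1e12                  # 1 ZB = 10^12 GB
    total_gb = TOTAL_ZB * GB_PER_ZB
    drive_cost = total_gb * 0.03      # cheapest consumer drives, no redundancy
    # -> $48 billion

    PB_PER_RACK = 1.8
    total_pb = TOTAL_ZB * 1e6         # 1 ZB = 10^6 PB
    racks = total_pb / PB_PER_RACK    # ~889,000 full racks

    RACK_FOOTPRINT_M2 = 0.64          # assumed footprint per rack
    M2_PER_ACRE = 4046.86
    acres = racks * RACK_FOOTPRINT_M2 / M2_PER_ACRE  # ~140 acres

    print(f"${drive_cost/1e9:.0f}B in drives, {racks:,.0f} racks, {acres:.0f} acres")
    ```

    Under those assumptions the numbers line up with the article: $48B in bare drives and just over 140 acres of floor space before adding any infrastructure.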

    • @Carrolade
      6 months ago

      One minor detail is that the brain is not homogeneous. I don’t know where the sample came from, but you’d probably get different results from grey vs white matter, or the cerebellum vs cerebrum, etc.

      Still going to be a gargantuan amount of data though, no matter how you slice it.

      • @[email protected]
        6 months ago

        This is also pretty lightly compressed, though. If you’re trying to do mind uploads, you can probably shave off orders of magnitude pretty easily, since in silico neurons don’t need any of the physical functional structures as long as they act the same way.