For example, I love how the human brain consists of layers from different evolutionary phases (like the mammalian and reptilian brains), which reminds me of seeing remnants of teletype code in modern macOS.

  • @Waveform
    4 months ago

    Not exactly about codebases, but I believe the universe operates like a cellular automaton (CA) at its most fundamental level. (Idea originally from Stephen Wolfram.)

    A CA, if you don’t know, is a simulation in which each cell in a grid evaluates its nearby cells and takes on a new value based on what it finds and the rules it’s given. The patterns that arise from this are called ‘emergent behavior’, and in some ways they mimic physics and even primitive life. In fact, some physics models (lattice gas automata, for example) already use CAs.
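
    To make that concrete, here’s a minimal sketch of an elementary (1-D, two-state) CA in Python; the rule number (110) and grid width are arbitrary choices for illustration:

        # Minimal elementary CA: each cell looks at (left, self, right) and
        # takes the value the rule's lookup table gives for that 3-cell pattern.
        def step(cells, rule=110):
            n = len(cells)
            out = []
            for i in range(n):
                left = cells[(i - 1) % n]
                centre = cells[i]
                right = cells[(i + 1) % n]
                pattern = (left << 2) | (centre << 1) | right   # 0..7
                out.append((rule >> pattern) & 1)               # look up the rule bit
            return out

        width = 64
        cells = [0] * width
        cells[width // 2] = 1                # single live cell in the middle
        for _ in range(32):                  # print 32 generations
            print("".join("#" if c else "." for c in cells))
            cells = step(cells)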

    What this means to me is that there is ultimately only one type of energy/matter, and that everything we can detect (quarks, photons, atoms, etc.) is made from the same ‘stuff’, and that nothing is truly random… it’s just that we lack the tools and models to predict what happens below the smallest observable scale.

    • Codex
      4 months ago

      Have you seen Lenia?

      My understanding of what a cellular automaton could be was greatly expanded when I learned about it. To me, something like loop quantum gravity seems to have the same rough “shape” as a very complex cellular automaton. I think we (humanity) are getting closer to a breakthrough in understanding on that front.
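
      Roughly, a Lenia-style update convolves a real-valued grid with a smooth ring-shaped kernel and feeds the result through a bell-shaped growth function. Here’s a rough sketch of one step in Python; the kernel shape, radius, mu, sigma, and dt are illustrative guesses rather than Lenia’s canonical parameters:

          import numpy as np
          from scipy.signal import fftconvolve

          def ring_kernel(radius=13):
              # Smooth doughnut-shaped kernel, normalised to sum to 1.
              y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
              r = np.sqrt(x**2 + y**2) / radius
              k = np.exp(-((r - 0.5) ** 2) / (2 * 0.15 ** 2)) * (r <= 1)
              return k / k.sum()

          def growth(u, mu=0.15, sigma=0.015):
              # Bell-shaped growth: positive near mu, negative elsewhere.
              return 2 * np.exp(-((u - mu) ** 2) / (2 * sigma ** 2)) - 1

          def step(world, kernel, dt=0.1):
              # Neighbourhood average via convolution, then a small clipped nudge.
              u = fftconvolve(world, kernel, mode="same")
              return np.clip(world + dt * growth(u), 0.0, 1.0)

          rng = np.random.default_rng(0)
          world = rng.random((128, 128))     # random initial "soup"
          kernel = ring_kernel()
          for _ in range(100):
              world = step(world, kernel)

      Swapping hard 0/1 states and a 3x3 neighbourhood for real values and a wide smooth kernel is basically the jump from Life-like rules to Lenia-like ones.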

      • @Waveform
        4 months ago

        I didn’t know it was called Lenia, but I have seen implementations of it here and there. It’s pretty cool that people have gotten it working, and I wonder what other totalistic cellular automata besides the Game of Life would look like when evaluated in a continuous manner.

        The problem with making a truly continuous CA is that we are using digital computers to do it, so we can only ever get an approximation of what one would look like. I’ve seen just how big a difference 8-bit vs 16-bit values make, and I imagine that even though higher and higher bit depths would converge on a truer model, issues would still persist. Plus, we are still stuck with using grids…
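
        As a toy illustration of the bit-depth point, here’s a sketch that runs the same smooth, locally coupled update once in float64 and once with the state quantised to 8 or 16 bits after every step, then prints how far the runs drift apart. The update rule (coupled logistic maps on a ring) is just an arbitrary chaotic stand-in, not any particular CA:

            import numpy as np

            def quantise(x, bits):
                levels = 2 ** bits - 1
                return np.round(x * levels) / levels

            def update(x, eps=0.3, r=3.9):
                # Coupled logistic maps on a ring: chaotic, so small
                # rounding differences grow instead of averaging out.
                f = r * x * (1 - x)
                return (1 - eps) * f + eps * 0.5 * (np.roll(f, 1) + np.roll(f, -1))

            rng = np.random.default_rng(1)
            x0 = rng.random(256)
            x_float = x0.copy()
            x_8 = quantise(x0, 8)
            x_16 = quantise(x0, 16)

            for t in range(1, 51):
                x_float = update(x_float)
                x_8 = quantise(update(x_8), 8)
                x_16 = quantise(update(x_16), 16)
                if t % 10 == 0:
                    print(t, np.abs(x_float - x_8).max(), np.abs(x_float - x_16).max())

        The 16-bit run tracks the float run longer, but because the map is chaotic both eventually diverge, which is the “issues would still persist” part.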