• Rain World: Slugcat Game
    2 months ago

    A neural network that takes two numbers as input and outputs their sum. No hidden layers or activation function.
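    That really is all it takes: a single linear layer whose weights land on [1, 1] with zero bias computes the sum exactly. A minimal sketch (PyTorch assumed), with the weights set by hand to show the solution training would converge to:

    ```python
    import torch
    import torch.nn as nn

    # One linear layer: y = w1*a + w2*b + bias. No hidden layers, no activation.
    adder = nn.Linear(in_features=2, out_features=1, bias=True)

    # Fix the weights to the exact solution a trained network would converge to.
    with torch.no_grad():
        adder.weight.copy_(torch.tensor([[1.0, 1.0]]))
        adder.bias.zero_()

    x = torch.tensor([[3.0, 4.0], [10.0, -2.5]])
    print(adder(x))  # tensor([[7.0000], [7.5000]])
    ```

    Because the model is purely linear, it adds correctly for any inputs, not just ones near whatever training data it saw.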

    • @jacksilver
      12 months ago

      Yeah, but since neural networks are really function approximators, the farther you move away from the training input space, the higher the error gets. Multiplication is even worse, because layers are essentially additive, so you’d need roughly as many layers as the largest input value for it to work.
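
      That extrapolation failure is easy to see with a quick experiment: train a small MLP with a bounded activation on sums drawn from [0, 1], then evaluate far outside that range. A rough sketch (PyTorch assumed; the layer sizes, learning rate, and iteration count are just illustrative):

      ```python
      import torch
      import torch.nn as nn

      torch.manual_seed(0)

      # Small MLP with a bounded activation (tanh), trained only on inputs in [0, 1].
      model = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 1))
      opt = torch.optim.Adam(model.parameters(), lr=1e-2)

      for _ in range(2000):
          x = torch.rand(256, 2)  # training inputs stay in [0, 1)
          loss = nn.functional.mse_loss(model(x), x.sum(dim=1, keepdim=True))
          opt.zero_grad()
          loss.backward()
          opt.step()

      # Evaluate inside and far outside the training range.
      for a, b in [(0.3, 0.4), (2.0, 3.0), (50.0, 50.0)]:
          pred = model(torch.tensor([[a, b]])).item()
          print(f"{a} + {b}: predicted {pred:.2f}, true {a + b}")
      # Near [0, 1] the error is tiny; at 50 + 50 the tanh units saturate,
      # the output can't grow past the training-range sums, and the error explodes.
      ```

      (The no-hidden-layer adder above dodges this because it is exactly linear, so it extrapolates perfectly.)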

        • @jacksilver
          12 months ago

          Is that a thing? Looking it up, I really only see a couple of one-off papers on mixing deep learning and finite state machines. Do you have examples or references for what you’re talking about, or is it just a concept?