• @[email protected]
      link
      fedilink
      5
      edit-2
      1 year ago

      I wouldn’t say I’m smarter than you; rather, I just know some stuff about how computer components work. What you’re looking at is the latter.

      The problem with trying to move to another type of computer is that modern software is designed solely for digital machines. Considering what’s been stated above, how do you port these programs to another type of computer?

      The answer is that you don’t. Porting to a different CPU architecture can already take some time for most programs, but a port to a fundamentally different type of computer would take vastly longer.

      That is, if you can even port anything. Since digital and analogue computers are completely different, you’d instead have to write functional clones by referencing the source code. If you don’t have the source, you’re outta luck.

      TL;DR: We’ve over-invested in digital computers and there’s no going back.

      • @schroedingershat
        link
        5
        1 year ago

        If it turns out there’s something they do way better, they’d enter as an accelerator card like GPUs did.

        • @[email protected]
          link
          fedilink
          1
          1 year ago

          Yup! I just wonder how that would work. Since digital and analogue signals are completely different, signal conversion would be required. The overhead of that conversion could delay the next instruction, or even reduce overall performance depending on the other components in the machine. A lot of research would have to go into building an accurate, low-overhead signal converter into the device.
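          A rough sketch of that trade-off (every number below is an invented placeholder, just to show the shape of the concern, not a measurement of any real hardware):

```python
# Toy latency model for an analog accelerator sitting behind a
# DAC/ADC conversion interface. All timings are made-up assumptions.

def offload_worth_it(t_digital, t_analog, t_dac, t_adc):
    """Offloading only wins if the analog speedup outweighs the cost
    of converting the signal into the device and back out again."""
    t_offload = t_dac + t_analog + t_adc
    return t_offload < t_digital

# A big job: 100 us digitally, 5 us on the analog core. Conversion
# overhead (20 us each way) still leaves a win.
print(offload_worth_it(100e-6, 5e-6, 20e-6, 20e-6))  # True

# A tiny job: conversion overhead dominates, so keep it digital.
print(offload_worth_it(10e-6, 1e-6, 20e-6, 20e-6))   # False
```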

          • Osayidan
            link
            fedilink
            1
            1 year ago

            Have to think of it more like how quantum computers are right now. You aren’t going to be running minecraft or a web browser on it, but it’ll probably be very good at doing certain things. Those things can either be in their own silo never interacting directly with a traditional computer, or information will be sent between them in some way (such as sending a calculation job, then receiving the answers). That send/receive can afford to be slow if some translation is needed, if the performance gains on the actual task are worth it. It’s not like a GPU where you would expect your frames to be rendered in real time to play a game.

            Eventually that may change, but until then it’s no more than that; articles like these put a lot of hype on things that, while very interesting, can end up misleading people.

          • @[email protected]
            link
            fedilink
            1
            1 year ago

            It would work the same way expansion cards do now: it would have digital control circuitry that can communicate with the analog circuitry.

            We already have expansion cards that do this. Audio cards are an example of an expansion card that converts between digital and analog signals.

            The same goes for graphics cards, ASICs, and FPGAs; there it’s not a different type of signal, but an architecture that isn’t compatible with the rest of the computer because it’s specialized for a certain purpose. So there’s control circuitry that lets it communicate, and a driver on the computer that tells it how to.

            • @[email protected]
              link
              fedilink
              1
              1 year ago

              I don’t think you know what I’m getting at. I know about audio cards, as I’m an audiophile. I can tell you with confidence that DACs only convert digital sound data into analogue, and that’s because the audio jack is older than digital audio.

              The problem with your examples (GPUs, ASICs, and FPGAs) is that they’re all digital devices. An analogue device isn’t compatible with a digital device, much like how digital sound data (songs, audio tracks in videos, system sounds, etc…) and analogue audio don’t technically mix. They only work together because the sound quality gets downgraded during the jump to digital recording methods.

              If you look at many older albums, like “The Piper at the Gates of Dawn,” you may notice that they are offered at very high quality (24-bit 192kHz is common). That’s because they were recorded on audio tape, which could store incredibly high-resolution sound. It’s the same situation with film, and is the reason old movies can still be re-released in higher resolutions (assuming the film the movie was originally recorded on is still around). Newer albums, however, often cap out at 24-bit 48kHz, as digital recording requires the sound quality to be configured up front. Analogue just records.

              When you’re listening to sound on a digital device, you’re always dealing with quantization of some kind, even when the audio file is “lossless” in the sense that it was recorded at CD quality. That’s because storing analogue data is impossible on digital storage devices. What actually gets stored is a lot like a bunch of pillars trying to match a smooth wavelength. The pillars may get close, but they will never actually be the wavelength, due to their shape.
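              The “pillars” picture is just sampling plus quantization, which is easy to sketch with nothing but the standard library (the bit depth and sample count here are arbitrary choices for illustration):

```python
import math

def quantize(x, bits):
    """Snap a value in [-1, 1] to one of 2**bits evenly spaced levels
    (the "pillars"). The rounding error is the information lost."""
    levels = 2 ** bits
    step = 2.0 / (levels - 1)
    return round(x / step) * step

# Sample one cycle of a smooth sine wave, then force it onto 4-bit pillars.
samples = [math.sin(2 * math.pi * n / 64) for n in range(64)]
pillars = [quantize(s, 4) for s in samples]

# The pillars get close, but never exactly match the smooth wave;
# the error is bounded by half a quantization step (1/15 here).
worst = max(abs(s - p) for s, p in zip(samples, pillars))
print(f"worst-case error: {worst:.4f}")
```

More bits make the pillars finer (16-bit CD audio has 65,536 levels), but the staircase never becomes the curve.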

              Using an analogue device to accelerate something means at least some information will be lost in translation, even if the file size is stupidly large. All in all, getting analogue data into a digital device will always be lossy in some regard, and storing truly analogue data on digital storage is impossible.

              TL;DR: Analogue and digital are inherently different, and mixing the two is only possible through a lot of compromises.

              • @[email protected]
                link
                fedilink
                1
                1 year ago

                I can tell you with confidence that DACs can only convert digital sound data into analogue, and that’s due to the audio jack being older than digital audio.

                Right. But the principle is the same; hardware that isn’t compatible with pre-existing systems has a control circuit, and a digital interface. The digital computer sends instructions to the controller, and the controller carries out the instructions.

                An analogue device isn’t compatible with a digital device, much like how digital sound data (songs, audio tracks in videos, system sounds, etc…) and analogue audio don’t technically work.

                Correct. That is why there is dedicated control circuitry designed for making analog and digital systems talk to each other – as there will be for optical analog computers and every other type of non-conventional computing system.

                It’s true that conventional systems will not, by default, be able to communicate with analog computers like this one. To control them, you will send the question (instructions) to the control circuitry, which does the calculation on the hardware, and returns an answer. That’s true for DACs, it’s true for FPGAs, it’s true for CPUs, it’s true for ASICs.

                Every temperature sensor, fan controller, camera, microphone, and monitor is also doing some sort of conversion between digital and analog signals. The light being emitted by the monitor to your eyes is a physical phenomenon that can be measured as an analog value (by taking a picture of your computer monitor on film, say). How does your monitor produce this analog signal? It has a control circuit that can take digital commands and convert them into light in specific patterns.

                Using an analogue device to accelerate something requires at least some information to be lost on translation, even if the file size is stupidly large.

                I don’t think you’ve understood what analog computers are used for (actually, I’m not sure that you’ve understood what analog computing even really is beyond that it involves analog electrical signals). Analog computers aren’t arbitrarily precise like digital computers are in the first place, because they are performing the computation with physical values – voltage, current, light color, light intensity – that are subject to interference from physical phenomena – resistance, attenuation, redshift, inertia. In other words, you’re really worried about losing information that doesn’t exist in a reliable/repeatable way in the first place.

                A lot of iterative numerical methods need an initial guess and can then be iterated to arbitrary precision. Analog computers are usually used to provide the initial guess, to save iteration flops. The resolution just is not that important when you’re only trying to get into the ballpark in the first place.
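                As a toy illustration of that division of labor (the “analog guess” here is just a deliberately imprecise constant standing in for a noisy analog readout): suppose the analog unit hands back a rough root of f(x) = x² − 2, and the digital side polishes it with Newton’s method.

```python
def newton_sqrt2(x, iters):
    # Newton's method for f(x) = x**2 - 2; each iteration roughly
    # doubles the number of correct digits in the estimate.
    for _ in range(iters):
        x = x - (x * x - 2) / (2 * x)
    return x

analog_guess = 1.3              # imprecise, like a real analog readout
refined = newton_sqrt2(analog_guess, 4)
print(refined)                  # converges toward sqrt(2) = 1.41421356...
```

A ballpark starting point like 1.3 already gets Newton’s method to full double precision in a handful of iterations; a more precise initial guess would only shave off one or two of them.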

                In other words, this computer is designed to solve optimization problems. Say you’re getting results based on the color and intensity of the light coming out of it, like you might read values of tides from electrical voltages on an old desktop analog computer. It’s not that relevant to get exact values for every millisecond at a sampling rate of a bajillion kilohertz; you’re looking for an average value that isn’t falsely precise.

                So if you were designing an expansion card, you would design a controller that can modulate the color and intensity of the light going in, and modulate the filter weights in the matrix. Then you can send a digital instruction to “do the calculation with these values of light and these filter values”. The controller would read those values, set up the light sources and matrix, turn on the light, read the camera sensors at the back, and tell you what the cameras are seeing. Voila, you’re digitally controlling an analog computer.
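                The configure → run → read-back flow you describe might look something like this on the host side (every name here is invented for illustration, and the “hardware” is simulated with plain math so the control flow is visible; no real driver API is implied):

```python
# Sketch of a hypothetical host-side driver for an imaginary optical
# analog accelerator card that does matrix-vector products with light.

class OpticalMatrixCard:
    def __init__(self, weights):
        # Digital side: the filter weights the controller would
        # program into the optical matrix.
        self.weights = weights

    def run(self, light_intensities):
        """Send input light levels, let the (simulated) analog hardware
        apply the matrix, then read the camera sensors back."""
        return [
            sum(w * x for w, x in zip(row, light_intensities))
            for row in self.weights
        ]

card = OpticalMatrixCard([[1.0, 0.0], [0.5, 0.5]])
print(card.run([2.0, 4.0]))  # -> [2.0, 3.0]
```

From the digital computer’s point of view it’s exactly the pattern above: write a configuration, issue a “go” command, read an answer back – the same contract a DAC, FPGA, or GPU driver fulfills.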

        • @[email protected]
          link
          fedilink
          2
          edit-2
          1 year ago

          No problem! I’m sorry if I came off as hostile towards analogue machines here. I actually think they’re cool, just not in the way people think they are (“unraveling Moore’s law” is a bit far-fetched, Microsoft.)

          Oh, and some advice for anyone who isn’t too well-versed in technology: The tech industry isn’t known for honesty. For them, hype comes before everything, even profitability. Take any “revolutionary” or “innovative” new piece of tech with a grain of salt, especially now that tech companies are getting a bit goofy with their promises due to investors realizing that unprofitable companies aren’t sustainable.

          EDIT: The two deleted comments are duplicates of this one that were posted due to a bug in Jerboa. Sorry for any confusion!