• Slambert@lemmy.ml
      1 year ago

      Yup! I just wonder how that would work. Since digital and analogue signals are completely different, signal conversion would be required. The overhead of that conversion could add latency before the next instruction, or even reduce overall performance depending on the other components in the machine. A lot of research would be needed to build an accurate, low-overhead signal converter into the device.
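
      A rough way to picture that trade-off is below (a minimal back-of-the-envelope sketch; every latency figure is a made-up placeholder, not a measurement of any real hardware):

      ```python
      # Hypothetical model of a hybrid digital/analogue pipeline.
      # All latencies are invented placeholders, purely for illustration.

      DAC_LATENCY_NS = 50       # digital -> analogue conversion per operand (assumed)
      ADC_LATENCY_NS = 100      # analogue -> digital conversion of the result (assumed)
      ANALOG_COMPUTE_NS = 10    # time the analogue block needs for the task (assumed)
      DIGITAL_COMPUTE_NS = 500  # time a conventional core needs for the same task (assumed)

      def hybrid_latency_ns(num_operands: int) -> int:
          """Total time to convert operands, compute in analogue, and read the result back."""
          return num_operands * DAC_LATENCY_NS + ANALOG_COMPUTE_NS + ADC_LATENCY_NS

      def worth_offloading(num_operands: int) -> bool:
          """Offloading only pays off if conversion overhead stays below the all-digital cost."""
          return hybrid_latency_ns(num_operands) < DIGITAL_COMPUTE_NS

      for n in (1, 4, 16):
          print(n, hybrid_latency_ns(n), worth_offloading(n))
      ```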

      • Osayidan@social.vmdk.ca
        1 year ago

        You have to think of it more like how quantum computers are right now. You aren’t going to be running Minecraft or a web browser on it, but it’ll probably be very good at certain tasks. Those tasks can either live in their own silo, never interacting directly with a traditional computer, or information can be sent between the two in some way (such as sending a calculation job and then receiving the answers; see the sketch below). That send/receive can afford to be slow if some translation is needed, as long as the performance gains on the actual task are worth it. It’s not like a GPU, where you expect your frames to be rendered in real time to play a game.
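
        In code terms, the interaction pattern looks roughly like this (a hypothetical sketch; `AnalogAccelerator` and its translation steps are invented names for illustration, not a real API):

        ```python
        import time

        class AnalogAccelerator:
            """Hypothetical specialised co-processor reached through slow translation steps."""

            def _encode(self, job):
                # Digital -> device representation; slow, but done once per job.
                time.sleep(0.01)
                return job

            def _decode(self, raw_result):
                # Device representation -> digital result; also once per job.
                time.sleep(0.01)
                return raw_result

            def _run(self, encoded_job):
                # The part that is (hopefully) dramatically faster than a CPU.
                return sum(x * x for x in encoded_job)

            def submit(self, job):
                """Send a calculation job, wait, and receive the answer."""
                return self._decode(self._run(self._encode(job)))

        # The host only sees "submit a job, get an answer back" -- no real-time frame budget.
        accelerator = AnalogAccelerator()
        print(accelerator.submit([1.0, 2.0, 3.0]))
        ```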

        Eventually that may change, but until then it’s no more than that. Articles like these put a lot of hype on things that, while very interesting, can end up misleading people.