Someone smarter than me tell me if this is amazing and game-changing or boring and niche.
I wouldn’t say I’m smarter than you, rather I just know some stuff about how computer components work, but what you’re looking at is the latter.
The problem with trying to move to another type of computer is that modern software is designed solely for digital machines. Given that, how do you port these programs to another type of computer?
The answer is that you don’t. Porting to a different CPU architecture can already take a while for most programs, but a port to a fundamentally different type of computer would take an extremely long time.
That is, if you can even port anything. Because digital and analogue computers are completely different, you’d have to build functional clones instead, by referencing the source code. If you don’t have the source, you’re outta luck.
TL;DR: We’ve over-invested in digital computers and there’s no going back.
If it turns out there’s something they do way better, they’d enter as an accelerator card like GPUs did.
Yup! I just wonder how that would work. Since digital and analogue signals are completely different, signal conversion would be required. The conversion overhead could add latency between operations, or even reduce performance depending on the other components in the machine. A lot of research would have to go into building an accurate, low-overhead signal converter into the device.
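To make the conversion problem concrete, here’s a toy model of the digital → analogue → digital round trip. It assumes an ideal n-bit converter over a fixed voltage range; real converter behaviour (noise, nonlinearity, settling time) is ignored, and all the names are made up for illustration.

```python
# Toy model of a DAC/ADC round trip with an ideal n-bit converter.
# Real converters add noise, nonlinearity, and settling delay on top.

def dac(code: int, bits: int = 8, v_ref: float = 1.0) -> float:
    """Convert a digital code to an 'analogue' voltage."""
    return code / (2**bits - 1) * v_ref

def adc(voltage: float, bits: int = 8, v_ref: float = 1.0) -> int:
    """Quantise an 'analogue' voltage back to the nearest digital code."""
    return round(voltage / v_ref * (2**bits - 1))

# A value only survives the round trip up to quantisation error:
code = 200
recovered = adc(dac(code))
print(code, recovered)  # the codes match here, but any analogue
                        # computation in between perturbs the voltage
```

The point of the sketch: every hop across the boundary costs a conversion, and the converter’s resolution caps how much precision the analogue side can hand back.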
Have to think of it more like how quantum computers are right now. You aren’t going to be running Minecraft or a web browser on it, but it’ll probably be very good at doing certain things. Those things can either live in their own silo, never interacting directly with a traditional computer, or information gets sent between them in some way (such as sending a calculation job, then receiving the answers). That send/receive can afford to be slow, even with translation in the middle, if the performance gains on the actual task are worth it. It’s not like a GPU, where you expect your frames rendered in real time to play a game.
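In code terms, that “send a job, get answers back” hand-off might look like this toy sketch. The accelerator is just a stub function here, and the translation steps are stand-ins; every name is invented for illustration.

```python
# Rough sketch of the silo + job-submission model: translate the job,
# hand it to a specialised machine, translate the answer back.
import json

def to_device_format(job: dict) -> str:
    """Stand-in for the (possibly slow) translation to the device."""
    return json.dumps(job)

def from_device_format(raw: str) -> dict:
    """Stand-in for translating the device's answer back."""
    return json.loads(raw)

def run_on_accelerator(raw_job: str) -> str:
    """Stub for the specialised machine doing the actual work."""
    job = json.loads(raw_job)
    return json.dumps({"result": sum(job["inputs"])})

# The translation on either side can be slow; it only needs to be
# cheaper than doing the whole task on a conventional CPU.
job = {"inputs": [1, 2, 3, 4]}
answer = from_device_format(run_on_accelerator(to_device_format(job)))
print(answer["result"])  # 10
```

The design point is the same as the comment above: latency at the boundary is tolerable as long as the speedup on the task itself dominates.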
Eventually that may change, but until then it’s no more than that. Articles like these put a lot of hype on things that, while very interesting, can end up misleading people.
For some reason this 3D model reminds me of a very specific pixel art…
Oh no no no. I erased that GIF from my mind, I don’t want it back.