- cross-posted to:
- [email protected]
cross-posted from: https://lemmy.world/post/13805928
It’s a long vid. I suggest prepping your fav drink before viewing.
It covers Nvidia’s new GPU architecture for AI, the NVLink Switch, RAS diagnostics, and other Nvidia announcements.
Nvidia knows it’s the backbone of the current AI boom and seems to be going full steam ahead. I’m hoping for more innovation in tools for AI and gaming in the future.
It’s not marketing; AMD sucks for ML stuff. I don’t just play games. Everything is harder, has fewer features, and performs worse with AMD.
The situation is mostly reversed on Linux. Nvidia has fewer features, more bugs, and stuff that plain won’t work at all. Even onboard Intel graphics is going to be less buggy than a pretty expensive Nvidia card.
I mention that because language model work is pretty niche, and so is Linux (maybe similarly sized niches?).
Yeah. Linux boys on AMD are eatin’ good.
Really? I’ve only dabbled with locally run AI for a bit, but performance in something like Ollama or Stable Diffusion has been really great on my 6900 XT.
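If anyone wants to sanity-check that their AMD card is actually being picked up, here’s a minimal sketch, assuming a ROCm build of PyTorch (the stack most Stable Diffusion UIs sit on; Ollama has its own runner, so this only covers the PyTorch side):

```python
# Minimal check of which GPU backend PyTorch sees.
# A ROCm build exposes the GPU through the same torch.cuda API as a CUDA build,
# which is a big part of why the CUDA ecosystem carries so much weight.
import torch

if torch.cuda.is_available():
    backend = "ROCm/HIP" if torch.version.hip else "CUDA"
    device = torch.device("cuda")
    print(f"Backend: {backend}, device: {torch.cuda.get_device_name(0)}")
else:
    device = torch.device("cpu")
    print("No GPU backend found, falling back to CPU")

# Tiny matmul to confirm the device actually works.
x = torch.randn(1024, 1024, device=device)
print((x @ x).mean().item())
```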
The problem isn’t that it isn’t great, but that Nvidia cards are just better at a given price point, partly thanks to CUDA.
And for gaming and general use, my experience in the last few years has been that Nvidia still has a leg up when it comes to drivers on Windows. I’ve never had an Nvidia card give me any problems. AMD, not so much.
Would still happily trade my GTX 1650 for an RX 6400, because I recently switched to Linux and it’s a whole different world there…