Do that many people upgrade every generation?
I still use a 1070, so the GPU comparisons here aren’t relevant.
The main issue I hit was deciding between DDR4 and DDR5 RAM, since we're in an awkward transition phase, and that choice affects the motherboard and therefore the CPU too.
Upgrading every generation is stupid. I try to upgrade every 5 years if I can afford it.
My 1080 Ti says the cost to upgrade just isn't worth the performance gain right now. So I gotta keep waiting.
Well, I've had the same CPU/mobo/RAM for over ten years and only upgraded my GPU once, from a GTX 660 to a 5700 XT, at the start of the pandemic. I'm finally seeing some issues with modern AAA content. Hogwarts Legacy won't really run at all, for example.
I also haven’t wiped my system in the same amount of time, so that may be more the culprit than the system itself. Still going strong!
FYI, it probably isn't the 5700 XT that's causing issues in Hogwarts Legacy; mine works fine.
I think it's a memory issue, most likely due to the sorry state of my Windows installation. I need to stop being lazy and wipe it, but it's pretty remarkable that it works as well as it does. I started with a fresh Win7 install and have survived upgrades to Win8 and Win10, plus the major feature updates that come along now and again. I thought it was totally borked a few years back, but some obscure automated tool managed to fix it.
IT BELONGS IN A MUSEUM!
The CPU becomes the real issue, though: changing it means changing the motherboard, which means changing the RAM, and at that point you might as well get an NVMe drive too.
I've come to realize that I don't really "upgrade" anything but the GPU, plus adding storage. I've never so much as dropped in a new CPU without going through the whole rigmarole you just described. Build them to last, folks.
I used to upgrade every generation, and yeah, it was stupidly expensive. But it was my only hobby, and you could actually see performance increases each time.
But for the last 10 years or so, there's been much less point. Sometimes there are major advances (CUDA, RTX) that make a single-generation upgrade worthwhile, but mostly it's just a few extra FPS at the highest settings. So now I just upgrade every few years.
Back in the 90s and early 00s, frequent upgrades were kind of required to stay up to date with new games. The last 10-15 years have been muuuuch slower in that regard, thanks to consoles I guess. I’m not complaining, but I miss the sense of developers really pushing boundaries like they did in the old days.
The only reason I upgraded my 10 series to a 30 series is that I'm a dummy and bought a monitor with only HDMI 2.1 and no DisplayPort, so I needed to upgrade my GPU or I'd have no G-Sync. Otherwise, I probably would have waited at least two more generations.