The starting premise of the article is upgrading from the previous generation. What sane mind does that? Aside from the one time I got a freebie, all my upgrades were at least two generations apart.
Edit: Also, with current prices on the RX 6000 series, as long as you're coming from three generations behind you'll get a good upgrade. I went from a GTX 1070 to an RX 6700 XT and felt a big improvement.
They do make the point in the article that even upgrading from two generations back is a waste, as you're getting basically no real benefit from having waited two generations instead of one. You may as well upgrade to last generation instead of this one and save yourself some money.
If you’re three generations behind, no matter what your upgrade path is, you’re getting a significant upgrade, but it’s still not worth upgrading to the current gen when last gen is much better value for a marginal performance difference.
The exception to all this is buying the absolute top-of-the-line, which is never good value, and which is again significantly inflated in price compared to the previous gen.
Do that many people upgrade every generation?
I still use a 1070, so the GPU comparisons here aren’t relevant.
The main issue I hit was deciding between DDR4 and DDR5 RAM, since we're in an awkward transition phase - and that affects motherboard and therefore CPU choices too.
Upgrading every generation is stupid. I try to upgrade every 5 years if I can afford it.
My 1080ti says the performance gap versus cost to upgrade is not affordable right now. So I gotta keep waiting.
Well, I’ve had the same CPU/Mobo/RAM for over ten years and only upgraded my GPU once, from a GTX 660 to a 5700 XT at the start of the pandemic. I’m finally seeing some issues with some modern AAA content. Hogwarts Legacy won’t really run at all, for example.
I also haven’t wiped my system in the same amount of time, so that may be more the culprit than the system itself. Still going strong!
FYI it probably isn’t the 5700XT that’s causing issues in Hogwarts, mine works fine.
I think it’s a memory issue, most likely due to the sorry state of my Windows installation. Need to knock off the lazy and wipe it, but it’s pretty remarkable that it works as well as it does. I started with fresh Win7 and have survived upgrades to Win8 and Win10 in addition to the major feature updates that come now and again. I thought it was totally borked a few years back but some obscure automated tool managed to fix it.
IT BELONGS IN A MUSEUM!
The CPU becomes the real issue though - which then means changing motherboard, which means changing RAM, etc. and then you might as well get an NVMe too etc.
I’ve come to realize that I don’t really “upgrade” anything but the GPU and adding storage. I’ve never so much as dropped in a new CPU without going through the whole rigamarole you just described. Build them to last, folks.
Sometimes you get around that for longer by upgrading to the highest possible configuration on that platform. Often for cheap second hand.
I replaced my 2017 Ryzen 1800x with a Ryzen 5800x3D recently which is supported on my x370 Motherboard. Huge upgrade, no platform change required. I think I can wait for DDR5 and a new motherboard for years to come.
I used to upgrade every generation, and yeah, it was stupidly expensive. But it was my only hobby, and you could actually see performance increases each time.
But for the last 10 years or so, there’s much less point. Sometimes there are major advances (CUDA, RTX) that make a single-generation upgrade worthwhile, but mostly it’s just a few FPS at the highest settings. So now I just upgrade every few years.
Back in the 90s and early 00s, frequent upgrades were kind of required to stay up to date with new games. The last 10-15 years have been muuuuch slower in that regard, thanks to consoles I guess. I’m not complaining, but I miss the sense of developers really pushing boundaries like they did in the old days.
This article needs a clearer title. I agree that upgrading from a 6000 or 3000 series card right now is almost completely pointless, and even going back another generation it’s still not a great proposition. But I know people with “gaming PCs” rocking 1650s or even 1050s. Lots of folks with medium or low end several generations old hardware out there, for whom great upgrade options exist.
I finished school and want to start gaming again. My PC has an AMD 370 I bought back in 2015. Is that still a decent GPU to play games on today?
The 370 will struggle with most anything recent since it only has 2GB of VRAM. The Radeon 6600 would be an excellent upgrade there.
In March, I upgraded my video card from a 1660 to a 6750. I am really happy with how much better things look now, especially while gaming.
repost
You know that on Lemmy you can just edit the post, right? Title, URL, and text can all be changed.
The technology has existed all this time. We just never knew.
I (am not op, but) didn’t know that! Thanks!
Didn’t know that. I was used to ol’ Reddit. Thanks for the info tho, it’ll come in handy later!
Well, like some, I am still on the 10xx series (1060 3GB 🤣🥲) and starting to look at a future full system upgrade for an RX 79xx or 78xx when they're out. Targeting a Black Friday sale to make the jump.
I would be curious to know if many others are on a refresh cycle of up to 4-5 years.
(Need to check how to create a poll in Lemmy)
I’m on a 970 and it still plays all the games I want it to play.
970 + PatientGamers = name a more iconic duo
(or current AAA in lowspecgamer mode XD )
Lol. My flair on pcmr used to be “GTX 970: definitely 4gb of VRAM”. Which is itself a pretty outdated joke nowadays
Exactly. I’m on a 1050 Ti and I’m not sure Starfield will be happy with that. Cyberpunk wasn’t too happy. And of course if I get a new card I will need a new MB/CPU/RAM/etc.
I upgraded last summer to a 6700 XT from a 1070 Ti. I didn’t need the upgrade, since the 1070 Ti is still a solid performer even now. There’s not much that it wouldn’t still run at reasonable settings. I really only upgraded because there was a decent sale, and I had some money burning a hole in my pocket. I could have easily waited another year and gone with a 6800 XT or better for a similar price.
I’m halfway through a 10-year cycle, with a 1060 3GB on a 7th Gen i5. It’s mostly Civ6, Stellaris, and Rocksmith 2014 @ 1080p, so it’s fine. The main problem is Windows 10 reaching end of life while the current hardware isn’t supported by newer Windows, and Rocksmith doesn’t work well on Linux. I’ll probably keep it as-is and start from scratch… when I see a title that I want to play enough to drop big cash on hardware.
Clone Hero works on Linux, though I'm not sure if that’s the same type of game, since I’m just guessing Rocksmith is like Rock Band. https://clonehero.net/
Rocksmith uses a real guitar and purports to teach you how to play… Not really like Rock Band.
My b