  • als@lemmy.blahaj.zone · +9 · 11 hours ago

    My RX 480 from 2016 is still kicking, if crashing a bit. At this rate, when it breaks I’ll just use my Steam Deck docked instead of selling my liver to buy a new GPU.

  • nahostdeutschland@feddit.org · +8 · 14 hours ago

    I’ve been gaming on my Steam Deck for quite a while, and I’ll keep doing so. There’s no way I’m paying that much money for a new gaming computer.

  • RememberTheApollo_@lemmy.world · +7 · 18 hours ago

    Been that way for years. There was a brief respite when miners switched from GPU-intensive mining to Bitcoin ASICs, and you could actually get a GPU at a fair retail price without scalper gouging.

    Now it’s right back to basically unaffordable for a name-brand GPU. Literally more than a mortgage payment.

  • Dasnap@lemmy.world · +50 · 1 day ago

    Crypto, followed by NFTs, followed by LLMs… the GPU market has been fucked for years now. Whenever one thing starts to drop off, another tech-bro idea that requires 10,000 GPUs to run takes its place.

    • merc@sh.itjust.works · +6 · 16 hours ago

      The crypto-to-AI transition was brutal. Just as demand for GPUs was coming down because people were starting to use ASICs to mine Bitcoin, along came AI to drive up non-gaming demand again.

      The only good news is that when the AI bubble eventually pops, there will be massive R&D and manufacturing capacity geared towards producing GPUs. Unless something else comes along… but really, I can’t see that happening, because the AI bubble is so immense and such an enormous part of the entire world’s economy.

    • Scrubbles@poptalk.scrubbles.tech · +5 · 21 hours ago

      Truly just the brute-force solution. Need a shitload of compute? GPUs can do it! No one stops to think whether we really need it. It’s all about coulda, not shoulda. Yeah, ML and AI have a place, but big tech just thinks “slap an LLM everywhere”. Just such horseshit.

  • Vizth@lemmy.world · +5 · 19 hours ago

    You know, I was thinking about upgrading my graphics card this year along with the rest of my PC, but at this rate I think I can squeeze a couple more years out of my 3060 Ti.

    • Psythik@lemm.ee · +5 · 18 hours ago

      I squeezed 7 years out of my 1070 before I replaced it.

      You can easily get another 2-4 years out of your GPU. A 3060 Ti isn’t even old.

  • HeyListenWatchOut@lemmy.world · +17 · 1 day ago

    I just kept an eye on Micro Center’s refurbished cards for a few weeks and was able to snag a 3090 Ti late last year, with a 3-year warranty, for the same price I paid for a 980 Ti in 2015.

    • taiyang@lemmy.world · +3 · 1 day ago

      I think that might be my plan too, but I’m still waiting a paycheck or two before I even monitor the situation. My 2070 is fine, and ultimately I just want to pass it down to a spare PC for the kids to mess around on as my oldest hits 3. I know by the time I was 5 I was playing shit like Dune 2, admittedly with hacked save files my dad set up.

  • paultimate14@lemmy.world · +28 −1 · 1 day ago

    The 9070s on eBay are getting cheaper and cheaper the further we get from the launch. I think scalpers underestimated AMD’s stock and are slowly discovering that.

    Immediately after the launch, the XT seemed to start at $1,200. Now they’re down to $800. The non-XT is down to $650.

    It depends on how much stock AMD can provide in the coming weeks and months, but I’m still thinking I’ll be able to get one at MSRP this year.

    • SmokedBillionaire@sh.itjust.works · +6 · 1 day ago

      I’m not sure where you’re seeing this, but on eBay they’re still showing a ton of cards at $1,200-1,800.

      This shit is a problem, and only the apathetic retailers can fix it.

      • Zorque@lemmy.world · +10 · 1 day ago

        Being bought, or just being listed? There’s a difference. If they’re not selling at those prices, those listings will still show up the most.

        They also said “starting”, which implies the lowest price they’re actually selling at, not the price you see the most listings for.

    • Mouette@jlai.lu · +3 · 1 day ago

      I still see them at more like €800-900, which is more than what I paid for a 7900 XT, wtf.

  • Elevator7009@ani.social · +28 −7 · 1 day ago

    I’m having a good time on a laptop with no fancy graphics card and have no desire to buy one.

    I also do not look for super high graphical fidelity, play mostly indies instead of AAA, and am like 5 years behind the industry, mostly buying old gems on sale, so my tastes probably enable this strategy as much as anything else.

    • Agent Karyo@lemmy.world · +16 −1 · 1 day ago

      Modern high-end iGPUs (e.g. AMD Strix Halo) are going to start replacing dGPUs in the entry-level and mid-range segments.

      • Elevator7009@ani.social · +14 −7 · 1 day ago

        I’ll be honest, I have never paid attention to GPUs and I don’t understand what your comment is trying to say or (this feels selfish to say) how it applies to me and my comment. Is this intended to mostly be a reply to me, or something to help others reading the thread?

        • vithigar@lemmy.ca · +6 · 22 hours ago

          Your laptop uses an iGPU. The “i” stands for integrated, as it’s built into the same package as the CPU.

          The alternative, a dGPU, is a discrete part, separate from other components.

          They’re saying that your situation is becoming increasingly common. People can do the gaming they want to without a dGPU more easily as time goes by.

          • Elevator7009@ani.social · +1 · 22 hours ago

            Thank you for explaining! I’m not sure why people are reacting badly to my statement. Is knowledge of GPUs something every gamer is expected to have, and am I violating the social contract by being clueless?

            • Khanzarate@lemmy.world · +3 · 20 hours ago

              Well, at one point, to be a computer gamer you basically needed to put together your own desktop PC.

              Integrated GPUs were basically only capable of displaying a desktop, not doing anything a game would need, and desktop CPUs generally didn’t integrate graphics at all.

              So computer-building knowledge was a given. If you were a PC gamer, you had a custom computer for the purpose.

              As a result, even as integrated GPUs became better and more capable, the general crowd of gamers didn’t trust them, because it was common knowledge that they sucked.

              It’s a lot like how older people go “They didn’t teach you CURSIVE?” about schools nowadays. Being a gamer and being a PC builder are fully separable now, but they learned PC building back when they weren’t, and therefore think you should have that knowledge too.

              It’s fine, don’t sweat it. You’re not missing out on anything, really. Especially given the current GPU situation, it’s never been a worse time to be a PC builder or enthusiast.

              • Elevator7009@ani.social · +1 · 19 hours ago

                Oh boy. Thanks for the context, by the way! I did not know that about the history of PC gaming.

                I did learn cursive, but I have been playing games on laptops since I was little and was never told I had to learn PC building. And to be completely honest, although knowledge is good, I am very uninterested in doing that, especially since I have an object that serves my needs.

                I have enough perspective to realize that I have been on the other side of the WHAT DO YOU MEAN YOU’RE SATISFIED, LEARN MORE AND CHANGE TO BE LIKE US divide, although I’m exaggerating, because I don’t actually push my decisions on others. I don’t spam the uninterested about coming to Linux, but I do want people whose needs are adequately served by Windows to jump to Linux anyway, because I want to see Windows 11, with its even more forced telemetry, shoved-in AI, and things just made worse, fail. Even though that would actually be more work for satisfied Windows users.

                But I would not downvote a happy Windows user for not wanting to switch, and that kind of behavior is frowned upon. Is it just more acceptable to be outwardly disapproving of those who do not know about GPUs and are satisfied with what they have, with zero desire to upgrade? Do I lack Sufficient Gamer Cred, and am I being shown the “not a Real Gamer” door? I think my comment was civil and polite, so I really don’t understand the disapproval. If it is just “not a Real Gamer” I’ll let it roll off my back, though I did think the Gaming community on Lemmy was better than that… I would understand the reaction if I rolled up to c/GPUs with “I don’t care about this :)” and got downvoted. Is Gaming secretly kind of also c/GPUs and I just did not know that?

                Okay, I literally just realized it is probably because I hopped into a thread about GPUs without knowing about the topic being posted about. Whoops. Sorry.

        • Agent Karyo@lemmy.world · +24 −1 · 1 day ago

          Laptops (and desktops) with no dedicated GPU will become increasingly viable, and not just for older games. This was a general comment. :)

          • merc@sh.itjust.works · +3 · 16 hours ago

            Depending on what happens with GPUs for datacenters, discrete GPUs might become so rare that nobody bothers with them anymore.

            My impression right now is that for Nvidia, gamer cards are an afterthought. Millions of gamers can’t compete with every company in Silicon Valley building entire datacenters stacked with as many “GPUs” as they can find.

            AMD isn’t the main choice for datacenter CPUs or GPUs. Maybe for them, gamers will be a focus, and there are some real advantages to APUs. For example, you’re not stuck with one fixed amount of GPU RAM and a separate fixed amount of CPU RAM. Because you’re not multitasking as much when gaming, you need less CPU RAM, so you can dedicate more RAM to the game and less to other apps. You get the best of both worlds: tons of system RAM when you’re browsing websites with a thousand tabs open, then start a game and you have gobs of RAM dedicated to it.

            It’s probably also more efficient to have one enormous cooler for a combined GPU and CPU vs. a GPU with one set of heatsinks and fans and a separate CPU heatsink and fan.

            Discrete GPUs are also a pain in the ass to manage. They’re getting bigger and heavier, and they take up more and more space in your case. Not to mention the problems their power draw is causing.

            If I could get equivalent performance from an APU vs. a separate CPU and GPU, I’d probably go for it, even with the upgradeability concerns. OTOH, soldered-in RAM is a hard sell, because I’ve upgraded my RAM more often than any other component on my PCs, and having to buy a whole new motherboard just to get a RAM upgrade does not appeal.

    • icecreamtaco@lemmy.world · +4 −1 · 1 day ago

      Same here, I have never owned a graphics card in my life. When I occasionally do want to play a modern game, it doesn’t need to be 200 FPS with RTX.

    • Blue_Morpho@lemmy.world · +4 −3 · 1 day ago

      I’ve been using mini PCs with integrated graphics (and one with a laptop-class GPU) instead of desktops and see no reason to stop.

  • Noite_Etion@lemmy.world · +15 −3 · 1 day ago

    Probs the only reason for many to buy a console these days. For the cost of a high-end GPU you can get an entire system and some games.

    • Obi@sopuli.xyz · +1 · 14 hours ago

      I have a powerful computer for my work with a 3090, an i9, etc. I still prefer gaming on the Xbox: it doesn’t bother me with driver issues, CPU overheating, and the other random bugs I get on the big machine, and it’s in the living room, where I’m also spending time with my family rather than stuck in my cave upstairs.

  • brucethemoose@lemmy.world · +13 −2 · 1 day ago

    The funny thing about AMD is that the MI300X is supposedly not selling well, largely because they price gouge everything as badly as Nvidia does, even where they aren’t competitive. Other than the Framework Desktop, they seem desperate to stay as uncompetitive in the GPU space as they possibly can, and not because the hardware is bad.

    Wasn’t the Intel B580 a good launch, though? It seems to have gotten rave reviews, and it’s in stock, yet it has already exited the hype cycle.

    • SupraMario@lemmy.world · +12 · 1 day ago

      I looked for months for a B580 for my wife’s PC. Couldn’t get one in stock at MSRP during that time. Finally caved and grabbed a used 6900 XT for $400. The Intel cards are awesome, if you can get one. I do hope Intel keeps up the mid-range offerings at sane prices.

    • Thorry84@feddit.nl · +8 · 1 day ago

      Well, the B580 is a budget / low-power GPU. All the discussions going around are about flagship and high-end GPUs. Intel isn’t in that space yet, but we can hope they have a B7xx or B9xx lined up that makes some waves.

  • zecg@lemmy.world · +7 −1 · 1 day ago

    It’s pretty okay if you’re like me, i.e. you have no needs above full-HD resolution and can take or leave RTX.