• ryper@lemmy.ca · 3 days ago

    Reminder that Bethesda is owned by Microsoft, the company that insists it’s going to end support for Windows 10 in October and wants everyone to move to Windows 11, which doesn’t officially support perfectly functional but somewhat old CPUs. So of course they don’t care about GPUs too old to support ray tracing.

    • nossaquesapao@lemmy.eco.br · 3 days ago

      They make gaming a more and more elitist hobby, and then are surprised when indie games with pixel graphics that can run even on potato devices become great successes.

    • kemsat@lemmy.world · 2 days ago

      At some point it was going to happen; this is just earlier than many thought. The real question is “when is AMD going to have an answer to Nvidia when it comes to RT performance?”

      • MystikIncarnate@lemmy.ca · 2 days ago

        Earlier than they thought?

        How long did they think it would take before RT was a requirement? It was introduced with the GeForce 20 series more than six years ago.

        For technology, six years is vintage.
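
        For anyone wondering what “requires ray tracing” actually means in practice: on Windows, a game or launcher can just ask D3D12 for the adapter’s DXR tier and refuse to run below Tier 1.0. Here’s a minimal sketch of that check (C++/Win32, my own illustration, not any game’s actual code; link d3d12.lib and dxgi.lib). One caveat: some GTX 10/16 cards report a driver-emulated tier that is technically present but too slow to be playable.

        ```cpp
        // Minimal sketch: query the first GPU's DXR (DirectX Raytracing) tier.
        // Illustration only; link against d3d12.lib and dxgi.lib.
        #include <d3d12.h>
        #include <dxgi.h>
        #include <wrl/client.h>
        #include <cstdio>

        using Microsoft::WRL::ComPtr;

        int main() {
            ComPtr<IDXGIFactory1> factory;
            if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

            ComPtr<IDXGIAdapter1> adapter;
            if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;  // first adapter

            ComPtr<ID3D12Device> device;
            if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                         IID_PPV_ARGS(&device)))) return 1;

            // OPTIONS5 is the feature struct that carries RaytracingTier.
            D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
            if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                   &opts, sizeof(opts)))) return 1;

            if (opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
                std::puts("DXR tier 1.0+: an RT-required game will at least launch.");
            else
                std::puts("No DXR: this is the hardware such a requirement cuts off.");
            return 0;
        }
        ```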

        The only people this should affect are people still using GTX 10 and 16 series cards. I dunno what’s happening with AMD/Radeon; since Radeon was purchased by AMD, the naming schemes have gotten more and more nonsensical, so I always have a hard time knowing WTF generation a card is from by the model number.

        In any case: yeah, people using 5+ year old tech are going to be unable to play the latest AAA games. And?

        Has there ever been a time when a 5+ year old system could reasonably play a modern AAA title without it being a slide show?

        • kemsat@lemmy.world · 1 day ago

          I’m still hearing from people that they’re using Nvidia 1000-series cards. I was expecting to hear 2000 instead of 1000 by the time this happened.

          • MystikIncarnate@lemmy.ca · 1 day ago

            I have a 20 series card, albeit one of the higher tier ones, and I probably won’t be upgrading this year. I probably also won’t be playing any new AAA titles either.

            It’s fine to have an older card, but nobody in that position should be expecting to play the latest and greatest games at reasonable framerates, if at all.

            It is the way of things.

            I am personally rather miffed that if you want any performance from a GPU, you basically need to spend $800+. Even though some cards are supposedly available for less, they almost never are, whether due to scalping or greed (which are kind of the same thing), or something else like idiotic tariffs. I don’t have nearly a grand to burn every year on upgrading my GPU. The last GPU I bought was a 1060, and my current card was a gift. I haven’t had the budget for a decent GPU in many, many years.

            When I upgrade, I’m likely going Intel Arc, because the value proposition makes sense to me. I can actually spend less than $600 and get a card with some reasonable level of performance.

            • kemsat@lemmy.world · 1 day ago

              The current Intel GPUs aren’t better than an RTX 2070, so that won’t be an upgrade if you’re on a higher-tier 2000 series.

              I just went up to a 4070 Ti from a 2080 Ti, because it was the only worthwhile upgrade. $660 used. So you don’t need to spend $800.

              • MystikIncarnate@lemmy.ca · 14 hours ago

                Yeah, the gifted card I’m using is a 2080 Ti. The friend who gifted it went from a dual 2080 Ti SLI setup to a 4090, IIRC. He kept one of the two for his old system, so it’s still useful, but gave me the other since SLI is dead and he doesn’t need the extra card in a system he’s not frequently using.

                11 GB of memory is an odd choice, but it was a huge uplift from the 3 GB I was using before then. I had a super budget GTX 1060 3GB (I think it was made by Palit?) before.
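
                By the way, the 11 GB isn’t as random as it looks: GDDR chips hang off 32-bit channels, and the 2080 Ti ships with a 352-bit bus (the full TU102 die is 384-bit; one channel is disabled), so eleven 1 GB chips. Rough arithmetic, assuming the published one-chip-per-channel layouts:

                ```cpp
                // Back-of-the-envelope: GDDR capacity from bus width.
                // Assumes one chip per 32-bit channel, with the published chip sizes.
                #include <cstdio>

                int main() {
                    // RTX 2080 Ti: 352-bit bus -> 11 channels x 1 GB GDDR6 chips.
                    std::printf("2080 Ti:  %2d channels x 1 GB   = 11 GB\n", 352 / 32);
                    // GTX 1060 3GB: 192-bit bus -> 6 channels x 0.5 GB GDDR5 chips.
                    std::printf("1060 3GB: %2d channels x 0.5 GB = 3 GB\n", 192 / 32);
                    return 0;
                }
                ```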

                I still have to play on modest settings for anything modern, but my only real challenge has been feeding it fresh air. My PC case puts the GPU on a riser with front-to-back airflow and very little clearance front-to-back or top-to-bottom. The card uses a side intake, which is fairly typical for GPUs, so it’s basically starved for air if I install it normally. For now, I’ve got it on a riser sitting on top of the system with the cover off, so the GPU is in open air. Not ideal, and I need to work on a better solution… but it works great otherwise.

              • DacoTaco@lemmy.world · 20 hours ago

                Uh, no. The Intel Battlemage cards are way, way better than an RTX 2070. They even beat the 4060… for $250.
                Intel Battlemage GPUs are really good cards if you don’t need pure, raw power because everything must be 4K on ultra, etc.
                Which is good value, since that raw, pure power comes with an electricity bill I would not want to pay.

                • kemsat@lemmy.world · 18 hours ago

                  Yeah, I got the cards wrong. They’re around a 2080, which is around the same as a 4060. Still not much of an upgrade from an upper-end 2000 series, which to me means 2070 and up.

                    • DacoTaco@lemmy.world · 17 hours ago

                      I’d actually want to see real performance comparisons, because the Intel cards have much faster memory bandwidth, which is what’s giving them their performance. Still, $250 for 4060-level performance (a card that costs way, way more) is one hell of a good deal in comparison.
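
                      The gap is easy to sanity-check, since bandwidth is just bus width times per-pin data rate. Using the commonly published specs (B580: 192-bit at 19 Gbps GDDR6; RTX 4060: 128-bit at 17 Gbps), a quick sketch:

                      ```cpp
                      // Sanity check: bandwidth (GB/s) = bus width in bytes x per-pin rate (Gbps).
                      // Figures are the commonly published specs, not my own measurements.
                      #include <cstdio>

                      double bandwidth_gbs(int bus_bits, double gbps_per_pin) {
                          return bus_bits / 8.0 * gbps_per_pin;
                      }

                      int main() {
                          std::printf("Arc B580: %.0f GB/s (192-bit @ 19 Gbps)\n", bandwidth_gbs(192, 19.0));
                          std::printf("RTX 4060: %.0f GB/s (128-bit @ 17 Gbps)\n", bandwidth_gbs(128, 17.0));
                          return 0;  // 456 vs 272 GB/s -- roughly 1.7x in Intel's favor
                      }
                      ```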

    • djsaskdja@reddthat.com · 2 days ago

      “Somewhat old” CPUs are 8+ years old now? Windows 11 is crap, but I don’t think the hardware requirements are the reason.

      • Laurel Raven@lemmy.zip · 21 hours ago

        Honestly? Yeah.

        They’re still perfectly functional and capable of pretty much any modern workload, spec depending… If they can run Win 11 fine (and they should be able to if they can run 10), then the cutoff is arbitrary and will send more systems to landfills sooner than they otherwise would have gone.
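
        For what it’s worth, Microsoft has itself documented a registry opt-in that lets the Win 11 ISO installer proceed on an “unsupported” CPU (the machine still needs at least TPM 1.2), which rather underlines how arbitrary the cutoff is. A minimal sketch that sets it (C++/Win32, must run elevated; the usual unsupported-install caveats apply):

        ```cpp
        // Sketch: set Microsoft's documented opt-in for upgrading "unsupported" PCs.
        // HKLM\SYSTEM\Setup\MoSetup : AllowUpgradesWithUnsupportedTPMOrCPU = 1 (DWORD)
        // Run elevated; link advapi32.lib.
        #include <windows.h>
        #include <cstdio>

        int main() {
            HKEY key = nullptr;
            LONG rc = RegCreateKeyExW(HKEY_LOCAL_MACHINE, L"SYSTEM\\Setup\\MoSetup", 0,
                                      nullptr, 0, KEY_SET_VALUE, nullptr, &key, nullptr);
            if (rc != ERROR_SUCCESS) {
                std::fprintf(stderr, "open/create failed (not elevated?): %ld\n", rc);
                return 1;
            }
            DWORD allow = 1;
            rc = RegSetValueExW(key, L"AllowUpgradesWithUnsupportedTPMOrCPU", 0, REG_DWORD,
                                reinterpret_cast<const BYTE*>(&allow), sizeof(allow));
            RegCloseKey(key);
            if (rc != ERROR_SUCCESS) {
                std::fprintf(stderr, "set value failed: %ld\n", rc);
                return 1;
            }
            std::puts("Opt-in set; the Win11 ISO setup should now offer the upgrade.");
            return 0;
        }
        ```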

        • djsaskdja@reddthat.com · 11 hours ago

          How many 8-year-old computers still function halfway decently? Most of those people are probably due for an upgrade whether they know it or not.

          Even if we go with this popular narrative, no one is actually going to immediately run to throw out their working Win10 PC when Microsoft cuts off updates. They’ll just continue to use it insecurely. Just like millions of people did and still do with Win7.

          This is the issue with using a proprietary operating system in general. Eventually they’ll cut you off arbitrarily because there’s a profit motive to do so. Relying on them to keep your system updated and secured indefinitely is a naive prospect to begin with.