• Lemminary@lemmy.world · 1 hour ago

      I thought I was happy I went AMD, until my card started running its fans flat out for no reason a month after the warranty ran out. I had to manually reseat the card in its PCIe slot for it to stop, because nothing else would, not even restarting the PC. And then one day it heated up so badly it stopped working. I think they gave me a defective card on purpose because people are less likely to return items when they’re buying from outside the US.

      I’ve since gone back to Nvidia and my current card hasn’t given me any issues. What a nightmare that was.

        • simple@lemm.ee (OP) · 18 hours ago

          Neither of them is as good, especially if you factor in raytracing. DLSS Ray Reconstruction is basically required to avoid a noisy image with RTX.

          • potustheplant@feddit.nl · 18 hours ago

            Ray tracing*

            RTX is a brand.

            Regardless, given the performance impact and how few games actually have ray tracing (implemented correctly), it makes more sense to just disregard ray tracing altogether.

            It’s an undercooked technology used to push more expensive products, nothing more.

            Regarding DLSS vs FSR and XeSS: yes, DLSS has better quality, but it’s also proprietary, so I honestly don’t care about it. Just like G-Sync died, DLSS will eventually die as well.

            • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 17 hours ago

              Just like G-Sync died

              (True) G-Sync isn’t dead; it’s just limited to the highest end of monitors, which is basically where it’s always been. It only “died” because it requires an expensive module, whereas adaptive sync is built into pretty much every modern display controller, so it’s essentially free.

              • potustheplant@feddit.nl · 17 hours ago

                The proprietary G-Sync approach with a dedicated hardware module is indeed dead, and most “G-Sync” monitors now just use the common VESA Adaptive-Sync VRR (aka FreeSync).

                However, I did a bit of research and found some “G-Sync Pulsar” monitors, though I believe none have been released yet. They do sound like unnecessary, overpriced products. That’s Nvidia for ya.

      • Zetta@mander.xyz · 22 hours ago

        ROCm works mostly well as a replacement for CUDA, and it gets better and better every year.
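
        To make the “replacement” part concrete, here’s a minimal, hedged sketch (my example, not the commenter’s; it assumes a ROCm build of PyTorch): the ROCm wheels expose the AMD GPU through the same torch.cuda API, so code written against CUDA typically runs unchanged.

        ```python
        # Hedged sketch: assumes a ROCm build of PyTorch. On that build, the familiar
        # torch.cuda code path drives the AMD GPU through HIP under the hood.
        import torch

        device = "cuda" if torch.cuda.is_available() else "cpu"  # "cuda" also covers ROCm builds
        print("HIP:", torch.version.hip, "| CUDA:", torch.version.cuda)  # shows which backend the wheel targets

        a = torch.randn(1024, 1024, device=device)
        b = torch.randn(1024, 1024, device=device)
        c = a @ b  # matmul dispatched to the vendor BLAS (rocBLAS on ROCm, cuBLAS on CUDA)
        print(c.device, c.shape)
        ```

        The usual catch is hardware and kernel coverage rather than the API surface itself, which is where the “gets better every year” part comes in.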